# @samscarrow/lmstudio-mcp
LM Studio MCP Server with concurrent multi-agent support and memory safety features.
## Quick Start
```bash
# Run directly with NPX (recommended)
npx @samscarrow/lmstudio-mcp
# Or install globally
npm install -g @samscarrow/lmstudio-mcp
lmstudio-mcp
```
## Requirements
- **Node.js 16+** - To run this NPM package
- **Python 3.7+** - For the MCP server itself
- **LM Studio** - Running with a model loaded
The Python components are automatically downloaded and set up on first run.
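To confirm the prerequisites are in place, you can check the installed versions before the first run:
```bash
# Verify the runtime prerequisites
node --version     # should report v16.0.0 or newer
python3 --version  # should report Python 3.7 or newer
```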
## Configuration
### Environment Variables
```bash
# LM Studio API endpoint (default: http://localhost:1234/v1)
export LMSTUDIO_API_BASE="http://localhost:1234/v1"
# Logging level
export LOG_LEVEL="INFO"
```
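Both variables can also be set inline for a one-off run, which avoids changing your shell profile:
```bash
# Single invocation with a custom endpoint and verbose logging
LMSTUDIO_API_BASE="http://192.168.1.100:1234/v1" LOG_LEVEL="DEBUG" npx @samscarrow/lmstudio-mcp
```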
### Command Line Options
```bash
# Custom API base
npx @samscarrow/lmstudio-mcp --api-base http://192.168.1.100:1234/v1
# Custom log level
npx @samscarrow/lmstudio-mcp --log-level DEBUG
# Show help
npx @samscarrow/lmstudio-mcp --help
# Show version
npx @samscarrow/lmstudio-mcp --version
```
## Features
- 🚀 **Concurrent Multi-Agent Support** - Handle multiple requests simultaneously
- 🛡️ **Memory Safety** - Intelligent memory management and limits
- 📊 **Real-time Monitoring** - Track performance and resource usage
- 🔄 **Auto-retry Logic** - Robust error handling and recovery
- 🎯 **Load Balancing** - Distribute requests across available resources
- 📈 **Scaling Support** - Handle increased load automatically
## Programmatic Usage
You can also use this package programmatically in your Node.js applications:
```javascript
const LMStudioMCP = require('@samscarrow/lmstudio-mcp');
// Create instance with custom options
const server = new LMStudioMCP({
  apiBase: 'http://localhost:1234/v1',
  logLevel: 'INFO'
});

// Start the server
server.start()
  .then(() => console.log('Server started'))
  .catch(console.error);
```
## Troubleshooting
### Python Not Found
If you get Python-related errors:
```bash
# Install Python 3 (macOS with Homebrew)
brew install python3
# Install Python 3 (Ubuntu/Debian)
sudo apt update && sudo apt install python3 python3-pip
# Windows - download from python.org
```
### LM Studio Connection Issues
- Ensure LM Studio is running
- Check that a model is loaded in LM Studio
- Verify the API endpoint is correct (default: http://localhost:1234/v1)
- Try using `127.0.0.1` instead of `localhost`; the quick check below confirms whether the endpoint responds
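A quick way to confirm both that the server is reachable and that a model is loaded is to query LM Studio's OpenAI-compatible models endpoint (shown against the default endpoint; adjust the URL if yours differs):
```bash
# Should return a JSON list of the models LM Studio currently has loaded;
# a connection error or an empty list points to the issues above
curl http://localhost:1234/v1/models
```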
### Port Already in Use
The MCP server uses stdio communication, not HTTP ports. If you see port errors, they're likely from LM Studio itself.
## Installation Locations
Files are installed to:
- **macOS/Linux**: `~/.lmstudio-mcp/`
- **Windows**: `%USERPROFILE%\.lmstudio-mcp\`
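If the downloaded Python components ever end up in a bad state, removing this directory should trigger a fresh setup the next time the server runs (paths shown for macOS/Linux; use the `%USERPROFILE%` path on Windows):
```bash
# Inspect what was installed
ls ~/.lmstudio-mcp/

# Remove it to force a clean re-download on the next run
rm -rf ~/.lmstudio-mcp
```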
## Development
```bash
# Clone the repository
git clone https://github.com/samscarrow/lmstudio-mcp.git
cd lmstudio-mcp
# Install dependencies
npm install
# Run locally
node bin/lmstudio-mcp.js
```
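To try the CLI entry point as if it were installed globally, the standard `npm link` workflow can be used (this assumes the package's `bin` entry points at `bin/lmstudio-mcp.js`, as the run command above suggests):
```bash
# Symlink the local checkout onto your PATH, then exercise the CLI
npm link
lmstudio-mcp --help
```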
## License
MIT - see LICENSE file for details.
## Support
- 📖 **Documentation**: https://github.com/samscarrow/lmstudio-mcp
- 🐛 **Issues**: https://github.com/samscarrow/lmstudio-mcp/issues
- 💬 **Discussions**: https://github.com/samscarrow/lmstudio-mcp/discussions
## Related
- [LM Studio](https://lmstudio.ai) - Local LLM runtime
- [Model Context Protocol](https://modelcontextprotocol.io) - MCP specification
- [Claude Code](https://claude.ai/code) - AI coding assistant