# Lamplighter-MCP



Lamplighter-MCP is a backend service designed to act as an intelligent context engine for software development teams using AI coding assistants like Cursor, particularly within large and complex codebases. It addresses the challenges of providing relevant, up-to-date project context (code structure, feature specifications, task status) to AI models and developers, and automates the breakdown of large feature specifications into manageable tasks.
## Features
- **Codebase Analysis**: Automatically analyze project structure and generate a summary
- **Confluence Integration**: Process Confluence specifications into actionable tasks
- **Task Management**: Track and update task status through MCP tools
- **Intelligent Context**: Provide structured context to AI coding assistants
- **History Logging**: Keep a record of all system events and actions
- **MCP Integration**: Seamlessly connect with Cursor through the Model Context Protocol
## Table of Contents
- [Project Overview](#project-overview)
- [Installation](#installation)
- [Configuration](#configuration)
- [Usage](#usage)
- [Deployment & Testing](#deployment--testing)
- [System Overview](#system-overview)
- [Architecture](#architecture)
- [Directory Structure](#directory-structure)
- [Testing](#testing)
- [Development Roadmap](#development-roadmap)
- [Contributing](#contributing)
## Project Overview
Lamplighter-MCP operates exclusively through the Model Context Protocol (MCP) for seamless integration with compatible tools. It serves as a centralized context provider for AI coding assistants, enhancing their understanding and code generation capabilities by maintaining:
- Structured codebase summaries
- Feature specifications and task breakdowns
- Task status tracking
- Project history logging
The project draws thematic inspiration from "The Little Prince," specifically the Lamplighter character, symbolizing guidance and dutiful tracking.
## Installation
### Prerequisites
- Node.js 18.x or higher
- npm 8.x or higher
- Access to Confluence API (for specification processing)
- Access to an LLM API (OpenAI, Google Gemini, etc.)
### Setup
1. Clone the repository:
```bash
git clone <repository-url>
cd lamplighter-mcp
```
2. Install dependencies:
```bash
npm install
```
3. Build the TypeScript code:
```bash
npm run build
```
4. Create a `.env` file based on `.env.example`:
```bash
cp .env.example .env
# Edit .env with your configuration values
```
## Configuration
Configure the application by setting the following environment variables in your `.env` file:
```
# Server Configuration
PORT=3001
LAMPLIGHTER_CONTEXT_DIR=./lamplighter_context
# Confluence Configuration (for specification processing)
CONFLUENCE_URL=https://your-domain.atlassian.net
CONFLUENCE_API_TOKEN=your-api-token
CONFLUENCE_USERNAME=your-username
# AI Service Configuration
AI_PROVIDER=openai # openai or google
OPENAI_API_KEY=your-openai-key
GOOGLE_API_KEY=your-google-key
AI_MODEL=gpt-4 # or a Google model name
```
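At startup, it pays to validate these variables and fail fast instead of erroring midway through a request. The sketch below shows one way to do that; `loadConfig` and the `LamplighterConfig` shape are illustrative helpers, not part of the shipped codebase.

```typescript
// Minimal sketch of loading and validating the environment variables above.
type LamplighterConfig = {
  port: number;
  contextDir: string;
  aiProvider: "openai" | "google";
};

// loadConfig is a hypothetical helper; variable names match .env.example.
function loadConfig(env: Record<string, string | undefined>): LamplighterConfig {
  const provider = env.AI_PROVIDER ?? "openai";
  if (provider !== "openai" && provider !== "google") {
    throw new Error(`Unsupported AI_PROVIDER: ${provider}`);
  }
  // Fail fast when the key for the selected provider is missing.
  const keyVar = provider === "openai" ? "OPENAI_API_KEY" : "GOOGLE_API_KEY";
  if (!env[keyVar]) {
    throw new Error(`${keyVar} is required when AI_PROVIDER=${provider}`);
  }
  return {
    port: Number(env.PORT ?? "3001"),
    contextDir: env.LAMPLIGHTER_CONTEXT_DIR ?? "./lamplighter_context",
    aiProvider: provider,
  };
}
```

Passing `process.env` to such a helper once at boot keeps the rest of the code free of raw environment lookups.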
## Usage
### Using NPX (Recommended)
The easiest way to use Lamplighter-MCP is through `npx`:
```bash
# Initialize a new project with Lamplighter-MCP
npx lamplighter-mcp init
# Start the Lamplighter-MCP server
npx lamplighter-mcp start
# Get help with available commands
npx lamplighter-mcp help
```
You can also install it globally for easier access:
```bash
npm install -g lamplighter-mcp
lamplighter-mcp start
```
This approach is similar to how `claude-task-master` works, making it easy to integrate with your existing workflow.
### Starting the Server Manually
Run the development server with hot reload:
```bash
npm run dev
```
Run the production server:
```bash
npm start
```
### Cursor Integration
To integrate Lamplighter-MCP with Cursor, you can use one of these approaches:
#### Option 1: Use Local MCP Server
1. Start the Lamplighter-MCP server using `npx lamplighter-mcp start`
2. Configure your Cursor application to connect to the local server by enabling MCP in Cursor preferences
3. Access Lamplighter-MCP features directly through Cursor by prompting the AI to use the available tools
#### Option 2: Direct NPX Integration in Cursor
Add this to your `.cursor/settings.json` file:
```json
{
  "mcpServers": {
    "lamplighter-mcp": {
      "command": "npx",
      "args": ["-y", "lamplighter-mcp", "start"],
      "env": {
        "CONFLUENCE_URL": "https://your-domain.atlassian.net",
        "CONFLUENCE_API_TOKEN": "your-api-token",
        "CONFLUENCE_USERNAME": "your-username",
        "AI_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-key"
      }
    }
  }
}
```
This configuration allows Cursor to automatically start the Lamplighter-MCP server when needed.
### Available MCP Tools
- `analyze_codebase`: Analyzes project structure and generates a summary
- `process_confluence_spec`: Processes a Confluence specification into tasks
- `update_task_status`: Updates the status of a specific task
- `suggest_next_task`: Identifies the next task to work on
- `get_codebase_summary`: Retrieves the codebase summary
- `get_history_log`: Retrieves the history log
- `get_feature_tasks`: Retrieves tasks for a specific feature
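On the wire, each of these tools is invoked as a JSON-RPC 2.0 `tools/call` request, which is how an MCP client such as Cursor talks to the server. The sketch below shows that message shape; the argument names (`taskId`, `status`) are illustrative, not the tool's documented schema.

```typescript
// Shape of an MCP tool invocation as a JSON-RPC 2.0 message.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "update_task_status",
    // Argument names here are assumptions for illustration.
    arguments: { taskId: "task-3", status: "Done" },
  },
};

console.log(JSON.stringify(request, null, 2));
```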
## Deployment & Testing
### Deployment Options
Lamplighter-MCP can be deployed in several ways, depending on your needs:
1. **Manual Deployment**
- Run on a development server via Node.js directly
- Use a process manager like PM2: `pm2 start dist/server.js --name lamplighter-mcp`
- Set up the appropriate environment variables in `.env`
2. **Containerized Deployment (Docker)**
```bash
# Build the Docker image
docker build -t lamplighter-mcp:latest .
# Run the container
docker run -d -p 3001:3001 --env-file .env --name lamplighter-mcp lamplighter-mcp:latest
```
3. **Cloud Deployment (AWS/Azure/GCP)**
- Deploy on a cloud VM with Node.js installed
- Use a service like AWS Elastic Beanstalk or GCP App Engine
- Configure environment variables in the cloud service console
For detailed deployment instructions, see [documents/deployment_plan.md](./documents/deployment_plan.md).
### Health Check
The server provides a health endpoint at `/health` that can be used to verify server status:
```bash
curl http://localhost:3001/health
```
Expected response:
```json
{
  "status": "ok",
  "uptime": 1234.56,
  "timestamp": "2025-04-13T06:20:01.618Z",
  "version": "1.0.0",
  "name": "Lamplighter-MCP"
}
```
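The same probe can be run programmatically, for example from a deployment script or a monitoring job. A minimal sketch (Node 18+ ships a global `fetch`; `isHealthy` only checks the fields shown in the example response):

```typescript
// Sketch of a programmatic health probe against the /health endpoint.
type HealthResponse = { status: string; uptime: number; version: string };

function isHealthy(body: unknown): body is HealthResponse {
  return (
    typeof body === "object" &&
    body !== null &&
    (body as { status?: unknown }).status === "ok"
  );
}

async function checkHealth(baseUrl: string): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/health`);
    return res.ok && isHealthy(await res.json());
  } catch {
    return false; // server not reachable
  }
}
```

`checkHealth("http://localhost:3001")` resolves to `true` once the server is up.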
### Testing
For comprehensive end-to-end testing, we follow the test plan outlined in [documents/e2e_testing_plan.md](./documents/e2e_testing_plan.md). Key testing approaches include:
1. **Manual Testing**: Using Cursor to interact with the MCP server
2. **MCP Inspector**: Direct testing of individual MCP tools
3. **Automated API Tests**: For validating endpoint responses
## System Overview
Lamplighter-MCP serves as an intelligent context engine for software development teams. It operates as a bridge between AI coding assistants (like Cursor) and the complex context of software projects.
### System Purpose
Lamplighter-MCP performs four key functions:
1. **Understanding Codebase Structure**: Analyzes your codebase to create a structured overview of its components, modules, and relationships.
2. **Breaking Down Feature Specifications**: Processes feature specifications from Confluence and breaks them into actionable tasks.
3. **Tracking Task Status**: Maintains the status of each task (ToDo, InProgress, Done) and suggests which task to work on next.
4. **Maintaining Development History**: Logs all interactions and changes to create a timeline of development activities.
### Context Files
Lamplighter-MCP manages the following Markdown files that provide valuable context to both AI assistants and developers:
- **`codebase_summary.md`**: Contains a structured overview of the project, including key directories, modules, and architecture.
- **`feature_tasks/feature_[ID]_tasks.md`**: Contains the task breakdown for each feature specification, with task statuses.
- **`history_log.md`**: Contains a chronological log of all system events and actions.
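Because these context files are plain Markdown, other tooling can read them back. The sketch below parses task statuses out of a feature task file; the layout assumed here (one `- [Status] description` bullet per task) is an illustration only, and should be matched to whatever format `FeatureSpecProcessor` actually writes.

```typescript
// Illustrative parser for a feature task file (assumed bullet format).
type TaskStatus = "ToDo" | "InProgress" | "Done";
interface ParsedTask { description: string; status: TaskStatus }

function parseTasks(markdown: string): ParsedTask[] {
  // One task per "- [Status] description" line.
  const bullet = /^- \[(ToDo|InProgress|Done)\] (.+)$/gm;
  return Array.from(markdown.matchAll(bullet), (m) => ({
    status: m[1] as TaskStatus,
    description: m[2],
  }));
}
```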
### Key Workflows
#### Codebase Analysis
1. **AI Assistant Suggestion**: "I notice we're working in a complex codebase. Would you like me to analyze it to better understand its structure?"
2. **User Confirmation**: "Yes, please analyze the codebase."
3. **AI Execution**: The AI uses the `analyze_codebase` tool, which generates a `codebase_summary.md` file.
4. **AI Response**: "I've analyzed the codebase. Here's a summary of its structure: [key insights from the analysis]"
#### Feature Specification Processing
1. **User Request**: "I have a new feature to implement from this Confluence page: [URL]"
2. **AI Execution**: The AI uses the `process_confluence_spec` tool with the URL parameter.
3. **AI Response**: "I've processed the specification and broken it down into tasks. Here they are: [list of tasks]"
#### Task Management
1. **AI Suggestion**: "Based on your feature '[feature_id]', the next task you should work on is: [task description]"
2. **User Confirmation**: "I've completed that task."
3. **AI Execution**: The AI uses the `update_task_status` tool to mark the task as Done.
4. **AI Follow-up**: "Task marked as complete. The next task to work on is: [next task description]"
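The selection rule behind `suggest_next_task` can be sketched as: resume anything already `InProgress` before starting a new `ToDo`. The types and the first-in-file ordering below are assumptions about the implementation, not its actual code.

```typescript
// Sketch of next-task selection: prefer InProgress, then the first ToDo.
type Status = "ToDo" | "InProgress" | "Done";
interface FeatureTask { id: string; description: string; status: Status }

function suggestNextTask(tasks: FeatureTask[]): FeatureTask | undefined {
  return (
    tasks.find((t) => t.status === "InProgress") ??
    tasks.find((t) => t.status === "ToDo")
  );
}
```

Returning `undefined` when every task is `Done` gives the caller a natural "feature complete" signal.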
### AI Suggestion + User Confirmation Protocol
For any action that modifies data (especially task statuses), the AI should:
1. **Suggest the action**: "Would you like me to mark the task '[task]' as complete?"
2. **Wait for confirmation**: The user should explicitly confirm before the AI proceeds.
3. **Execute the action**: Only after confirmation should the AI use the appropriate tool.
4. **Confirm completion**: The AI should inform the user that the action has been completed.
This two-step process ensures that users maintain control over their project state while benefiting from AI assistance.
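The protocol above can be expressed as a small gate: the mutating action runs only after an explicit confirmation. The callback shapes below are assumptions chosen for the sketch.

```typescript
// Sketch of the suggest-then-confirm gate for mutating actions.
async function confirmedAction(
  description: string,
  confirm: (question: string) => Promise<boolean>,
  action: () => Promise<void>,
): Promise<string> {
  const approved = await confirm(`Would you like me to ${description}?`);
  if (!approved) return "Okay, I won't do that.";
  await action(); // only reached after explicit confirmation
  return `Done: ${description}.`;
}
```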
## Architecture
Lamplighter-MCP follows a modular architecture with clear separation of concerns:
- **Core Modules** (`src/modules/`): Contain the core business logic
- **Services** (`src/services/`): Integrate with external services (AI, Confluence)
- **MCP Server** (`src/server.ts`): Provides MCP tools and handles client connections
- **Context Files** (`lamplighter_context/`): Store structured Markdown files as the system's state
### Key Components
- **CodebaseAnalyzer**: Analyzes the project structure and generates summaries
- **HistoryLogger**: Maintains an event log of system actions
- **FeatureSpecProcessor**: Processes specifications into actionable tasks
- **TaskManager**: Manages task status and retrieval
- **AIService**: Abstracts interactions with different LLM providers
- **ConfluenceReader**: Fetches content from Confluence
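The pattern behind `AIService` is a provider abstraction: core code depends on one interface, and the OpenAI or Google backend (or a test double) plugs in behind it. The names below are illustrative, not the real API surface.

```typescript
// Sketch of a provider abstraction like the one AIService suggests.
interface CompletionProvider {
  complete(prompt: string): Promise<string>;
}

// Stand-in for a real LLM backend; handy as a test double.
class EchoProvider implements CompletionProvider {
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

// Consumers depend only on the interface, so backends swap freely.
class SummaryService {
  private readonly provider: CompletionProvider;
  constructor(provider: CompletionProvider) {
    this.provider = provider;
  }
  summarize(text: string): Promise<string> {
    return this.provider.complete(`Summarize:\n${text}`);
  }
}
```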
## Directory Structure
```
lamplighter-mcp/
├── .cursor/                  # Cursor configuration
│   ├── mcp.json              # MCP connection configuration
│   └── rules/                # AI guidance rules
│       └── *.mdc             # Markdown rule files
├── __tests__/                # Test files
│   ├── codebaseAnalyzer.test.ts
│   ├── historyLogger.test.ts
│   ├── server.test.ts
│   └── README.md             # Testing documentation
├── documents/                # Project documentation
│   ├── prd.md                # Product Requirements Document
│   ├── tasks.md              # Implementation Tasks
│   ├── erd.md                # Entity Relationship Document
│   └── mcp_*.md              # MCP documentation
├── lamplighter_context/      # Context storage directory
│   ├── codebase_summary.md   # Generated by CodebaseAnalyzer
│   ├── feature_tasks/        # Generated task files
│   └── history_log.md        # Generated by HistoryLogger
├── src/                      # Source code
│   ├── modules/              # Core logic modules
│   │   ├── codebaseAnalyzer.ts
│   │   ├── confluenceReader.ts
│   │   ├── featureSpecProcessor.ts
│   │   ├── historyLogger.ts
│   │   └── taskManager.ts
│   ├── services/             # External service integration
│   │   └── aiService.ts
│   └── server.ts             # MCP server implementation
├── .env.example              # Example environment variables
├── jest.config.ts            # Jest configuration
├── package.json              # Node.js dependencies
├── tsconfig.json             # TypeScript configuration
└── README.md                 # This file
```
## Testing
Lamplighter-MCP uses Jest for testing. The test suite includes:
- Unit tests for core modules
- Mocked external dependencies
- Component tests for server elements
### Running Tests
```bash
npm test -- --config=jest.config.ts
```
For more details, see the [testing documentation](./__tests__/README.md).
## Development Roadmap
See [tasks.md](./documents/tasks.md) for the detailed implementation roadmap. The project is being developed in phases:
1. **Phase 1**: Project Setup & Core Infrastructure (complete)
2. **Phase 2**: Core Logic Modules Implementation (planned)
3. **Phase 3**: MCP Tool Implementation & Integration (planned)
4. **Phase 4**: Cursor Integration & Finalization (planned)
5. **Phase 5**: Deployment & Testing (planned)
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
*Lamplighter-MCP: Illuminating the path for AI-assisted software development.*