# Table of Contents

- [Installation](#installation)
  - [Prerequisites](#prerequisites)
  - [Quick Install](#quick-install)
- [IDE Setup](#ide-setup)
  - [Cursor IDE / Claude Desktop](#cursor-ide--claude-desktop)
  - [VSCode / Gemini CLI](#vscode--gemini-cli)
- [Configuration](#configuration)
  - [AI Service Options](#ai-service-options)
  - [Optional: Local Ollama Setup](#optional-local-ollama-setup)
- [Available Tools](#available-tools)
  - [chainlink_developer_assistant](#chainlink_developer_assistant)
- [Usage](#usage)
- [Troubleshooting](#troubleshooting)
- [Developing This MCP Server](#developing-this-mcp-server)
- [Support](#support)
- [License](#license)
- [Terms of Use](#disclaimer)

# Chainlink MCP Server

A Model Context Protocol (MCP) server that provides AI-powered access to Chainlink documentation, focusing on CCIP (Cross-Chain Interoperability Protocol) and other Chainlink products.

## Installation

### Prerequisites

- Node.js 18+
- npm or pnpm

### Quick Install

The MCP server is available as an npm package and can be used directly with MCP-compatible IDEs:

```bash
npm install -g @chainlink/mcp-server
```
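As a quick sanity check after installing, you can confirm the package resolves and launch the server once by hand. This is a minimal sketch; MCP servers started via `command`/`args` (as in the IDE configurations below) communicate over stdio, so the process will simply wait for an MCP client to connect:

```bash
# Confirm the global install resolved
npm ls -g @chainlink/mcp-server

# Launch the server manually; it waits for an MCP client on stdin (Ctrl+C to exit)
npx -y @chainlink/mcp-server
```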
## IDE Setup

### Cursor IDE / Claude Desktop

**Cursor IDE**: Add to `$HOME/.cursor/mcp.json` (macOS) or `%USERPROFILE%\.cursor\mcp.json` (Windows)

**Claude Desktop**: Add to your Claude Desktop configuration file

```json
{
  "mcpServers": {
    "chainlink": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_AI_SERVICE": "openai",
        "ANTHROPIC_API_KEY": "sk-your-anthropic-api-key-here",
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "GEMINI_API_KEY": "your-google-api-key-here"
      }
    }
  }
}
```

### VSCode / Gemini CLI

**VSCode**: Add to your VSCode MCP configuration

**Gemini CLI**: Add to your Gemini CLI configuration

```json
{
  "servers": {
    "chainlink": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_AI_SERVICE": "openai",
        "ANTHROPIC_API_KEY": "sk-your-anthropic-api-key-here",
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "GEMINI_API_KEY": "your-google-api-key-here"
      }
    }
  }
}
```

## Configuration

### Required Configuration

This MCP server requires both an AI service AND an embedding provider to function. The embedding provider powers semantic search through Chainlink documentation, which is essential for accurate responses.

### AI Service Options

The server supports multiple AI services. Configure which one to use by setting `MCP_AI_SERVICE` in your IDE's MCP configuration:

#### **OpenAI GPT (Default)**

```json
{
  "env": {
    "OPENAI_API_KEY": "your-openai-api-key"
  }
}
```

> **Note**: When using OpenAI, the same API key is used for both AI responses and embeddings.

#### **Anthropic Claude**

```json
{
  "env": {
    "MCP_AI_SERVICE": "anthropic",
    "ANTHROPIC_API_KEY": "sk-your-anthropic-api-key"
  }
}
```

> **Note**: Even when using Anthropic for AI responses, an OpenAI API key is required for embeddings unless you use Ollama.

#### **Google Gemini**

```json
{
  "env": {
    "MCP_AI_SERVICE": "gemini",
    "GEMINI_API_KEY": "your-google-api-key"
  }
}
```

> **Note**: Even when using Gemini for AI responses, an OpenAI API key is required for embeddings unless you use Ollama.

#### **Ollama (Local/Free)**

```json
{
  "env": {
    "MCP_AI_SERVICE": "ollama",
    "OLLAMA_MODEL": "llama3.2:3b",
    "OLLAMA_URL": "http://localhost:11434",
    "EMBEDDINGS_PROVIDER": "ollama"
  }
}
```

> **Note**: Ollama is the only option that handles both AI responses and embeddings locally without requiring external API keys.

> **Note**: You can include API keys for multiple services in your configuration and switch between them by changing only the `MCP_AI_SERVICE` value.

#### **Embeddings Configuration**

The server uses OpenAI embeddings by default for document search. If you're not using OpenAI as your AI service, you'll need to configure embeddings separately:

**Using OpenAI Embeddings (Default):**

```json
{
  "env": {
    "OPENAI_API_KEY": "your-openai-api-key"
  }
}
```

**Using Local Ollama Embeddings:**

```json
{
  "env": {
    "OPENAI_API_KEY": "your-openai-api-key",
    "EMBEDDINGS_PROVIDER": "ollama",
    "OLLAMA_URL": "http://localhost:11434"
  }
}
```

> **Important**: If you're not using OpenAI as your primary AI service, you'll still need either an OpenAI API key for embeddings OR `EMBEDDINGS_PROVIDER=ollama` to use local embeddings.

### Optional: Local Ollama Setup

For privacy and zero API costs, you can run everything locally using Ollama:

1. **Install Ollama**:

   ```bash
   # macOS
   brew install ollama

   # Linux
   curl -fsSL https://ollama.ai/install.sh | sh
   ```

2. **Pull a model**:

   ```bash
   ollama pull llama3.2:3b
   ```

3. **Start Ollama**:

   ```bash
   ollama serve &
   ```

4. **Update your MCP configuration**:

   ```json
   {
     "env": {
       "MCP_AI_SERVICE": "ollama",
       "OLLAMA_MODEL": "llama3.2:3b",
       "OLLAMA_URL": "http://localhost:11434"
     }
   }
   ```
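Once Ollama is running, you can optionally confirm that the endpoint referenced by `OLLAMA_URL` is reachable and that your model was pulled. This quick check uses Ollama's standard `/api/tags` route, which lists locally installed models:

```bash
# Confirm Ollama is reachable at the URL from your MCP configuration
curl http://localhost:11434/api/tags

# The JSON response should include "llama3.2:3b" among the installed models
```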
## Available Tools

### `chainlink_developer_assistant`

A comprehensive Chainlink developer assistant that handles any type of developer query spanning code examples, configurations, and concepts. It provides access to:

- **Fetched API Data**: CCIP chain configurations and supported tokens (fallback data source)
- **Documentation Search**: Semantic search across Chainlink documentation
- **Code Examples**: Solidity smart contract examples and implementation patterns
- **Configuration Help**: Network configurations, contract addresses, and deployment guidance
- **Best Practices**: Security recommendations and development patterns

**Example Queries:**

- "How do I send a cross-chain message using CCIP?"
- "What are the supported chains for CCIP?"
- "Show me a complete CCIP contract example"
- "Help me configure CCIP token transfers"

## Usage

Once configured in your IDE, you can ask questions about Chainlink development directly in your chat interface. The MCP server will automatically provide relevant documentation, code examples, and configuration data.

## Troubleshooting

### "No MCP servers configured" Error

Make sure your `mcp.json` file is in the correct location and properly formatted:

- **Cursor IDE**: `$HOME/.cursor/mcp.json` (macOS) or `%USERPROFILE%\.cursor\mcp.json` (Windows)
- **Claude Desktop**: Claude Desktop configuration file
- **VSCode/Gemini CLI**: Your respective IDE's MCP configuration

### Authentication Errors

Verify your API key is correctly set in your IDE's MCP configuration:

- **Anthropic**: Key should start with `sk-`
- **OpenAI**: Key should start with `sk-`
- **Gemini**: Valid Google API key
- **Ollama**: No API key required, but ensure Ollama is running

### Embedding Provider Errors

> "Embedding provider is required for vector database operations"

This error occurs when no embedding provider is configured. The MCP server requires an embedding provider to perform semantic search through Chainlink documentation.

**Solutions**:

1. **For OpenAI embeddings**: Set `OPENAI_API_KEY` in your configuration
2. **For local embeddings**: Set up Ollama with `EMBEDDINGS_PROVIDER=ollama`

For example, to use Anthropic for AI responses with OpenAI embeddings:

```json
{
  "env": {
    "MCP_AI_SERVICE": "anthropic",
    "ANTHROPIC_API_KEY": "sk-your-anthropic-key",
    "OPENAI_API_KEY": "sk-your-openai-key"
  }
}
```

### AI Service Configuration Issues

If you get errors about missing API keys or unavailable services:

1. **Check your `MCP_AI_SERVICE` setting**: Ensure it matches one of: `openai`, `anthropic`, `gemini`, or `ollama`
2. **Verify API key**: Make sure the corresponding API key is set (e.g., `OPENAI_API_KEY` for OpenAI)
3. **For Ollama**: Ensure Ollama is running locally

### Ollama Connection Issues

If using Ollama and getting connection errors:

```bash
# Check if Ollama is running
ollama list

# Start Ollama if needed
ollama serve

# Test the model
ollama run llama3.2:3b "Hello"
```

**Common Ollama Issues:**

- Ollama not running: Start with `ollama serve`
- Model not available: Pull with `ollama pull llama3.2:3b`
- Wrong URL: Check `OLLAMA_URL` in your configuration (default: `http://localhost:11434`)

## Developing This MCP Server

See [DEVELOPMENT.md](./DEVELOPMENT.md) for development setup.

## Support

- **Issues**: Report bugs on [GitHub Issues](https://github.com/smartcontractkit/mcp-server/issues)
- **Discussions**: Ask questions in [GitHub Discussions](https://github.com/smartcontractkit/mcp-server/discussions)

## License

MIT License - see [LICENSE](./LICENSE) for details.

## Disclaimer

The Chainlink MCP Server (npm package @chainlink/mcp-server) is in the “Early Access” stage of development, which means that it currently has functionality which is under development and may be changed in later versions. There is no guarantee any of the contemplated features of the MCP Server will be implemented as specified. The MCP Server is provided on an “AS IS” and “AS AVAILABLE” basis without any representations, warranties, covenants, or conditions of any kind. Use at your own risk. Users remain fully responsible for reviewing, auditing, and deploying any code or contracts. Do not use the code in this example in a production environment without completing your own audits and application of best practices, including compliance with applicable licenses governing any code used. Neither Chainlink Labs, the Chainlink Foundation, nor Chainlink node operators are responsible for unintended outputs that are generated due to errors in code. Please review the Chainlink Terms of Service, which provides important information and disclosures. By using the MCP Server, you expressly acknowledge and agree to accept these terms.