
@r34dy/claude-context-mcp
Model Context Protocol (MCP) integration for Claude Context - A powerful MCP server that enables AI assistants and agents to index and search codebases using semantic search.
📖 New to Claude Context? Check out the main project README for an overview and setup instructions.
Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g. Claude Code.
Before using the MCP server, make sure you have:
💡 Setup Help: See the main project setup guide for detailed installation instructions.
Claude Context MCP supports multiple embedding providers. Choose the one that best fits your needs:
📋 Quick Reference: For a complete list of environment variables and their descriptions, see the Environment Variables Guide.
# Supported providers: OpenAI, VoyageAI, Gemini, Ollama
EMBEDDING_PROVIDER=OpenAI
OpenAI provides high-quality embeddings with excellent performance for code understanding.
# Required: Your OpenAI API key
OPENAI_API_KEY=sk-your-openai-api-key
# Optional: Specify embedding model (default: text-embedding-3-small)
EMBEDDING_MODEL=text-embedding-3-small
# Optional: Custom API base URL (for Azure OpenAI or other compatible services)
OPENAI_BASE_URL=https://api.openai.com/v1
Available Models:
See getSupportedModels in openai-embedding.ts for the full list of supported models.
Getting API Key:
VoyageAI offers specialized code embeddings optimized for programming languages.
# Required: Your VoyageAI API key
VOYAGEAI_API_KEY=pa-your-voyageai-api-key
# Optional: Specify embedding model (default: voyage-code-3)
EMBEDDING_MODEL=voyage-code-3
Available Models:
See getSupportedModels in voyageai-embedding.ts for the full list of supported models.
Getting API Key:
Google's Gemini provides competitive embeddings with good multilingual support.
# Required: Your Gemini API key
GEMINI_API_KEY=your-gemini-api-key
# Optional: Specify embedding model (default: gemini-embedding-001)
EMBEDDING_MODEL=gemini-embedding-001
# Optional: Custom API base URL (for custom endpoints)
GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta
Available Models:
See getSupportedModels in gemini-embedding.ts for the full list of supported models.
Getting API Key:
Ollama allows you to run embeddings locally without sending data to external services.
# Required: Specify which Ollama model to use
EMBEDDING_MODEL=nomic-embed-text
# Optional: Specify Ollama host (default: http://127.0.0.1:11434)
OLLAMA_HOST=http://127.0.0.1:11434
Setup Instructions:
Install Ollama from ollama.ai
Pull the embedding model:
ollama pull nomic-embed-text
Ensure Ollama is running:
ollama serve
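Optionally, you can verify that the model actually serves embeddings before starting the MCP server. This is a quick check against Ollama's local embeddings endpoint (it assumes Ollama is running on its default port, as configured above):

```shell
# Ask the local Ollama instance for an embedding. A JSON response containing
# an "embedding" array means the model is ready for Claude Context to use.
curl -s http://127.0.0.1:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
```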
Claude Context needs a vector database. You can sign up on Zilliz Cloud to get an API key.

Copy your Personal Key to replace your-zilliz-cloud-api-key in the configuration examples.
MILVUS_TOKEN=your-zilliz-cloud-api-key
You can set the embedding batch size to tune MCP server indexing performance to match your embedding model's throughput. The default value is 100.
EMBEDDING_BATCH_SIZE=512
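Conceptually, indexing sends code chunks to the embedding provider in groups of `EMBEDDING_BATCH_SIZE`. A minimal sketch of that batching, assuming simple fixed-size slicing (the `batched` helper is illustrative, not part of the package):

```python
import os

def batched(items, batch_size):
    """Yield successive fixed-size batches from a list of code chunks."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

# Batch size comes from the environment, falling back to the documented default of 100.
batch_size = int(os.environ.get("EMBEDDING_BATCH_SIZE", "100"))

chunks = [f"chunk-{i}" for i in range(250)]
for batch in batched(chunks, batch_size):
    # Each batch would be sent to the embedding API in a single request.
    pass
```

A larger batch size means fewer API round-trips per indexing run, at the cost of larger requests; providers with strict per-request limits may need a smaller value.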
You can configure custom file extensions and ignore patterns globally via environment variables:
# Additional file extensions to include beyond defaults
CUSTOM_EXTENSIONS=.vue,.svelte,.astro,.twig
# Additional ignore patterns to exclude files/directories
CUSTOM_IGNORE_PATTERNS=temp/**,*.backup,private/**,uploads/**
These settings work in combination with tool parameters: patterns from both sources are merged.
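The merge described above can be sketched as a simple order-preserving union, assuming comma-separated parsing of the environment variable (the `merge_patterns` helper is illustrative, not the package's actual implementation):

```python
import os

def merge_patterns(env_value, tool_patterns):
    """Union of env-provided patterns and tool-call parameters, order-preserving."""
    env_patterns = [p.strip() for p in env_value.split(",") if p.strip()]
    merged = []
    for pattern in env_patterns + list(tool_patterns):
        if pattern not in merged:  # drop duplicates across the two sources
            merged.append(pattern)
    return merged

os.environ["CUSTOM_IGNORE_PATTERNS"] = "temp/**,*.backup"
merged = merge_patterns(os.environ["CUSTOM_IGNORE_PATTERNS"],
                        ["private/**", "*.backup"])
# → ['temp/**', '*.backup', 'private/**']
```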
Use the command line interface to add the Claude Context MCP server:
# Add the Claude Context MCP server
claude mcp add claude-context -e OPENAI_API_KEY=your-openai-api-key -e MILVUS_TOKEN=your-zilliz-cloud-api-key -- npx @zilliz/claude-context-mcp@latest
See the Claude Code MCP documentation for more details about MCP server management.
Codex CLI uses TOML configuration files:
Create or edit the ~/.codex/config.toml file.
Add the following configuration:
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.claude-context]
command = "npx"
args = ["@zilliz/claude-context-mcp@latest"]
env = { "OPENAI_API_KEY" = "your-openai-api-key", "MILVUS_TOKEN" = "your-zilliz-cloud-api-key" }
# Optional: override the default 10s startup timeout
startup_timeout_ms = 20000
Gemini CLI requires manual configuration through a JSON file:
Create or edit the ~/.gemini/settings.json file.
Add the following configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Create or edit the ~/.qwen/settings.json file and add the following configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Pasting the following configuration into your Cursor ~/.cursor/mcp.json file is the recommended approach. You may also install in a specific project by creating .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.
OpenAI Configuration (Default):
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "OpenAI",
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
VoyageAI Configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "VoyageAI",
        "VOYAGEAI_API_KEY": "your-voyageai-api-key",
        "EMBEDDING_MODEL": "voyage-code-3",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Gemini Configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "Gemini",
        "GEMINI_API_KEY": "your-gemini-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Ollama Configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "EMBEDDING_PROVIDER": "Ollama",
        "EMBEDDING_MODEL": "nomic-embed-text",
        "OLLAMA_HOST": "http://127.0.0.1:11434",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Go to: Settings -> MCP -> Add MCP Server
Add the following configuration to your Void MCP settings:
{
  "mcpServers": {
    "code-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Add to your Claude Desktop configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
The Claude Context MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:
Name: claude-context
Type: STDIO
Command: npx
Arguments: ["@zilliz/claude-context-mcp@latest"]
Environment variables: OPENAI_API_KEY: your-openai-api-key, MILVUS_TOKEN: your-zilliz-cloud-api-key
Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:
Open Cline and click on the MCP Servers icon in the top navigation bar.
Select the Installed tab, then click Advanced MCP Settings.
In the cline_mcp_settings.json file, add the following configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
To configure Claude Context MCP in Augment Code, you can use either the graphical interface or manual configuration.
Click the hamburger menu.
Select Settings.
Navigate to the Tools section.
Click the + Add MCP button.
Enter the following command:
npx @zilliz/claude-context-mcp@latest
Name the MCP: Claude Context.
Click the Add button.
To configure the server manually instead, add it to the mcpServers array in the augment.advanced object of your settings:
"augment.advanced": {
  "mcpServers": [
    {
      "name": "claude-context",
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"]
    }
  ]
}
Roo Code utilizes a JSON configuration file for MCP servers:
Open Roo Code and navigate to Settings → MCP Servers → Edit Global Config.
In the mcp_settings.json file, add the following configuration:
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
Zencoder offers support for MCP tools and servers in both its JetBrains and VS Code plugin versions.
Go to Tools and click Add Custom MCP.
Enter the name Claude Context and the server configuration below:
{
  "command": "npx",
  "args": ["@zilliz/claude-context-mcp@latest"],
  "env": {
    "OPENAI_API_KEY": "your-openai-api-key",
    "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
    "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
  }
}
Once the server is added, make sure to hit the Install button.
For LangChain/LangGraph integration examples, see this example.
The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:
npx @zilliz/claude-context-mcp@latest
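Over stdio, MCP clients and servers exchange newline-delimited JSON-RPC 2.0 messages. As a sketch, this is roughly the first message a client would write to the server's stdin (the field layout follows the MCP specification; the clientInfo values and protocol revision here are placeholders for whatever your client actually uses):

```python
import json

# JSON-RPC 2.0 "initialize" request: the first message in an MCP session.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # an MCP protocol revision; use the one your client supports
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.0.1"},  # placeholder
    },
}

# The stdio transport frames each message as a single line of JSON.
wire_message = json.dumps(initialize) + "\n"
```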
index_codebase
Index a codebase directory for hybrid search (BM25 + dense vector).
Parameters:
path (required): Absolute path to the codebase directory to index
force (optional): Force re-indexing even if already indexed (default: false)
splitter (optional): Code splitter to use: 'ast' for syntax-aware splitting with automatic fallback, 'langchain' for character-based splitting (default: "ast")
customExtensions (optional): Additional file extensions to include beyond defaults (e.g., ['.vue', '.svelte', '.astro']). Extensions should include the dot prefix or it will be added automatically (default: [])
ignorePatterns (optional): Additional ignore patterns to exclude specific files/directories beyond defaults (e.g., ['static/', '*.tmp', 'private/']) (default: [])
search_code
Search the indexed codebase using natural language queries with hybrid search (BM25 + dense vector).
Parameters:
path (required): Absolute path to the codebase directory to search in
query (required): Natural language query to search for in the codebase
limit (optional): Maximum number of results to return (default: 10, max: 50)
extensionFilter (optional): List of file extensions to filter results (e.g., ['.ts', '.py']) (default: [])
clear_index
Clear the search index for a specific codebase.
Parameters:
path (required): Absolute path to the codebase directory to clear the index for
get_indexing_status
Get the current indexing status of a codebase. Shows progress percentage for actively indexing codebases and completion status for indexed codebases.
Parameters:
path (required): Absolute path to the codebase directory to check status for
This package is part of the Claude Context monorepo. Please see:
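The tool parameters above map onto standard MCP tools/call requests. As a sketch, this is roughly what a client sends for index_codebase and search_code (the /abs/path/to/repo path and the query string are placeholders):

```python
import json

def tools_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 MCP tools/call request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

index_request = tools_call(2, "index_codebase", {
    "path": "/abs/path/to/repo",  # placeholder absolute path
    "splitter": "ast",
})
search_request = tools_call(3, "search_code", {
    "path": "/abs/path/to/repo",
    "query": "where is the embedding batch size read?",  # placeholder query
    "limit": 10,
})

# Each request goes to the server as one newline-terminated line of JSON.
wire = json.dumps(index_request) + "\n"
```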
MIT - See LICENSE for details