ai-embedding-mcp-server

MCP Server for AI Embedding and RAG functionality (npm package, latest version 1.0.2)
AI Embedding MCP Server

A Model Context Protocol (MCP) server that provides AI embedding and RAG (Retrieval-Augmented Generation) functionality. This server converts your existing AI embedding application into an MCP-compatible service.

Features

  • Document Ingestion: Ingest documents and create vector embeddings
  • RAG Chat: Chat with AI using business rules context from ingested documents
  • Vector Search: Search for similar documents using vector similarity
  • Embedding Generation: Generate embeddings for text using HuggingFace models
  • Project Management: Organize documents by project ID
  • Multiple LLM Support: Works with Ollama (local) and HuggingFace models

Prerequisites

  • Node.js 18+
  • PostgreSQL with pgvector extension
  • Ollama (for local LLM) or HuggingFace API key
  • Python 3.8+ (for some dependencies)

Installation

  • Clone and install dependencies:

    npm install
    
  • Set up PostgreSQL with pgvector:

    # Install pgvector extension
    sudo -u postgres psql -c "CREATE EXTENSION vector;"
    
    # Run the setup script
    psql -U your_username -d your_database -f setup.sql
    
  • Configure environment:

    cp env.example .env
    # Edit .env with your configuration
    
  • Install Ollama (optional, for local LLM):

    # Install Ollama
    curl -fsSL https://ollama.ai/install.sh | sh
    
    # Pull a model
    ollama pull mistral
    ollama pull qwen2.5-coder:7b
    ollama pull nomic-embed-text
    
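Once Ollama is running, you can confirm the models were pulled via its REST API: `GET http://localhost:11434/api/tags` lists locally available models. A minimal sketch of checking that response (the response shape below is an assumption based on Ollama's documented API; `hasModel` is a hypothetical helper, not part of this package):

```javascript
// Check whether a pulled model appears in the /api/tags response.
// `tagsResponse` mirrors the JSON returned by GET http://localhost:11434/api/tags.
function hasModel(tagsResponse, modelName) {
  return (tagsResponse.models || []).some(
    (m) => m.name === modelName || m.name.startsWith(modelName + ":")
  );
}

// Example response fragment in the shape Ollama returns
const tags = {
  models: [{ name: "mistral:latest" }, { name: "nomic-embed-text:latest" }],
};
console.log(hasModel(tags, "mistral")); // true
console.log(hasModel(tags, "qwen2.5-coder")); // false until pulled
```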

Configuration

Edit your .env file with the following variables:

# Database Configuration
DATABASE_URL=postgresql://username:password@localhost:5432/embedding_db

# HuggingFace API Configuration
HUGGINGFACE_API_KEY=your_huggingface_api_key_here

# OpenAI API Configuration (optional)
OPENAI_API_KEY=your_openai_api_key_here

# Ollama Configuration (for local LLM)
OLLAMA_BASE_URL=http://localhost:11434

# Server Configuration
PORT=3000
NODE_ENV=development
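Since a misconfigured server fails in confusing ways at runtime, it helps to fail fast on missing variables before starting. A minimal sketch using the variable names from the `.env` above (`validateEnv` is a hypothetical helper, not part of the package):

```javascript
// Fail fast if required configuration is missing.
// DATABASE_URL is always required; at least one of HUGGINGFACE_API_KEY or
// OLLAMA_BASE_URL must be set so embeddings and chat have a backend.
function validateEnv(env) {
  const missing = [];
  if (!env.DATABASE_URL) missing.push("DATABASE_URL");
  if (!env.HUGGINGFACE_API_KEY && !env.OLLAMA_BASE_URL) {
    missing.push("HUGGINGFACE_API_KEY or OLLAMA_BASE_URL");
  }
  return missing; // an empty array means the config is usable
}

const problems = validateEnv(process.env);
if (problems.length > 0) {
  console.error("Missing configuration:", problems.join(", "));
}
```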

Usage

Starting the MCP Server

npm start

The server will run on stdio and can be connected to MCP clients.
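MCP transports messages as JSON-RPC 2.0, one JSON object per line over stdio. A sketch of the envelope a client sends to invoke one of this server's tools (`buildToolCall` is an illustrative helper, not the MCP SDK's API):

```javascript
// Build the JSON-RPC 2.0 request an MCP client writes to the server's stdin
// to invoke a tool; the server replies with a matching-`id` response on stdout.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

const msg = buildToolCall(1, "list_projects", {});
// The stdio transport sends one JSON message per line
process.stdout.write(JSON.stringify(msg) + "\n");
```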

Available Tools

1. ingest_document

Ingest a document into the vector database.

Parameters:

  • filePath (string): Path to the document file
  • projectId (string): Project ID to associate with

Example:

{
  "name": "ingest_document",
  "arguments": {
    "filePath": "./docs/business-rules.mdc",
    "projectId": "project1"
  }
}
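Under the hood, ingestion typically splits a document into overlapping chunks before embedding each one, so that context at chunk boundaries is not lost. A generic chunker sketch (the package's actual chunking strategy may differ; sizes here are arbitrary):

```javascript
// Split text into fixed-size chunks with overlap, so sentences spanning
// a boundary appear in two chunks and remain searchable.
function chunkText(text, chunkSize = 500, overlap = 50) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step forward, keeping `overlap` chars shared
  }
  return chunks;
}

const parts = chunkText("a".repeat(1200), 500, 50);
console.log(parts.length); // 3
```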

2. chat_with_rules

Chat with AI using business rules context.

Parameters:

  • message (string): User's question or message
  • projectId (string): Project ID to search for context

Example:

{
  "name": "chat_with_rules",
  "arguments": {
    "message": "What are the business rules for the resource module?",
    "projectId": "project1"
  }
}

3. generate_embedding

Generate embeddings for text.

Parameters:

  • text (string): Text to generate embeddings for

Example:

{
  "name": "generate_embedding",
  "arguments": {
    "text": "This is sample text for embedding"
  }
}

4. vector_search

Search for similar documents using vector similarity.

Parameters:

  • query (string): Search query text
  • projectId (string): Project ID to search within
  • topK (number, optional): Number of results (default: 5)

Example:

{
  "name": "vector_search",
  "arguments": {
    "query": "business rules",
    "projectId": "project1",
    "topK": 3
  }
}
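Ranking by "vector similarity" with embeddings like these usually means cosine similarity: 1.0 for identical direction, 0 for unrelated. A generic illustration of the scoring (not the package's exact code, which delegates this to the database):

```javascript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```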

5. list_projects

List all projects with their document counts.

Example:

{
  "name": "list_projects",
  "arguments": {}
}

6. get_project_documents

Get all documents for a specific project.

Parameters:

  • projectId (string): Project ID to get documents for

Example:

{
  "name": "get_project_documents",
  "arguments": {
    "projectId": "project1"
  }
}

Integration with MCP Clients

Claude Desktop

Add to your Claude Desktop configuration:

{
  "mcpServers": {
    "ai-embedding": {
      "command": "node",
      "args": ["/path/to/your/ai-embedding/server.js"],
      "env": {
        "DATABASE_URL": "postgresql://username:password@localhost:5432/embedding_db",
        "HUGGINGFACE_API_KEY": "your_api_key"
      }
    }
  }
}

Other MCP Clients

The server follows the MCP protocol and can be integrated with any MCP-compatible client by running:

node server.js

Architecture

The MCP server is built on top of the existing application code:

  • server.js: Main MCP server with tool definitions
  • rag-ollama-embedding.js: RAG functionality with Ollama integration
  • rag-emedding.js: HuggingFace embedding functionality
  • emedding-docs.js: Document processing utilities

Supported File Types

  • Text files (.txt, .md, .mdc)
  • PDF files (.pdf)
  • Word documents (.docx)

Database Schema

The server uses a PostgreSQL database with the following schema:

CREATE TABLE documents (
    id SERIAL PRIMARY KEY,
    project_id VARCHAR(255) NOT NULL,
    file_name VARCHAR(500) NOT NULL,
    content TEXT NOT NULL,
    embedding vector(384),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
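Similarity searches against this table use pgvector's distance operators, e.g. `<=>` for cosine distance (smaller is more similar). A sketch of the parameterized query a Node client such as node-postgres might run (illustrative, not the package's actual SQL; `buildSearchQuery` is hypothetical):

```javascript
// Build a parameterized pgvector similarity query for node-postgres.
// pgvector accepts vector literals like '[0.1,0.2,0.3]', which
// JSON.stringify of a number array conveniently produces.
function buildSearchQuery(queryEmbedding, projectId, topK) {
  return {
    text:
      "SELECT file_name, content, embedding <=> $1 AS distance " +
      "FROM documents WHERE project_id = $2 " +
      "ORDER BY embedding <=> $1 LIMIT $3",
    values: [JSON.stringify(queryEmbedding), projectId, topK],
  };
}

const q = buildSearchQuery([0.1, 0.2, 0.3], "project1", 5);
console.log(q.values[0]); // "[0.1,0.2,0.3]"
```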

Troubleshooting

Common Issues

  • Database Connection Error: Ensure PostgreSQL is running and the connection string is correct
  • pgvector Extension Missing: Install the pgvector extension in your PostgreSQL database
  • Ollama Connection Error: Ensure Ollama is running and the models are pulled
  • HuggingFace API Error: Check your API key and rate limits

Debug Mode

Run with debug logging:

DEBUG=* npm start

Development

Adding New Tools

To add new tools to the MCP server:

  • Add the tool definition to the ListToolsRequestSchema handler
  • Add the tool implementation to the CallToolRequestSchema handler
  • Update this README with the new tool documentation
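In the MCP SDK, each tool advertised from the ListTools handler is an object with a name, a description, and a JSON-Schema `inputSchema` describing its arguments. A sketch of what a new tool definition might look like (the `delete_document` tool is hypothetical, not one of this server's tools):

```javascript
// Shape of a tool definition returned from the ListToolsRequestSchema handler.
// `inputSchema` is standard JSON Schema for the tool's arguments.
const deleteDocumentTool = {
  name: "delete_document",
  description: "Remove a document from the vector database",
  inputSchema: {
    type: "object",
    properties: {
      projectId: { type: "string", description: "Project the document belongs to" },
      fileName: { type: "string", description: "Name of the document to remove" },
    },
    required: ["projectId", "fileName"],
  },
};

console.log(deleteDocumentTool.inputSchema.required.length); // 2
```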

Testing

Test individual components:

# Test document ingestion
node -e "require('./rag-ollama-embedding.js').ingestDocument('./docs/business-rules.mdc', 'test')"

# Test chat functionality
node -e "require('./rag-ollama-embedding.js').chatWithRules('test message', 'test')"

License

ISC License - see LICENSE file for details.

Contributing

  • Fork the repository
  • Create a feature branch
  • Make your changes
  • Test thoroughly
  • Submit a pull request

Support

For issues and questions:

  • Check the troubleshooting section
  • Review the logs for error messages
  • Ensure all dependencies are properly installed
  • Verify your environment configuration

Keywords

mcp

Package last updated on 25 Oct 2025