
A Model Context Protocol server enabling AI-to-AI collaboration across multiple providers.
ask-ai-mcp is a Model Context Protocol (MCP) server that lets an AI model consult another AI model through tool calling. The server acts as a bridge: an AI assistant can put a question to a different model when it needs help, an alternative perspective, a code review, or an expert opinion on a complex problem. It supports multiple AI providers and is designed for seamless integration with MCP-compatible clients.
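To make the bridging pattern concrete, the sketch below shows a minimal, hypothetical version of such a server written against the official MCP TypeScript SDK and the openai client. It is not the package's actual source; the fallback model name is a placeholder, and the real server also handles provider selection (PROVIDER) and the optional tuning variables documented later in this README.

```typescript
// Hypothetical sketch of the bridging pattern, not the package's actual source.
// An MCP tool receives a question from the host assistant, forwards it to the
// configured provider (OpenAI here), and returns the answer as tool output.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.API_KEY });
const server = new McpServer({ name: "ask-ai", version: "0.0.1" });

// Register the ask_ai tool; MCP clients discover and call it like any other tool.
server.tool(
  "ask_ai",
  "Ask another AI model a question, optionally with extra context.",
  { question: z.string(), context: z.string().optional() },
  async ({ question, context }) => {
    const completion = await openai.chat.completions.create({
      model: process.env.MODEL ?? "gpt-4o", // placeholder fallback for the sketch
      messages: [
        { role: "user", content: context ? `${context}\n\n${question}` : question },
      ],
    });
    return {
      content: [{ type: "text", text: completion.choices[0].message.content ?? "" }],
    };
  }
);

// Serve over stdio so MCP clients can launch the server as a subprocess.
await server.connect(new StdioServerTransport());
```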
The easiest way to use this MCP server is with npx:
{
  "mcpServers": {
    "ask-ai": {
      "command": "npx",
      "args": ["ask-ai-mcp"],
      "env": {
        "PROVIDER": "openai",
        "MODEL": "o3-2025-04-16",
        "API_KEY": "your_api_key_here"
      }
    }
  }
}
For development or customization:
git clone <link to repository>
cd ask-ai-mcp
pnpm install
pnpm run build
Then reference the built server in your MCP configuration:
{
  "mcpServers": {
    "ask-ai": {
      "command": "node",
      "args": ["/path/to/ask-ai-mcp/dist/server.js"],
      "env": {
        "PROVIDER": "openai",
        "MODEL": "o3-2025-04-16",
        "API_KEY": "your_api_key_here"
      }
    }
  }
}
Required environment variables:

Variable | Description
---|---
PROVIDER | The AI provider to use: openai, anthropic, google, perplexity, or openai-compatible.
MODEL | The model name from the selected provider.
API_KEY | Your API key for the chosen provider.

Optional environment variables:

Variable | Description
---|---
TEMPERATURE | Sampling temperature for the model (e.g., 0.7).
MAX_TOKENS | Maximum number of tokens to generate (e.g., 10000).
BASE_URL | Custom endpoint for openai-compatible providers.
REASONING_EFFORT | For reasoning models: low, medium, high, max, or a token budget string (e.g., "10000").
Example configurations by provider:

OpenAI:
PROVIDER=openai
MODEL=o3-2025-04-16
API_KEY=your_openai_api_key
# Optional
REASONING_EFFORT=medium  # low, medium, high, max, or budget like "10000"

Anthropic:
PROVIDER=anthropic
MODEL=claude-opus-4-20250514
API_KEY=your_anthropic_api_key

Google:
PROVIDER=google
MODEL=gemini-2.5-pro-preview-06-05
API_KEY=your_google_api_key

Perplexity:
PROVIDER=perplexity
MODEL=sonar-pro
API_KEY=your_perplexity_api_key

OpenAI-compatible:
PROVIDER=openai-compatible
MODEL=your_model_name
API_KEY=your_api_key
BASE_URL=https://your-endpoint.com/v1
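As a rough illustration of how these variables fit together, the sketch below shows one way a server like this might read and validate them at startup. The Config shape, checks, and error messages are assumptions for the example, not the package's actual code.

```typescript
// Illustrative only: one way the environment variables documented above could be
// read and validated at startup. Names and checks are assumptions, not actual code.
const PROVIDERS = ["openai", "anthropic", "google", "perplexity", "openai-compatible"] as const;
type Provider = (typeof PROVIDERS)[number];

interface Config {
  provider: Provider;
  model: string;
  apiKey: string;
  temperature?: number;     // TEMPERATURE, e.g. 0.7
  maxTokens?: number;       // MAX_TOKENS, e.g. 10000
  baseUrl?: string;         // BASE_URL, for openai-compatible endpoints
  reasoningEffort?: string; // REASONING_EFFORT: low | medium | high | max | budget string
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): Config {
  const provider = env.PROVIDER as Provider;
  if (!PROVIDERS.includes(provider)) {
    throw new Error(`PROVIDER must be one of: ${PROVIDERS.join(", ")}`);
  }
  if (!env.MODEL || !env.API_KEY) {
    throw new Error("MODEL and API_KEY are required");
  }
  return {
    provider,
    model: env.MODEL,
    apiKey: env.API_KEY,
    temperature: env.TEMPERATURE ? Number(env.TEMPERATURE) : undefined,
    maxTokens: env.MAX_TOKENS ? Number(env.MAX_TOKENS) : undefined,
    baseUrl: env.BASE_URL,
    reasoningEffort: env.REASONING_EFFORT,
  };
}
```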
Once integrated, AI models can use the ask_ai tool:
{
  "name": "ask_ai",
  "arguments": {
    "question": "How can I optimize this React component?",
    "context": "I have a React component that renders 1000+ items and re-renders frequently, causing performance issues. Current code: [include your code here]"
  }
}
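For clients built directly on the MCP TypeScript SDK, making that call programmatically might look roughly like the sketch below. The client name, version, and env values are placeholders; only the tool name and its question/context arguments come from this README.

```typescript
// Hypothetical end-to-end sketch: launch ask-ai-mcp over stdio with npx and call
// its ask_ai tool via the official MCP TypeScript SDK. Env values are placeholders.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["ask-ai-mcp"],
  env: { PROVIDER: "openai", MODEL: "o3-2025-04-16", API_KEY: "your_api_key_here" },
});

const client = new Client({ name: "ask-ai-example", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Matches the request shape shown above: a question plus optional context.
const result = await client.callTool({
  name: "ask_ai",
  arguments: {
    question: "How can I optimize this React component?",
    context: "It renders 1000+ items and re-renders frequently, causing performance issues.",
  },
});

console.log(result); // the consulted model's answer arrives as MCP text content
await client.close();
```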
This project is licensed under the GNU General Public License, version 2.