A CLI tool for interacting with Model Context Protocol (MCP) servers using natural language. Built with mcp-use and powered by your choice of LLM provider (OpenAI, Anthropic, Google, or Mistral).
## Features
- 🤖 Natural language interface for MCP servers
- 🔧 Built-in filesystem MCP server support
- 💬 Interactive chat interface with tool call visualization
- ⚡ Direct integration with mcp-use (no API layer needed)
- 🚀 Single command installation
- 🔄 Multiple LLM providers (OpenAI, Anthropic, Google, Mistral)
- ⚙️ Slash commands for configuration (like Claude Code)
- 🔑 Smart API key prompting - automatically asks for keys when needed
- 💾 Persistent secure storage - encrypted keys and settings saved across sessions
## Install

```sh
$ npm install --global mcp-use-cli
```
## Quick Start

1. Install and run:

   ```sh
   $ npm install --global mcp-use-cli
   $ mcp-use-cli
   ```

2. Choose your model (the CLI handles API key setup automatically):

   ```
   /model openai gpt-4o-mini
   /model anthropic claude-3-5-sonnet-20241022
   /model google gemini-1.5-pro
   ```

3. Enter your API key when prompted.

Keys are stored securely encrypted in `~/.mcp-use-cli/config.json` and persist across sessions.
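Because keys persist on disk, it is worth checking the stored file's permissions. A minimal sketch, assuming only the storage path documented above (the encryption itself is handled internally by the CLI):

```shell
# Where mcp-use-cli keeps encrypted keys and settings (per the docs above)
CONFIG="$HOME/.mcp-use-cli/config.json"

# Restrict the file to your user only, if it already exists
if [ -f "$CONFIG" ]; then
  chmod 600 "$CONFIG"
fi

# To forget all stored keys and settings, delete the whole directory:
#   rm -rf ~/.mcp-use-cli
echo "config path: $CONFIG"
```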
## Alternative Setup

If you prefer environment variables:

```sh
export OPENAI_API_KEY=your_key_here
export ANTHROPIC_API_KEY=your_key_here
```
## Usage

```
$ mcp-use-cli --help

  Usage
    $ mcp-use-cli

  Options
    --name    Your name (optional)
    --config  Path to MCP configuration file (optional)

  Examples
    $ mcp-use-cli
    $ mcp-use-cli --name=Jane
    $ mcp-use-cli --config=./mcp-config.json
```
### Environment Variables

- `OPENAI_API_KEY` – your OpenAI API key (optional; if no key is set, the CLI prompts for one when needed)
- `ANTHROPIC_API_KEY` – your Anthropic API key (optional, same behavior)

### Setup

1. Run: `mcp-use-cli`
2. Pick a model with `/model` and enter an API key when prompted (or set one via environment variables beforehand)
3. Start chatting with MCP servers!
## Configuration

By default, the CLI connects to a filesystem MCP server in `/tmp`. You can provide a custom configuration file:
```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/directory"],
      "env": {}
    },
    "other-server": {
      "command": "your-mcp-server-command",
      "args": ["--arg1", "value1"],
      "env": {}
    }
  }
}
```
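For a more concrete second entry, the placeholder above could be replaced with another server from the official MCP collection. A sketch, assuming the `@modelcontextprotocol/server-memory` package (verify the package name against the modelcontextprotocol/servers repository):

```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/directory"],
      "env": {}
    },
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": {}
    }
  }
}
```

Then launch with `mcp-use-cli --config=./mcp-config.json`.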
## Slash Commands

Switch LLM providers and configure settings using slash commands (similar to Claude Code):

```
/setkey openai sk-1234567890abcdef...        # store a provider API key
/setkey anthropic sk-ant-1234567890abcdef... # (Anthropic keys start with sk-ant-)
/clearkeys                                   # remove all stored keys

/model openai gpt-4o                         # switch provider and model
/model anthropic claude-3-5-sonnet-20241022
/model google gemini-1.5-pro
/model mistral mistral-large-latest

/models                                      # list available models
/models anthropic                            # list models for one provider

/config temp 0.5                             # set sampling temperature
/config tokens 4000                          # set max tokens

/status                                      # show current model and settings
/help                                        # list all commands
```
## Chat Examples
- "List files in the current directory"
- "Create a new file called hello.txt with the content 'Hello, World!'"
- "Search for files containing 'TODO'"
- "What's the structure of this project?"
## Architecture

This CLI uses:

- Frontend: React + Ink for the terminal UI
- Agent: mcp-use MCPAgent for LLM + MCP integration
- LLM: OpenAI GPT-4o-mini by default, switchable via `/model`
- Transport: Direct TypeScript integration (no API layer)
## License
MIT