
ai-cli-mcp
MCP server for AI CLI tools (Claude, Codex, and Gemini) with background process management
📦 Package Migration Notice: This package was formerly `@mkxultra/claude-code-mcp` and has been renamed to `ai-cli-mcp` to reflect its expanded support for multiple AI CLI tools.
An MCP (Model Context Protocol) server that allows running AI CLI tools (Claude, Codex, and Gemini) in background processes with automatic permission handling.
Cursor sometimes struggles with complex, multi-step edits or operations. This server, with its unified `run` tool, lets multiple AI agents handle those coding tasks for you more effectively.
This MCP server provides tools that LLMs can use to drive AI CLI tools. When integrated with MCP clients, it allows LLMs to run Claude (`--dangerously-skip-permissions`), Codex (`--full-auto`), and Gemini (`-y`) non-interactively, manage those runs in the background, and collect their results.

You can instruct your main agent to run multiple tasks in parallel like this:
Launch agents for the following 3 tasks using acm mcp run:
- Refactor `src/backend` code using `sonnet`
- Create unit tests for `src/frontend` using `gpt-5.2-codex`
- Update docs in `docs/` using `gemini-2.5-pro`

While they run, please update the TODO list. Once done, use the `wait` tool to wait for all completions and report the results together.
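If you prefer driving the same fan-out from a shell instead of an agent prompt, the equivalent background runs can be started with the `ai-cli` CLI described below (a sketch; each `run` reports a PID, and `ai-cli ps` lists them):

```shell
# Start three background runs, one per task (flags per the ai-cli reference below).
ai-cli run --cwd "$PWD" --model sonnet --prompt "Refactor src/backend"
ai-cli run --cwd "$PWD" --model gpt-5.2-codex --prompt "Create unit tests for src/frontend"
ai-cli run --cwd "$PWD" --model gemini-2.5-pro --prompt "Update docs in docs/"
ai-cli ps                         # list the PIDs of the runs above
ai-cli wait 12345 --timeout 300   # then wait on each PID (12345 is a placeholder)
```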
You can reuse heavy context (like large codebases) using session IDs to save costs while running multiple tasks.
- First, use `acm mcp run` with `opus` to read all files in `src/` and understand the project structure.
- Use the `wait` tool to wait for completion and retrieve the `session_id` from the result.
- Using that `session_id`, run the following two tasks in parallel with `acm mcp run`:
  - Create refactoring proposals for `src/utils` using `sonnet`
  - Add architecture documentation to `README.md` using `gpt-5.2-codex`
- Finally, `wait` again to combine both results.
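In tool-call terms, the resume step above amounts to passing the `session_id` returned by the first run back into a later `run` call. An illustrative shape (values are placeholders, and the `model` key name is an assumption based on the models list in the tool reference below):

```json
{
  "name": "run",
  "arguments": {
    "prompt": "Create refactoring proposals for src/utils",
    "workFolder": "/absolute/path/to/project",
    "model": "sonnet",
    "session_id": "abc123-example"
  }
}
```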
The only prerequisite is that the AI CLI tools you want to use are locally installed and correctly configured.
For Claude: `claude doctor` passes, and execution with `--dangerously-skip-permissions` is approved (you must run it manually once to log in and accept the terms).

There are now two primary ways to use this package:
- `ai-cli-mcp`: MCP server entrypoint
- `ai-cli`: human-facing CLI for background AI runs

The recommended way to use the MCP server is via `npx`:
"ai-cli-mcp": {
"command": "npx",
"args": [
"-y",
"ai-cli-mcp@latest"
]
},
claude mcp add ai-cli '{"name":"ai-cli","command":"npx","args":["-y","ai-cli-mcp@latest"]}'
If you want to use the production CLI directly from your shell, install the package globally:
npm install -g ai-cli-mcp
This exposes both commands:

- `ai-cli`
- `ai-cli-mcp`

Examples:
ai-cli doctor
ai-cli models
ai-cli run --cwd "$PWD" --model sonnet --prompt "summarize this repository"
ai-cli ps
ai-cli result 12345
ai-cli wait 12345 --timeout 300
ai-cli kill 12345
ai-cli cleanup
Running `ai-cli` via `npx`

Because the published package name is still `ai-cli-mcp`, the shortest `npx` form for the CLI is:
npx -y --package ai-cli-mcp@latest ai-cli run --cwd "$PWD" --model sonnet --prompt "hello"
Before the MCP server can use Claude, you must first run the Claude CLI manually once with the `--dangerously-skip-permissions` flag, log in, and accept the terms:
npm install -g @anthropic-ai/claude-code
claude --dangerously-skip-permissions
Follow the prompts to accept. Once this is done, the MCP server will be able to use the flag non-interactively.
For Codex, ensure you're logged in and have accepted any necessary terms:
codex login
For Gemini, ensure you're logged in and have configured your credentials:
gemini auth login
macOS might ask for folder permissions the first time any of these tools run. If the first run fails, subsequent runs should work.
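A quick way to confirm the prerequisite binaries are present before wiring up the server (this only checks PATH resolution, much like `doctor`; it does not verify login state or terms acceptance):

```shell
# Check that each AI CLI resolves on PATH (default binary names; override
# via CLAUDE_CLI_NAME / CODEX_CLI_NAME / GEMINI_CLI_NAME if customized).
for tool in claude codex gemini; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool -> $(command -v "$tool")"
  else
    echo "missing: $tool (install it and log in before using ai-cli-mcp)"
  fi
done
```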
ai-cli currently supports:
- `run`
- `ps`
- `result`
- `wait`
- `kill`
- `cleanup`
- `doctor`
- `models`
- `mcp`

Example flow:
ai-cli doctor
ai-cli models
ai-cli run --cwd "$PWD" --model codex-ultra --prompt "fix failing tests"
ai-cli ps
ai-cli wait 12345
ai-cli result 12345
ai-cli cleanup
run accepts --cwd as the primary working-directory flag and also accepts the older aliases --workFolder / --work-folder for compatibility.
doctor checks only binary existence and path resolution. It does not verify login state or terms acceptance.
Background CLI runs are stored under:
~/.local/state/ai-cli/cwds/<normalized-cwd>/<pid>/
Each PID directory contains:
- `meta.json`
- `stdout.log`
- `stderr.log`

Use `ai-cli cleanup` to remove completed and failed runs. Running processes are preserved.
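Because runs are stored as plain directories, you can inspect a finished run directly from the shell; for example (`12345` is a placeholder PID):

```shell
# Locate the state directory for a given PID and read its artifacts.
pid=12345   # placeholder; use a real PID from `ai-cli ps`
run_dir=$(ls -d "$HOME"/.local/state/ai-cli/cwds/*/"$pid" 2>/dev/null | head -n 1)
if [ -n "$run_dir" ]; then
  cat "$run_dir/meta.json"          # run metadata
  tail -n 20 "$run_dir/stdout.log"  # last lines of agent output
fi
```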
Detached ai-cli runs do not currently persist natural process exit codes. As a result, the CLI can report process output and running/completed state, but it does not yet guarantee exitCode for naturally finished background runs.
After setting up the server, add the configuration to your MCP client's settings file (e.g., mcp.json for Cursor, mcp_config.json for Windsurf).
If the file doesn't exist, create it and add the ai-cli-mcp configuration.
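For Cursor, a minimal `mcp.json` might look like this (the top-level `mcpServers` key follows Cursor's convention; adjust the wrapper for your client):

```json
{
  "mcpServers": {
    "ai-cli-mcp": {
      "command": "npx",
      "args": ["-y", "ai-cli-mcp@latest"]
    }
  }
}
```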
This server exposes the following tools:
run

Executes a prompt using Claude CLI, Codex CLI, or Gemini CLI. The appropriate CLI is automatically selected based on the model name.
Arguments:
- `prompt` (string, optional): The prompt to send to the AI agent. Either `prompt` or `prompt_file` is required.
- `prompt_file` (string, optional): Path to a file containing the prompt. Either `prompt` or `prompt_file` is required. Can be an absolute path or relative to `workFolder`.
- `workFolder` (string, required): The working directory for the CLI execution. Must be an absolute path.
Models:

- Presets: `claude-ultra` (defaults to high effort), `codex-ultra` (defaults to xhigh reasoning), `gemini-ultra`
- Claude: `sonnet`, `sonnet[1m]`, `opus`, `opusplan`, `haiku`
- Codex: `gpt-5.4`, `gpt-5.3-codex`, `gpt-5.2-codex`, `gpt-5.1-codex-mini`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1`, `gpt-5`
- Gemini: `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-3.1-pro-preview`, `gemini-3-pro-preview`, `gemini-3-flash-preview`

Further arguments:

- `reasoning_effort` (string, optional): Reasoning control for Claude and Codex. Claude uses `--effort` (allowed: "low", "medium", "high"). Codex uses `model_reasoning_effort` (allowed: "low", "medium", "high", "xhigh").
- `session_id` (string, optional): Optional session ID to resume a previous session. Supported for: `haiku`, `sonnet`, `opus`, `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-3.1-pro-preview`, `gemini-3-pro-preview`, `gemini-3-flash-preview`.

wait

Waits for multiple AI agent processes to complete and returns their combined results. Blocks until all specified PIDs finish or a timeout occurs.
Arguments:
- `pids` (array of numbers, required): List of process IDs to wait for (returned by the `run` tool).
- `timeout` (number, optional): Maximum wait time in seconds. Defaults to 180 (3 minutes).

list_processes

Lists all running and completed AI agent processes with their status, PID, and basic info.
get_result

Gets the current output and status of an AI agent process by PID.
Arguments:
- `pid` (number, required): The process ID returned by the `run` tool.

kill_process

Terminates a running AI agent process by PID.
Arguments:
- `pid` (number, required): The process ID to terminate.

Troubleshooting:

- If the server fails to start via `npx`, ensure `npx` itself is working.
- Command not found (`ai-cli`): If installed globally, ensure your npm global bin directory is in PATH. If using npx, use `npx -y --package ai-cli-mcp@latest ai-cli ...`.
- For Claude CLI problems, run `claude doctor` or check its documentation.
- If `MCP_CLAUDE_DEBUG` is `true`, error messages or logs might interfere with MCP's JSON parsing. Set it to `false` for normal operation.

For development setup, testing, and contribution guidelines, see the Development Guide.
Normally not required, but useful for customizing CLI paths or debugging.
- `CLAUDE_CLI_NAME`: Override the Claude CLI binary name or provide an absolute path (default: `claude`)
- `CODEX_CLI_NAME`: Override the Codex CLI binary name or provide an absolute path (default: `codex`)
- `GEMINI_CLI_NAME`: Override the Gemini CLI binary name or provide an absolute path (default: `gemini`)
- `MCP_CLAUDE_DEBUG`: Enable debug logging (set to `true` for verbose output)

CLI Name Specification:
- `CLAUDE_CLI_NAME=claude-custom`
- `CLAUDE_CLI_NAME=/path/to/custom/claude`

Relative paths are not supported.

Example configuration:

"ai-cli-mcp": {
"command": "npx",
"args": [
"-y",
"ai-cli-mcp@latest"
],
"env": {
"CLAUDE_CLI_NAME": "claude-custom",
"CODEX_CLI_NAME": "codex-custom"
}
},
MIT