# cursor-agent-api-proxy

中文文档 (Chinese documentation)
OpenAI-compatible API proxy for the Cursor CLI. Lets any OpenAI client use your Cursor subscription.
## Prerequisites

- Node.js 20+
- Active Cursor subscription (Pro / Business)
## Install

1. Install the Cursor CLI and log in:

   ```shell
   # macOS / Linux
   curl https://cursor.com/install -fsS | bash

   # Windows (PowerShell)
   irm 'https://cursor.com/install?win32=true' | iex

   agent login
   agent --list-models
   ```

   Headless? Skip `agent login`, generate a key at cursor.com/settings and `export CURSOR_API_KEY=<key>`.
2. Install and start the proxy:

   ```shell
   npm install -g cursor-agent-api-proxy
   cursor-agent-api          # start the proxy
   cursor-agent-api status   # confirm it is running
   ```
3. Verify:

   ```shell
   curl http://localhost:4646/health
   ```
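The same check can be scripted. A minimal sketch using only the standard library; the `health_url` and `check_health` helpers are illustrative, not part of the proxy:

```python
import urllib.request
import urllib.error

def health_url(port: int = 4646) -> str:
    """Build the proxy's health-check URL (default port 4646)."""
    return f"http://localhost:{port}/health"

def check_health(port: int = 4646) -> bool:
    """Return True if the proxy answers on /health, False otherwise."""
    try:
        with urllib.request.urlopen(health_url(port), timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("proxy up:", check_health())
```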
Other commands:

```shell
cursor-agent-api stop
cursor-agent-api restart
cursor-agent-api start 8080   # start on a custom port
cursor-agent-api run
```

Logs: `~/.cursor-agent-api/server.log`
## Use with OpenClaw

### First-time setup (onboarding wizard)

If you haven't set up OpenClaw yet, run the onboarding wizard:

```shell
openclaw onboard
```

When the wizard asks you to configure Model/Auth:

- Provider type → choose Custom Provider (OpenAI-compatible)
- Base URL → `http://localhost:4646/v1`
- API Key → type `not-needed` (if you ran `agent login`)
- Default model → `auto` (or any model from `agent --list-models`)
### Existing setup (edit config)

Already have OpenClaw running? Edit the config file directly:

```json5
{
  env: {
    // "not-needed" = already logged in via `agent login`
    // or set your Cursor API key here to forward it per-request
    OPENAI_API_KEY: "not-needed",
    OPENAI_BASE_URL: "http://localhost:4646/v1",
  },
  agents: {
    defaults: {
      model: { primary: "openai/auto" },
    },
  },
}
```
## Models

Model IDs match `agent --list-models` output directly:

```
auto
gpt-5.2
gpt-5.3-codex
opus-4.6-thinking
sonnet-4.5-thinking
gemini-3-pro
```

Full list: `curl http://localhost:4646/v1/models` or `agent --list-models`.
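Since `/v1/models` follows the standard OpenAI list shape, extracting the IDs is a one-liner. A sketch; the sample response below is hand-written for illustration, a live call would hit the endpoint above:

```python
import json

# Hand-written sample in the OpenAI /v1/models list shape;
# the live equivalent is: curl http://localhost:4646/v1/models
sample = json.loads(
    '{"object": "list", "data": ['
    '{"id": "auto", "object": "model"},'
    '{"id": "gpt-5.2", "object": "model"}]}'
)

def model_ids(resp: dict) -> list:
    """Pull just the model IDs out of a /v1/models response."""
    return [m["id"] for m in resp["data"]]

print(model_ids(sample))  # → ['auto', 'gpt-5.2']
```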
## API

| Endpoint | Method | Description |
| --- | --- | --- |
| `/health` | GET | Health check |
| `/v1/models` | GET | List models |
| `/v1/chat/completions` | POST | Chat completion (supports `stream: true`) |
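With `stream: true`, an OpenAI-compatible server emits Server-Sent Events as `data: {...}` chunk lines ending in `data: [DONE]`. A sketch of reassembling the text, fed canned chunks here rather than a live stream (the chunk payloads are illustrative):

```python
import json

def parse_sse_content(raw: str) -> str:
    """Collect assistant text from OpenAI-style streaming SSE chunks."""
    parts = []
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

sample = (
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo!"}}]}\n'
    "data: [DONE]\n"
)
print(parse_sse_content(sample))  # → Hello!
```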
## Configuration

| Variable | Default | Description |
| --- | --- | --- |
| `PORT` | 4646 | Listen port (or `cursor-agent-api start 8080`) |
| `CURSOR_API_KEY` | - | Alternative to `agent login` |
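A client that needs to find the proxy can mirror the same resolution, assuming the proxy reads `PORT` with a 4646 fallback as listed above (`listen_port` is an illustrative helper, not proxy code):

```python
import os

def listen_port(env=None) -> int:
    """Resolve the port: PORT env var if set, else the 4646 default."""
    if env is None:
        env = os.environ
    return int(env.get("PORT", "4646"))

print(listen_port({}))                 # → 4646
print(listen_port({"PORT": "8080"}))   # → 8080
```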
## Auto-start (boot)

To start the proxy automatically on system boot:

```shell
cursor-agent-api install     # register the boot service
cursor-agent-api uninstall   # remove it
```

- macOS → LaunchAgent
- Windows → Task Scheduler
- Linux → systemd user service
## Other Clients

### Python (openai SDK)

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4646/v1",
    api_key="not-needed",
)

resp = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```
### Continue.dev

```json
{
  "models": [{
    "title": "Cursor",
    "provider": "openai",
    "model": "auto",
    "apiBase": "http://localhost:4646/v1",
    "apiKey": "not-needed"
  }]
}
```
### curl

```shell
curl -X POST http://localhost:4646/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"auto","messages":[{"role":"user","content":"Hello!"}]}'
```
## How it Works

```
Client → POST /v1/chat/completions (OpenAI format)
  → cursor-agent-api-proxy
  → spawn agent CLI (stream-json)
  → Cursor subscription
  → AI response → OpenAI format → Client
```
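The body the client POSTs in the first step is plain OpenAI chat-completion JSON, so it can be built without any SDK. A sketch; `chat_payload` is an illustrative helper, not part of the proxy:

```python
import json

def chat_payload(prompt: str, model: str = "auto", stream: bool = False) -> str:
    """Build the OpenAI-format body POSTed to /v1/chat/completions."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    })

print(chat_payload("Hello!"))
```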
## Contributing

```shell
git clone https://github.com/tageecc/cursor-agent-api-proxy.git
cd cursor-agent-api-proxy
pnpm install && pnpm run build
pnpm start
```
## License

MIT