A proxy server providing Claude API access through two simultaneous modes:

- **SDK Mode** (`/sdk/*`) - Uses the locally installed Claude Code CLI via the Agent SDK
- **OAuth Mode** (`/api/*`) - Direct API calls with OAuth tokens

Each mode supports both Anthropic and OpenAI-compatible formats:

- `/sdk/anthropic/*` or `/api/anthropic/*` - Anthropic Messages API format
- `/sdk/openai/*` or `/api/openai/*` - OpenAI Chat Completions API format

Both modes run simultaneously. Choose based on your setup.
graph LR
Client[Client<br/>SDKs, Apps, CLI] --> Proxy[HKCC Proxy Server]
Proxy --> SDK[SDK Mode<br/>/sdk/*]
Proxy --> OAuth[OAuth Mode<br/>/api/*]
SDK --> CLI[Claude Code CLI] --> API[Claude API]
OAuth --> Transform[Transform Request<br/>+ OAuth Token<br/>+ Magic Headers] --> API
Proxy --> DB[(Database<br/>Users & Credentials)]
style SDK fill:#e3f2fd
style OAuth fill:#fff3e0
style DB fill:#e8f5e9
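The `/{mode}/{format}/*` layout in the diagram can be sketched as a small path parser. This is an illustrative helper (`parseRoute` is hypothetical, not part of the hkcc codebase):

```typescript
// Hypothetical sketch of how a request path could be classified into
// mode (sdk vs api) and wire format (anthropic vs openai).
type Mode = "sdk" | "api";
type Format = "anthropic" | "openai";

interface Route {
  mode: Mode;      // "sdk" -> Claude Code CLI, "api" -> direct OAuth calls
  format: Format;  // wire format the client speaks
  rest: string;    // remainder of the path, e.g. "/v1/messages"
}

function parseRoute(path: string): Route | null {
  const m = path.match(/^\/(sdk|api)\/(anthropic|openai)(\/.*)$/);
  if (!m) return null; // not a proxied route (e.g. /health, /auth/*)
  return { mode: m[1] as Mode, format: m[2] as Format, rest: m[3] };
}

console.log(parseRoute("/sdk/anthropic/v1/messages"));
// e.g. { mode: "sdk", format: "anthropic", rest: "/v1/messages" }
```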
- **Two Modes**: SDK mode (`/sdk/*`) and OAuth mode (`/api/*`) running side by side
- **Multi-User Support**: Session cookies and per-user API keys with database-backed credential storage
⚠️ **Runtime Requirement**: This package MUST be run with Bun (>=1.0.0). It uses `bun:sqlite` under the hood for the database layer, which is not compatible with Node.js.
See TECHNICAL.md for detailed architecture diagrams and implementation details.
bun install
bun dev
Server starts at http://127.0.0.1:3000
Build both server and frontend dashboard:
bun run build.ts
This creates:
dist/
├── index.js # Bundled server
└── public/ # Frontend dashboard
├── index.css
├── index.html
└── index.js
Run the built server:
bun run dist/index.js
HKCC can be run as a CLI tool via npx or bunx, perfect for quick local usage without Docker.
⚠️ **Bun Required**: Even when using `npx`, Bun must be installed on your system because the bundled CLI uses `bun:sqlite` internally.
# Via bunx (recommended - uses Bun directly)
bunx hkcc --help
# Via npx (Bun must be installed on system)
npx hkcc --help
# Or run directly from the repo
bun run bin/hkcc.ts --help
Local SQLite (simplest setup):
# Run with local SQLite database in current directory
bunx hkcc --sqlite-path ./data/hkcc.db --db-driver bun-sqlite
# Or run from anywhere
bunx hkcc --sqlite-path ~/hkcc-data/db.sqlite --port 8080
Turso Cloud Database:
bunx hkcc \
--db-driver turso \
--turso-url libsql://your-db.turso.io \
--turso-token your-auth-token \
--port 3000
CLI Arguments:
| Argument | Alias | Description | Default |
|---|---|---|---|
| `--port` | `-p` | Server port | `3000` or `PORT` env var |
| `--host` | `-h` | Server host | `127.0.0.1` or `HOST` env var |
| `--db-driver` | | Database driver (`turso` or `bun-sqlite`) | Auto-detect |
| `--turso-url` | | Turso connection URL | `TURSO_CONNECTION_URL` env var |
| `--turso-token` | | Turso auth token | `TURSO_AUTH_TOKEN` env var |
| `--sqlite-path` | | SQLite database path (relative to CWD) | `./data/hkcc.db` |
| `--claude-path` | | Path to Claude Code CLI executable | Auto-detect |
| `--production` | | Enable production mode (static file serving) | Enabled |
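The precedence implied by the table (CLI flag, then environment variable, then built-in default) can be sketched as follows; `resolvePort` is a hypothetical helper, not the actual `bin/hkcc.ts` logic:

```typescript
// Hypothetical sketch of option resolution: flag > env var > default.
function resolvePort(
  args: Record<string, string>,
  env: Record<string, string | undefined>,
): number {
  const raw = args["--port"] ?? args["-p"] ?? env.PORT ?? "3000";
  const port = Number(raw);
  if (!Number.isInteger(port) || port < 1 || port > 65535) {
    throw new Error(`Invalid port: ${raw}`);
  }
  return port;
}

console.log(resolvePort({ "--port": "8080" }, { PORT: "4000" })); // 8080 (flag wins)
console.log(resolvePort({}, { PORT: "4000" }));                   // 4000 (env wins)
console.log(resolvePort({}, {}));                                 // 3000 (default)
```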
Key Features:
CLI vs Server Mode:
| Feature | CLI (bunx hkcc) | Server (bun run src/index.ts) |
|---|---|---|
| Entry Point | bin/hkcc.ts | src/index.ts |
| Args | CLI arguments | Environment variables |
| CWD | Database relative to CWD | Database relative to project root |
| Use Case | Quick start, any directory | Development in project |
| Static Files | Production mode | Dev mode (hot reload) |
The server hosts a web dashboard at the root URL (/) for managing OAuth credentials and viewing server status. The dashboard is built with React and Tailwind CSS.
All images require database configuration via environment variables.
# Default (with Claude Code CLI, bundled JS)
docker build -f Dockerfile -t hkcc:latest .
docker run -p 3000:3000 \
-e TURSO_CONNECTION_URL=libsql://your-db.turso.io \
-e TURSO_AUTH_TOKEN=your-token \
-v hkcc-claude:/home/claude/.claude \
hkcc:latest
# OAuth-only (smaller image, bundled JS)
docker build -f Dockerfile.oauth -t hkcc:oauth .
docker run -p 3000:3000 \
-e TURSO_CONNECTION_URL=libsql://your-db.turso.io \
-e TURSO_AUTH_TOKEN=your-token \
hkcc:oauth
# Compiled binary (no Bun runtime needed, with Claude Code CLI)
docker build -f Dockerfile.compiled -t hkcc:compiled .
docker run -p 3000:3000 \
-e TURSO_CONNECTION_URL=libsql://your-db.turso.io \
-e TURSO_AUTH_TOKEN=your-token \
-v hkcc-claude:/home/claude/.claude \
hkcc:compiled
# Compiled binary (no Bun runtime, OAuth-only, smallest)
docker build -f Dockerfile.compiled-oauth -t hkcc:compiled-oauth .
docker run -p 3000:3000 \
-e TURSO_CONNECTION_URL=libsql://your-db.turso.io \
-e TURSO_AUTH_TOKEN=your-token \
hkcc:compiled-oauth
Volume mounts:
- SDK-enabled images (`Dockerfile`, `Dockerfile.compiled`): Volume at `/home/claude/.claude`
- OAuth-only images (`Dockerfile.oauth`, `Dockerfile.compiled-oauth`): No volume needed
Build images for both ARM64 (Apple Silicon) and AMD64 (Intel/AMD) architectures:
# Create and use buildx builder (one-time setup)
docker buildx create --use
# Build for multiple platforms and push to registry
docker buildx build --platform linux/amd64,linux/arm64 \
-t your-registry/hkcc:latest \
-f Dockerfile \
--push .
# Build for multiple platforms (load local - only works for single arch)
docker buildx build --platform linux/amd64,linux/arm64 \
-t hkcc:latest \
-f Dockerfile \
--output type=local,dest=./output .
Note: All Dockerfiles now support Docker buildx for multi-platform builds. The compiled variants automatically detect the target platform and compile the appropriate binary.
| Dockerfile | Claude Code CLI | Runtime | Base OS | Size | Best For |
|---|---|---|---|---|---|
| `Dockerfile` | Yes | Bun JS | Alpine | 327MB | Development, hot reload, full features |
| `Dockerfile.oauth` | No | Bun JS | Alpine | 107MB | OAuth-only deployments, smaller footprint |
| `Dockerfile.compiled` | Yes | Native binary | Debian | 425MB | Production with SDK mode, faster startup |
| `Dockerfile.compiled-oauth` | No | Native binary | Debian | 204MB | Production OAuth-only, smallest image |
Key Differences:

- **Bundled JS** (`.js`): Requires the Bun runtime at runtime; faster builds than the compiled variants.
- **Claude Code CLI** (SDK mode, `/sdk/*`): Adds ~220MB to image size.

For easier deployment and configuration management, use Docker Compose. Create a `docker-compose.yml` file:
Basic setup with Turso cloud database (recommended for production):
version: "3.8"
services:
hkcc:
image: huakunshen/hkcc:latest # or build locally: build: .
container_name: hkcc-proxy
restart: unless-stopped
ports:
- "3000:3000"
environment:
# Server configuration
- HOST=0.0.0.0
- PORT=3000
# Database - Turso (cloud)
- TURSO_CONNECTION_URL=libsql://your-db.turso.io
- TURSO_AUTH_TOKEN=your-auth-token
# Optional: Disable registration after creating accounts
# - HKCC_REGISTRATION_ENABLED=false
volumes:
# Persist Claude Code CLI authentication (SDK mode only)
- hkcc-claude:/home/claude/.claude
volumes:
hkcc-claude:
OAuth-only deployment (smaller image, no SDK mode):
version: "3.8"
services:
hkcc:
image: huakunshen/hkcc:oauth # Uses OAuth-only image (smaller)
container_name: hkcc-oauth-proxy
restart: unless-stopped
ports:
- "3000:3000"
environment:
- HOST=0.0.0.0
- PORT=3000
- TURSO_CONNECTION_URL=libsql://your-db.turso.io
- TURSO_AUTH_TOKEN=your-auth-token
Using local SQLite database (no external database needed):
version: "3.8"
services:
hkcc:
image: huakunshen/hkcc:latest
container_name: hkcc-proxy
restart: unless-stopped
ports:
- "3000:3000"
environment:
- HOST=0.0.0.0
- PORT=3000
# No TURSO variables = uses local SQLite at ./data/hkcc.db
# Optional: Custom database path
# - SQLITE_DATABASE_PATH=/app/data/hkcc.db
volumes:
# Persist local SQLite database
- hkcc-data:/app/data
# Persist Claude Code CLI authentication (SDK mode)
- hkcc-claude:/home/claude/.claude
volumes:
hkcc-data:
hkcc-claude:
Compiled binary with faster startup (production-optimized):
version: "3.8"
services:
hkcc:
image: huakunshen/hkcc:compiled # Native binary, no Bun runtime
container_name: hkcc-compiled
restart: unless-stopped
ports:
- "3000:3000"
environment:
- HOST=0.0.0.0
- PORT=3000
- TURSO_CONNECTION_URL=libsql://your-db.turso.io
- TURSO_AUTH_TOKEN=your-auth-token
volumes:
- hkcc-claude:/home/claude/.claude
volumes:
hkcc-claude:
Using environment file for secrets:
Create a .env file (git-ignored):
# .env
TURSO_CONNECTION_URL=libsql://your-db.turso.io
TURSO_AUTH_TOKEN=your-auth-token
HKCC_REGISTRATION_ENABLED=false
Update docker-compose.yml to use it:
services:
hkcc:
image: huakunshen/hkcc:latest
container_name: hkcc-proxy
restart: unless-stopped
ports:
- "3000:3000"
env_file:
- .env
volumes:
- hkcc-claude:/home/claude/.claude
volumes:
hkcc-claude:
Deploy with Docker Compose:
# Start the service
docker compose up -d
# View logs
docker compose logs -f
# Stop the service
docker compose down
# Stop and remove volumes (WARNING: deletes data)
docker compose down -v
# Rebuild with local changes
docker compose up -d --build
Key considerations:
| Image Variant | SDK Mode | OAuth Mode | Volume Needed | Best Use Case |
|---|---|---|---|---|
| `hkcc:latest` | ✅ | ✅ | `/home/claude/.claude` | Full features, development |
| `hkcc:oauth` | ❌ | ✅ | ❌ | OAuth-only, smaller image |
| `hkcc:compiled` | ✅ | ✅ | `/home/claude/.claude` | Production, faster startup |
| `hkcc:compiled-oauth` | ❌ | ✅ | ❌ | OAuth-only, production |
Database options:
- **Turso (cloud)**: Set `TURSO_CONNECTION_URL` and `TURSO_AUTH_TOKEN` - best for production and multi-instance setups
- **Local SQLite**: Stored at `/app/data/hkcc.db` - best for single-instance setups and development

Verify the server is healthy:

curl http://127.0.0.1:3000/health
curl http://127.0.0.1:3000/oauth/status
SDK Mode (`/sdk/*`) - Requires Claude Code CLI

# Anthropic Messages API
curl -X POST http://127.0.0.1:3000/sdk/anthropic/v1/messages \
-H "Content-Type: application/json" \
-d '{"model":"claude-sonnet-4-5-20250929","max_tokens":1024,"messages":[{"role":"user","content":"Hello!"}]}'
# OpenAI Chat Completions API
curl -X POST http://127.0.0.1:3000/sdk/openai/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello!"}]}'
OAuth Mode (`/api/*`) - Requires OAuth Authentication

# Anthropic Messages API
curl -X POST http://127.0.0.1:3000/api/anthropic/v1/messages \
-H "Content-Type: application/json" \
-d '{"model":"claude-sonnet-4-5-20250929","max_tokens":1024,"messages":[{"role":"user","content":"Hello!"}]}'
# OpenAI Chat Completions API
curl -X POST http://127.0.0.1:3000/api/openai/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{"model":"gpt-4","messages":[{"role":"user","content":"Hello!"}]}'
| OpenAI Model | Claude Model |
|---|---|
| `gpt-4`, `gpt-4-turbo`, `gpt-4o` | `claude-sonnet-4-5-20250929` |
| `gpt-3.5-turbo` | `claude-haiku-3-5-20241022` |
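The mapping in the table can be expressed as a lookup; `mapModel` is a hypothetical helper, and the passthrough behavior for unknown names is an assumption, not confirmed by the source:

```typescript
// Hypothetical sketch of the OpenAI -> Claude model mapping from the table above.
const MODEL_MAP: Record<string, string> = {
  "gpt-4": "claude-sonnet-4-5-20250929",
  "gpt-4-turbo": "claude-sonnet-4-5-20250929",
  "gpt-4o": "claude-sonnet-4-5-20250929",
  "gpt-3.5-turbo": "claude-haiku-3-5-20241022",
};

// Assumption: unknown names pass through unchanged, since clients may
// already be sending a Claude model id.
function mapModel(openaiModel: string): string {
  return MODEL_MAP[openaiModel] ?? openaiModel;
}

console.log(mapModel("gpt-4o")); // "claude-sonnet-4-5-20250929"
```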
The server stores user accounts, OAuth credentials, and API keys in a database. Both cloud (Turso) and local SQLite databases are supported.
The server automatically detects which database to use:
- If `TURSO_CONNECTION_URL` and `TURSO_AUTH_TOKEN` are set → uses Turso (cloud)
- Otherwise → uses local SQLite at `./data/hkcc.db`

For new users, no configuration is needed - the server will automatically use local SQLite.
# No configuration needed - uses ./data/hkcc.db by default
bun dev
To customize the database file location:
SQLITE_DATABASE_PATH=./custom/path/db.sqlite
Set the database environment variables in .env:
TURSO_CONNECTION_URL=libsql://your-db.turso.io
TURSO_AUTH_TOKEN=your-auth-token
Override auto-detection with:
DB_DRIVER=turso # Force Turso
# or
DB_DRIVER=bun-sqlite # Force local SQLite
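Putting the auto-detection and the `DB_DRIVER` override together, the selection logic can be sketched as follows (`detectDbDriver` is a hypothetical helper, not the server's actual code):

```typescript
// Hypothetical sketch of driver selection:
// explicit DB_DRIVER override > Turso env vars > local SQLite fallback.
type DbDriver = "turso" | "bun-sqlite";

function detectDbDriver(env: Record<string, string | undefined>): DbDriver {
  // Explicit override wins.
  const override = env.DB_DRIVER;
  if (override === "turso" || override === "bun-sqlite") return override;
  // Both Turso variables set -> cloud database.
  if (env.TURSO_CONNECTION_URL && env.TURSO_AUTH_TOKEN) return "turso";
  // Otherwise fall back to local SQLite.
  return "bun-sqlite";
}

console.log(detectDbDriver({})); // "bun-sqlite"
console.log(detectDbDriver({
  TURSO_CONNECTION_URL: "libsql://your-db.turso.io",
  TURSO_AUTH_TOKEN: "your-auth-token",
})); // "turso"
```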
Migrations run automatically on server startup. For manual migration management:
bun drizzle-kit push # Apply schema to database
bun drizzle-kit studio # Open database GUI
# Register
curl -X POST http://127.0.0.1:3000/auth/register \
-H "Content-Type: application/json" \
-d '{"email": "user@example.com", "password": "password123", "username": "myuser"}'
# Login
curl -X POST http://127.0.0.1:3000/auth/login \
-H "Content-Type: application/json" \
-d '{"email": "user@example.com", "password": "password123"}'
# Get current user
curl http://127.0.0.1:3000/auth/me -b "hkcc_session=SESSION_TOKEN"
# Logout
curl -X POST http://127.0.0.1:3000/auth/logout -b "hkcc_session=SESSION_TOKEN"
Registration can be disabled by setting HKCC_REGISTRATION_ENABLED=false after creating your account.
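A minimal sketch of that gate, assuming registration stays open unless the flag is explicitly set to the string `"false"` (`isRegistrationEnabled` is a hypothetical helper):

```typescript
// Hypothetical sketch: registration is enabled by default and only
// disabled when HKCC_REGISTRATION_ENABLED is exactly "false".
function isRegistrationEnabled(env: Record<string, string | undefined>): boolean {
  return env.HKCC_REGISTRATION_ENABLED !== "false";
}

console.log(isRegistrationEnabled({}));                                     // true
console.log(isRegistrationEnabled({ HKCC_REGISTRATION_ENABLED: "false" })); // false
```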
After logging in, authenticate with OAuth providers via the dashboard or API:
# Get Anthropic authorization URL
curl http://127.0.0.1:3000/oauth/anthropic/login?mode=max
# Get OpenAI/Codex authorization URL
curl http://127.0.0.1:3000/oauth/openai/login
# Check OAuth status
curl http://127.0.0.1:3000/oauth/anthropic/status
curl http://127.0.0.1:3000/oauth/openai/status
The server uses OAuth 2.0 with PKCE (Proof Key for Code Exchange) for secure authentication:
sequenceDiagram
participant User
participant Browser
participant Proxy as HKCC Proxy
participant Claude as claude.ai
User->>Browser: Click "Login with Claude"
Browser->>Proxy: GET /oauth/anthropic/login
Proxy->>Proxy: Generate PKCE<br/>code_verifier<br/>code_challenge
Proxy-->>Browser: Authorization URL + state
Browser->>Claude: Redirect to authorization URL
Claude->>User: Show consent screen
User->>Claude: Approve access
Claude-->>Browser: Redirect with auth code
Browser->>Proxy: GET /oauth/anthropic/callback?code=xxx
Proxy->>Claude: POST /oauth/token<br/>(code + verifier)
Claude-->>Proxy: access_token + refresh_token
Proxy->>Proxy: Store credentials in database
Proxy-->>Browser: Redirect to dashboard
Browser->>User: Authentication complete
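The PKCE step in the diagram (generate `code_verifier`, derive `code_challenge`) follows RFC 7636's S256 method. A sketch using `node:crypto`, illustrative only and not the proxy's actual code:

```typescript
// Sketch of PKCE value generation (RFC 7636, S256 method).
import { createHash, randomBytes } from "node:crypto";

function generatePkce() {
  // code_verifier: high-entropy random string; 32 random bytes
  // base64url-encode to 43 characters (RFC allows 43-128).
  const codeVerifier = randomBytes(32).toString("base64url");
  // code_challenge: base64url(SHA-256(code_verifier)).
  const codeChallenge = createHash("sha256")
    .update(codeVerifier)
    .digest("base64url");
  return { codeVerifier, codeChallenge };
}

const pkce = generatePkce();
console.log(pkce.codeVerifier.length); // 43
```

The verifier stays server-side; only the challenge is sent in the authorization URL, and the verifier is revealed in the token exchange so the authorization server can check `SHA-256(verifier) == challenge`.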
Security features:
| Mode | Authentication | Credentials Used |
|---|---|---|
| Dashboard (browser) | Session cookie (hkcc_session) | User's database credentials |
| Per-user API key | hkcc_sk_xxx in header | User's database credentials |
Each registered user can create up to 5 personal API keys for external applications like Cherry Studio, Cursor, or custom scripts.
Create an API key (via dashboard or API):
# Create key via API (requires session cookie)
curl -X POST http://127.0.0.1:3000/auth/api-keys \
-H "Content-Type: application/json" \
-b "hkcc_session=SESSION_TOKEN" \
-d '{"name": "Cherry Studio"}'
# Response includes the full key (shown only once):
# {"success":true,"key":"hkcc_sk_a1b2c3d4e5f6...","apiKey":{"id":1,"keyPrefix":"hkcc_sk_a1b2...","name":"Cherry Studio"}}
Use the API key in external clients:
curl -X POST http://127.0.0.1:3000/api/anthropic/v1/messages \
-H "x-api-key: hkcc_sk_a1b2c3d4e5f6..." \
-H "Content-Type: application/json" \
-d '{"model":"claude-sonnet-4-5-20250929","max_tokens":100,"messages":[{"role":"user","content":"Hello"}]}'
# Or use Authorization header:
curl -X POST http://127.0.0.1:3000/api/anthropic/v1/messages \
-H "Authorization: Bearer hkcc_sk_a1b2c3d4e5f6..." \
...
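Accepting the key from either header, as in the curl examples above, can be sketched like this; `extractApiKey` is a hypothetical helper, and the lowercase header lookup assumes names are normalized before the check:

```typescript
// Hypothetical sketch: accept a per-user key from x-api-key or a
// Bearer Authorization header. Assumes header names are lowercased.
function extractApiKey(headers: Record<string, string | undefined>): string | null {
  const direct = headers["x-api-key"];
  if (direct?.startsWith("hkcc_sk_")) return direct;
  const auth = headers["authorization"];
  if (auth?.startsWith("Bearer hkcc_sk_")) return auth.slice("Bearer ".length);
  return null; // no recognizable per-user key
}

console.log(extractApiKey({ "x-api-key": "hkcc_sk_a1b2c3" }));          // "hkcc_sk_a1b2c3"
console.log(extractApiKey({ authorization: "Bearer hkcc_sk_a1b2c3" })); // "hkcc_sk_a1b2c3"
console.log(extractApiKey({}));                                         // null
```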
Manage API keys:
# List keys
curl http://127.0.0.1:3000/auth/api-keys -b "hkcc_session=SESSION_TOKEN"
# Delete a key
curl -X DELETE http://127.0.0.1:3000/auth/api-keys/1 -b "hkcc_session=SESSION_TOKEN"
A cron job runs every hour to automatically refresh tokens expiring within 60 minutes. Claude access tokens have an 8-hour TTL (480 minutes).
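The refresh decision described above reduces to a time comparison; `needsRefresh` is a hypothetical helper illustrating the 60-minute window, not the actual cron code:

```typescript
// Hypothetical sketch of the hourly refresh check:
// refresh any token that expires within the next 60 minutes.
const REFRESH_WINDOW_MS = 60 * 60 * 1000;

function needsRefresh(expiresAtMs: number, nowMs: number = Date.now()): boolean {
  return expiresAtMs - nowMs <= REFRESH_WINDOW_MS;
}

const now = Date.now();
console.log(needsRefresh(now + 30 * 60 * 1000, now));     // true  (expires in 30 min)
console.log(needsRefresh(now + 8 * 60 * 60 * 1000, now)); // false (fresh 8-hour token)
```

Since the cron runs hourly and the window is 60 minutes, a token is always refreshed before it expires, with up to an hour of slack.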
import { createAnthropic } from "@ai-sdk/anthropic";
import { generateText } from "ai";
const anthropic = createAnthropic({
baseURL: "http://127.0.0.1:3000/api/anthropic", // OAuth mode
apiKey: "hkcc_sk_your_personal_api_key", // Per-user API key
});
const { text } = await generateText({
model: anthropic("claude-sonnet-4-5-20250929"),
prompt: "Hello!",
});
import Anthropic from "@anthropic-ai/sdk";
const client = new Anthropic({
baseURL: "http://127.0.0.1:3000/api/anthropic", // OAuth mode
apiKey: "hkcc_sk_your_personal_api_key", // Per-user API key
});
const message = await client.messages.create({
model: "claude-sonnet-4-5-20250929",
max_tokens: 1024,
messages: [{ role: "user", content: "Hello!" }],
});
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "http://127.0.0.1:3000/api/anthropic/openai/v1", // OAuth mode with OpenAI format
apiKey: "hkcc_sk_your_personal_api_key", // Per-user API key
});
const completion = await client.chat.completions.create({
model: "claude-sonnet-4-5-20250929",
messages: [{ role: "user", content: "Hello!" }],
});
# Server Configuration
PORT=3000 # Server port (default: 3000)
HOST=127.0.0.1 # Server host (default: 127.0.0.1)
# Database (Required)
TURSO_CONNECTION_URL=libsql://your-db.turso.io
TURSO_AUTH_TOKEN=your-auth-token
# Registration Control
HKCC_REGISTRATION_ENABLED=true # Set to "false" to disable new user registration
bun run dev # Run with hot reload
bun run build.ts # Build server and frontend
bun test # Run tests
bun run format # Format code
src/
├── index.ts # Server entry point (development)
├── app.ts # Elysia app factory
├── auth/ # OAuth & user authentication, API key management
├── db/ # Drizzle ORM schema and database connection
├── cron/ # Scheduled jobs (token refresh)
├── proxy/ # SDK and OAuth proxy implementations
├── routes/ # OAuth and auth endpoints
├── adapters/ # Request/response format adapters
├── transform/ # Request/response transformations
├── middleware/ # API key validation, error handling
├── mcp-bridge/ # MCP tool bridging for client tools
└── utils/ # Utilities (Claude Code detection)
bin/
└── hkcc.ts # CLI entry point (npx/bunx)
public/ # Frontend source (React + Tailwind)
├── index.tsx # React entry point
└── pages/ # Dashboard pages
build.ts # Build script for server, CLI, and frontend
See TECHNICAL.md for architecture details.
# Build and push with timestamp tags for all variants
DATE_TAG=$(date +%Y-%m-%d-%H-%M-%S)
# Default (latest + oauth + compiled + compiled-oauth)
docker buildx build --push --platform linux/amd64,linux/arm64 \
-t huakunshen/hkcc:latest \
-t huakunshen/hkcc:latest-${DATE_TAG} \
-f Dockerfile .
docker buildx build --push --platform linux/amd64,linux/arm64 \
-t huakunshen/hkcc:oauth \
-t huakunshen/hkcc:oauth-${DATE_TAG} \
-f Dockerfile.oauth .
docker buildx build --push --platform linux/amd64,linux/arm64 \
-t huakunshen/hkcc:compiled \
-t huakunshen/hkcc:compiled-${DATE_TAG} \
-f Dockerfile.compiled .
docker buildx build --push --platform linux/amd64,linux/arm64 \
-t huakunshen/hkcc:compiled-oauth \
-t huakunshen/hkcc:compiled-oauth-${DATE_TAG} \
-f Dockerfile.compiled-oauth .
Note: Each image is tagged with both the standard tag (e.g., latest, oauth) and a timestamped tag (e.g., oauth-2026-01-29-21-03-17) for version tracking. All images support both linux/amd64 and linux/arm64 platforms.