# LLMs
A universal LLM API transformation server, initially developed for the claude-code-router.
## How it works
The LLM API transformation server acts as middleware that standardizes requests and responses across different LLM providers (Anthropic, Gemini, Deepseek, etc.). It uses a modular transformer system to handle each provider's specific API format.
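As a sketch of the idea, a transformer pairs an inbound request conversion with an outbound response conversion, and the server looks transformers up by provider name. The type and function names below are illustrative assumptions, not this repo's actual exported API:

```typescript
// Illustrative sketch only -- UnifiedRequest, Transformer, and the registry
// are assumed names for the concept, not this repo's real types.
interface UnifiedMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface UnifiedRequest {
  model: string;
  messages: UnifiedMessage[];
  maxTokens?: number;
}

interface Transformer {
  /** Provider name used to look the transformer up, e.g. "anthropic". */
  name: string;
  /** Convert a provider-specific request body into the unified format. */
  transformRequestIn(body: Record<string, unknown>): UnifiedRequest;
  /** Convert a unified response back into the provider-specific shape. */
  transformResponseOut(response: Record<string, unknown>): Record<string, unknown>;
}

// A registry keyed by provider name lets the server pick the right
// transformer for each incoming request.
const registry = new Map<string, Transformer>();

function registerTransformer(t: Transformer): void {
  registry.set(t.name, t);
}
```

With this shape, adding support for a new provider means registering one more `Transformer` rather than touching the server core.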
### Key Components

### Data Flow
- Request:
  - Incoming provider-specific requests are transformed into the unified format.
  - The unified request is processed by the server.
- Response:
  - The server's unified response is transformed back into the provider's format.
  - Streaming responses are handled with chunked data conversion.
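The two directions above form one round-trip per request. A minimal, self-contained sketch of that flow (all names here are hypothetical, and the server stage is stubbed):

```typescript
// Hypothetical round-trip sketch of the data flow; not the repo's real code.
type Unified = { model: string; messages: { role: string; content: string }[] };

// Request direction: provider-specific body -> unified format.
function toUnified(body: { model: string; prompt: string }): Unified {
  return { model: body.model, messages: [{ role: "user", content: body.prompt }] };
}

// The server processes the unified request (stubbed to echo the input).
function processRequest(req: Unified): { text: string } {
  return { text: `echo: ${req.messages[0].content}` };
}

// Response direction: unified result -> provider-specific shape.
function fromUnified(res: { text: string }): { completion: string } {
  return { completion: res.text };
}

// One full round-trip: transform in, process, transform out.
function handle(body: { model: string; prompt: string }): { completion: string } {
  return fromUnified(processRequest(toUnified(body)));
}
```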
### Example Transformers
- Anthropic: Converts between OpenAI-style and Anthropic-style message formats.
- Gemini: Adjusts tool definitions and parameter formats for Gemini compatibility.
- Deepseek: Enforces token limits and handles reasoning content in streams.
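For instance, one part of the Anthropic conversion is moving OpenAI-style inline `system` messages to the top-level `system` field that Anthropic's Messages API expects. A simplified sketch of that single step (the real transformer also handles tools, content blocks, and streaming):

```typescript
// Simplified illustration of one OpenAI -> Anthropic conversion step;
// not the repo's actual transformer implementation.
type OpenAIMessage = { role: "system" | "user" | "assistant"; content: string };

type AnthropicRequest = {
  system?: string; // Anthropic takes system text as a top-level field
  messages: { role: "user" | "assistant"; content: string }[];
};

function openAIToAnthropic(messages: OpenAIMessage[]): AnthropicRequest {
  // Collect any system messages into a single top-level system string.
  const system = messages
    .filter((m) => m.role === "system")
    .map((m) => m.content)
    .join("\n");
  // Everything else stays in the messages array.
  const rest = messages
    .filter((m) => m.role !== "system")
    .map((m) => ({ role: m.role as "user" | "assistant", content: m.content }));
  return system ? { system, messages: rest } : { messages: rest };
}
```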
## Run this repo
- Install dependencies: `npm install`
- Development: `npm run dev`
- Build: `npm run build`
- Test: `npm test`
- Path alias: `@` is mapped to the `src` directory; use `import xxx from '@/xxx'`.
- Environment variables: supports `.env` and `config.json`; see `src/services/config.ts`.
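The exact keys and precedence rules live in `src/services/config.ts`; a common pattern when both a config file and environment variables are supported is to merge them with environment values taking precedence. A hypothetical sketch of that pattern, not the repo's actual loader:

```typescript
// Hypothetical config-merge sketch; consult src/services/config.ts for the
// real keys and precedence this repo uses.
type Config = Record<string, string>;

function mergeConfig(
  fileConfig: Config,                                // e.g. parsed config.json
  env: Record<string, string | undefined>            // e.g. process.env
): Config {
  const merged: Config = { ...fileConfig };
  for (const [key, value] of Object.entries(env)) {
    if (value !== undefined) merged[key] = value;    // env overrides file values
  }
  return merged;
}
```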
## Working with this repo
👉 Contributing Guide