
Proxy server that translates various LLM API formats (Codestral, OpenAI, etc.) to Ollama
Use case: Run local LLM completions with tools that only support cloud APIs (like Zed editor's Codestral integration).
| Provider | Endpoints | Status |
|---|---|---|
| Codestral | /v1/fim/completions, /v1/models | ✅ |
| OpenAI | /v1/chat/completions | 🔜 Planned |
| Anthropic | /v1/messages | 🔜 Planned |
Install globally:
npm install -g 2ollama
Or run directly with npx:
npx 2ollama
Usage:
2ollama
-p, --port <PORT> Port to listen on (default: 8787)
-o, --ollama-url <URL> Ollama server URL (default: http://localhost:11434)
-m, --model <MODEL> Default model to use (default: codestral:latest)
-d, --daemon Run in background
-h, --help Show help
-v, --version Show version
Environment variables:
PORT=8787
OLLAMA_URL=http://localhost:11434
DEFAULT_MODEL=codestral:latest
# Use a different model
2ollama --model qwen2.5-coder:7b
# Run on a different port
2ollama --port 8080
# Run in background
2ollama --daemon
# Connect to remote Ollama
2ollama --ollama-url http://192.168.1.100:11434
Start the proxy: 2ollama
Configure Zed's settings.json:
{
"language_models": {
"codestral": {
"api_url": "http://localhost:8787"
}
},
"features": {
"edit_prediction_provider": "codestral"
}
}
Any Ollama model with FIM (fill-in-middle) support works. For instance:
codestral:latest - Mistral's code model (22B, best quality)
qwen2.5-coder:7b - Good balance of speed and quality
qwen2.5-coder:1.5b - Fast, lower resource usage
deepseek-coder-v2:16b - Strong coding performance
starcoder2:7b - Solid alternative
Pull your preferred model:
ollama pull qwen2.5-coder:7b
| Endpoint | Method | Description |
|---|---|---|
| / | GET | Health check |
| /health | GET | Health check |
| /v1/fim/completions | POST | FIM completions (Codestral format) |
| /v1/models | GET | List available models |
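As a quick smoke test of the FIM endpoint, the snippet below builds a Codestral-format fill-in-middle payload. The field names (`prompt`, `suffix`, `max_tokens`) follow Mistral's Codestral FIM API and are an assumption here; the Codestral provider's types.ts is the authoritative schema.

```typescript
// Build a FIM (fill-in-middle) request in the Codestral format accepted by
// /v1/fim/completions. Field names assume Mistral's Codestral FIM API;
// check the provider's types.ts for the authoritative schema.
const fimRequest = {
  model: "codestral:latest",
  prompt: "function add(a: number, b: number) {\n  return ", // text before the cursor
  suffix: "\n}",                                             // text after the cursor
  max_tokens: 64,
};

const body = JSON.stringify(fimRequest);
console.log(body);
```

POST this body with `Content-Type: application/json` to `http://localhost:8787/v1/fim/completions` on a running proxy to get a completion for the gap between `prompt` and `suffix`.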
import { startServer } from "2ollama";
startServer({
port: 8787,
ollamaUrl: "http://localhost:11434",
defaultModel: "codestral:latest",
});
2ollama uses a pluggable provider system. Each provider defines routes that translate incoming API requests to Ollama's format.
src/
provider.ts # Provider interface
providers/
index.ts # Provider registry
codestral/ # Codestral provider
index.ts # Route handlers
types.ts # Request/Response types
transform.ts # Format transformations
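To make the role of transform.ts concrete, here is a rough sketch of what a Codestral-to-Ollama translation might look like. This is illustrative, not the package's actual code: the input shape assumes Mistral's FIM API, and the output targets Ollama's /api/generate body, which supports a `suffix` field for FIM-capable models.

```typescript
// Sketch of a Codestral → Ollama format transformation (illustrative only,
// not the package's actual transform.ts). Input assumes Mistral's FIM API;
// output targets Ollama's /api/generate request body.
interface FimRequest {
  model?: string;
  prompt: string;
  suffix?: string;
  max_tokens?: number;
  temperature?: number;
}

function toOllamaGenerate(req: FimRequest, defaultModel: string) {
  return {
    model: req.model ?? defaultModel,
    prompt: req.prompt,
    suffix: req.suffix ?? "",
    stream: false,
    options: {
      num_predict: req.max_tokens ?? 128,
      temperature: req.temperature ?? 0,
    },
  };
}

const out = toOllamaGenerate(
  { prompt: "def add(a, b):\n    return ", suffix: "\n" },
  "codestral:latest",
);
console.log(out.model); // "codestral:latest"
```

The reverse direction (reshaping Ollama's response into the Codestral response format) lives alongside this in the same transform module.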
import type { Provider } from "2ollama";
export const myProvider: Provider = {
name: "my-provider",
routes: [
{
method: "POST",
path: "/v1/my/endpoint",
handler: async (req, res, ctx) => {
// Transform request, call Ollama, return response
},
},
],
};
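To fill in the stub above, a handler might look like the self-contained sketch below. The `Provider` and `Ctx` interfaces here are simplified local stand-ins for the package's actual exports (an assumption, including the handler signature); only Ollama's /api/generate endpoint is real.

```typescript
// Self-contained sketch of a provider with a concrete handler body. The
// Provider/Route/Ctx shapes below are simplified stand-ins (assumptions),
// not the package's actual interface.
interface Ctx {
  ollamaUrl: string;     // e.g. "http://localhost:11434"
  defaultModel: string;  // e.g. "codestral:latest"
}

interface Route {
  method: "GET" | "POST";
  path: string;
  handler: (body: unknown, ctx: Ctx) => Promise<unknown>;
}

interface Provider {
  name: string;
  routes: Route[];
}

const myProvider: Provider = {
  name: "my-provider",
  routes: [
    {
      method: "POST",
      path: "/v1/my/endpoint",
      handler: async (body, ctx) => {
        const req = body as { prompt: string; model?: string };
        // Translate the incoming request into Ollama's generate body...
        const res = await fetch(`${ctx.ollamaUrl}/api/generate`, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            model: req.model ?? ctx.defaultModel,
            prompt: req.prompt,
            stream: false,
          }),
        });
        // ...then reshape Ollama's answer into this provider's format.
        const data = (await res.json()) as { response: string };
        return { completion: data.response };
      },
    },
  ],
};
```

The pattern is the same for every provider: parse the incoming format, build the equivalent Ollama request, and map the response back.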
# Install dependencies
bun install
# Run in development mode
bun run dev
# Build
bun run build
# Type check
bun run typecheck
# Lint/format
bun run check
MIT