
See what your LLM calls cost. One command. No signup.
LLMFlow is a local observability tool for LLM applications. Point your SDK at it and see your costs, tokens, and latency in real time.

```shell
npx llmflow
```

Dashboard: localhost:1337 · Proxy: localhost:8080

```shell
# Option A: npx (recommended)
npx llmflow

# Option B: Clone and run
git clone https://github.com/HelgeSverre/llmflow.git
cd llmflow && npm install && npm start

# Option C: Docker
docker run -p 1337:1337 -p 8080:8080 helgesverre/llmflow
```
```python
# Python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1")
```

```javascript
// JavaScript
const client = new OpenAI({ baseURL: "http://localhost:8080/v1" });
```

```php
// PHP
$client = OpenAI::factory()->withBaseUri('http://localhost:8080/v1')->make();
```
Open localhost:1337 to see your traces, costs, and token usage.
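Under the hood, every SDK above speaks the same OpenAI-compatible chat-completions format to the proxy. As a rough sketch (the model name and messages here are placeholders, not anything LLMFlow requires), the request body the proxy receives and forwards looks like this:

```python
import json

# Hypothetical payload for POST http://localhost:8080/v1/chat/completions.
# The proxy passes it through to the upstream provider and records the
# tokens, cost, and latency it observes in the response.
payload = {
    "model": "gpt-4o-mini",  # any model your upstream provider accepts
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
}

body = json.dumps(payload)
print(body)
```

Because the proxy is format-transparent, nothing in your application changes except the base URL.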
| Feature | Description |
|---|---|
| Cost Tracking | Real-time pricing for 2000+ models |
| Request Logging | See every request/response with latency |
| Multi-Provider | OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, and more |
| OpenTelemetry | Accept traces from LangChain, LlamaIndex, etc. |
| Zero Config | Just run it, point your SDK, done |
| Local Storage | SQLite database, no external services |
Route requests to other providers with a path prefix or the `X-LLMFlow-Provider` header:
| Provider | URL |
|---|---|
| OpenAI | http://localhost:8080/v1 (default) |
| Anthropic | http://localhost:8080/anthropic/v1 |
| Gemini | http://localhost:8080/gemini/v1 |
| Ollama | http://localhost:8080/ollama/v1 |
| Groq | http://localhost:8080/groq/v1 |
| Mistral | http://localhost:8080/mistral/v1 |
| Azure OpenAI | http://localhost:8080/azure/v1 |
| Cohere | http://localhost:8080/cohere/v1 |
| Together | http://localhost:8080/together/v1 |
| OpenRouter | http://localhost:8080/openrouter/v1 |
| Perplexity | http://localhost:8080/perplexity/v1 |
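The table follows a uniform pattern: every provider except the OpenAI default gets its name as a path prefix before `/v1`. A small helper (hypothetical, not part of LLMFlow) that derives the base URL from that pattern:

```python
# Hypothetical helper: build the LLMFlow proxy base URL for a provider.
# Mirrors the routing table above; "openai" is the unprefixed default.
def proxy_base_url(provider: str, host: str = "http://localhost:8080") -> str:
    if provider == "openai":
        return f"{host}/v1"
    return f"{host}/{provider}/v1"

print(proxy_base_url("anthropic"))  # http://localhost:8080/anthropic/v1
```

Handy if your application switches providers via configuration: compute the base URL once and pass it to whichever SDK you instantiate.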
If you're using LangChain, LlamaIndex, or other instrumented frameworks:
```python
# Python - point the OTLP exporter at LLMFlow
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(endpoint="http://localhost:1337/v1/traces")
```

```javascript
// JavaScript
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

new OTLPTraceExporter({ url: "http://localhost:1337/v1/traces" });
```
| Variable | Default | Description |
|---|---|---|
| PROXY_PORT | 8080 | Proxy port |
| DASHBOARD_PORT | 1337 | Dashboard port |
| DATA_DIR | ~/.llmflow | Data directory |
| MAX_TRACES | 10000 | Maximum traces to retain |
| VERBOSE | 0 | Enable verbose logging |
Set provider API keys as environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.) if you want the proxy to forward requests.
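Since these settings are plain environment variables, reading them follows the usual pattern of falling back to the table's defaults when a variable is unset. A sketch of that pattern (illustrative only; this is not LLMFlow's actual loader, and LLMFlow itself is Node.js):

```python
import os

# Defaults mirror the configuration table above; the variable names come
# from the table, but this loader is an illustrative sketch.
config = {
    "proxy_port": int(os.environ.get("PROXY_PORT", "8080")),
    "dashboard_port": int(os.environ.get("DASHBOARD_PORT", "1337")),
    "data_dir": os.environ.get("DATA_DIR", os.path.expanduser("~/.llmflow")),
    "max_traces": int(os.environ.get("MAX_TRACES", "10000")),
    "verbose": os.environ.get("VERBOSE", "0") == "1",
}

print(config["proxy_port"], config["dashboard_port"])
```

Numeric variables arrive as strings and need an explicit cast; VERBOSE is a 0/1 flag rather than a true/false string.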
```shell
# Clone and install
git clone https://github.com/HelgeSverre/llmflow.git
cd llmflow
npm install

# Install frontend dependencies
cd frontend && npm install && cd ..

# Run development mode (backend only)
npm start

# Run frontend dev server with hot reload (separate terminal)
cd frontend && npm run dev

# Build frontend for production
cd frontend && npm run build

# Run E2E tests
npm run test:e2e
```
The frontend is built with Svelte 5 + Vite and outputs to `public/`. The backend serves static files from `public/` in production.
For advanced usage, see the `docs/` folder.
MIT © Helge Sverre