# LLMFlow

> See what your LLM calls cost. One command. No signup.

LLMFlow is a local observability tool for LLM applications. Point your SDK at it and see your costs, tokens, and latency in real time.

```shell
npx llmflow
```

Dashboard: localhost:1337 · Proxy: localhost:8080

*LLMFlow Dashboard (screenshot)*

## Quick Start

### 1. Start LLMFlow

```shell
# Option A: npx (recommended)
npx llmflow

# Option B: Clone and run
git clone https://github.com/HelgeSverre/llmflow.git
cd llmflow && npm install && npm start

# Option C: Docker
docker run -p 1337:1337 -p 8080:8080 helgesverre/llmflow
```

### 2. Point Your SDK

```python
# Python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1")
```

```javascript
// JavaScript
const client = new OpenAI({ baseURL: "http://localhost:8080/v1" });
```

```php
// PHP
$client = OpenAI::factory()->withBaseUri('http://localhost:8080/v1')->make();
```

### 3. View Dashboard

Open localhost:1337 to see your traces, costs, and token usage.
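To keep the same code working with and without the proxy, you can read the base URL from an environment variable. A minimal sketch — `LLM_BASE_URL` is an illustrative variable name of your own choosing, not something LLMFlow itself reads:

```python
import os

# LLM_BASE_URL is an assumed, app-level variable: default to the LLMFlow
# proxy for local development, override it (or unset it) in production.
base_url = os.environ.get("LLM_BASE_URL", "http://localhost:8080/v1")

# Then hand it to your SDK, e.g.:
# client = OpenAI(base_url=base_url)
print(base_url)
```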

## Who Is This For?

  • Solo developers building with OpenAI, Anthropic, etc.
  • Hobbyists who want to see what their AI projects cost
  • Anyone who doesn't want to pay for or set up a SaaS observability tool

## Features

| Feature | Description |
| --- | --- |
| Cost Tracking | Real-time pricing for 2000+ models |
| Request Logging | See every request/response with latency |
| Multi-Provider | OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, and more |
| OpenTelemetry | Accepts traces from LangChain, LlamaIndex, etc. |
| Zero Config | Just run it, point your SDK, done |
| Local Storage | SQLite database, no external services |

## Supported Providers

Use path prefixes or the `X-LLMFlow-Provider` header:

| Provider | URL |
| --- | --- |
| OpenAI | http://localhost:8080/v1 (default) |
| Anthropic | http://localhost:8080/anthropic/v1 |
| Gemini | http://localhost:8080/gemini/v1 |
| Ollama | http://localhost:8080/ollama/v1 |
| Groq | http://localhost:8080/groq/v1 |
| Mistral | http://localhost:8080/mistral/v1 |
| Azure OpenAI | http://localhost:8080/azure/v1 |
| Cohere | http://localhost:8080/cohere/v1 |
| Together | http://localhost:8080/together/v1 |
| OpenRouter | http://localhost:8080/openrouter/v1 |
| Perplexity | http://localhost:8080/perplexity/v1 |
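As a sketch of the header-based routing, this builds (but does not send) a chat request aimed at Anthropic through the proxy. The request body follows the OpenAI chat-completions shape, and the model name is illustrative:

```python
import json
import urllib.request

# Build a chat request routed through the LLMFlow proxy. The
# X-LLMFlow-Provider header selects the upstream provider instead
# of a path prefix like /anthropic/v1.
payload = json.dumps({
    "model": "claude-3-5-haiku-latest",  # illustrative model name
    "messages": [{"role": "user", "content": "hello"}],
}).encode()

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=payload,
    headers={
        "Content-Type": "application/json",
        "X-LLMFlow-Provider": "anthropic",
    },
)
# urllib.request.urlopen(req) would send it once `npx llmflow` is running.
```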

## OpenTelemetry Support

If you're using LangChain, LlamaIndex, or other instrumented frameworks, point your OTLP exporter at LLMFlow:

```python
# Python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(endpoint="http://localhost:1337/v1/traces")
```

```javascript
// JavaScript
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

new OTLPTraceExporter({ url: "http://localhost:1337/v1/traces" });
```

## Configuration

| Variable | Default | Description |
| --- | --- | --- |
| `PROXY_PORT` | `8080` | Proxy port |
| `DASHBOARD_PORT` | `1337` | Dashboard port |
| `DATA_DIR` | `~/.llmflow` | Data directory |
| `MAX_TRACES` | `10000` | Max traces to retain |
| `VERBOSE` | `0` | Enable verbose logging |

Set provider API keys as environment variables (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.) if you want the proxy to forward requests.
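For example, a one-liner combining these settings — the port, trace-retention value, and API key here are placeholders, not recommendations:

```shell
# Run the proxy on a custom port, keep up to 10k traces, and
# supply an OpenAI key so the proxy can forward requests.
# "sk-placeholder" is not a real key.
PROXY_PORT=9090 MAX_TRACES=10000 OPENAI_API_KEY="sk-placeholder" npx llmflow
```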

## Development

```shell
# Clone and install
git clone https://github.com/HelgeSverre/llmflow.git
cd llmflow
npm install

# Install frontend dependencies
cd frontend && npm install && cd ..

# Run development mode (backend only)
npm start

# Run frontend dev server with hot reload (separate terminal)
cd frontend && npm run dev

# Build frontend for production
cd frontend && npm run build

# Run E2E tests
npm run test:e2e
```

The frontend is built with Svelte 5 + Vite and outputs to `public/`. The backend serves static files from `public/` in production.

## Advanced Features

For advanced usage, see the `docs/` folder.

## License

MIT © Helge Sverre
