mcpusage 0.1.0 (npm): CLI for measuring MCP server tool advertisement token usage

MCP Tool Surface Token Meter

This repository implements a TypeScript CLI (mcpusage) that reproduces the workflow described in Measuring Token Cost of MCP Tool Advertisements in TypeScript. It fetches every tool definition from an MCP server, serializes the resulting “tool surface”, and measures how many tokens that block would occupy for one or more OpenAI and Anthropic models (starting with GPT-5 / GPT-5.1 families, Codex variants, and Claude Sonnet).

Features

  • Connects to any MCP server via stdio (launching a local process) or HTTP/SSE endpoints.
  • Automatically paginates list_tools responses so the entire tool surface is counted.
  • Serializes tool metadata deterministically (sorted by tool name, optional _meta stripping).
  • Uses @dqbd/tiktoken (OpenAI models) and @anthropic-ai/tokenizer (Claude family) for token accounting with model-aware or custom encoding fallbacks; Anthropic API mode reports official counts and includes the local tokenizer total alongside for comparison.
  • Supports multiple models per run and highlights when tokenizers are assumed vs. officially published.
  • Prints a Claude-style per-tool breakdown so you can see which MCP schema objects are most expensive.
  • Can export the captured tool advertisement JSON for offline diffing or audits.

Getting Started

npm install
npm run build # compiles the CLI to dist/cli.js

Or install from npm (Node.js 20+):

# quick, no global install
npx mcpusage --help

# or install globally
npm install -g mcpusage
mcpusage --help

For rapid iteration you can run the CLI directly from TypeScript:

npm run dev -- --help

After building, invoke the binary via:

node dist/cli.js --help
# or install globally / link if desired:
npm link
mcpusage --help

Usage Examples

Measure a local stdio MCP server

mcpusage \
  --stdio npx -y @modelcontextprotocol/server-filesystem ./sample_files \
  --model gpt-5 --model gpt-5.1-codex

Measure a remote server with Streamable HTTP

mcpusage \
  --url https://mcp.example.com/mcp \
  --header Authorization:"Bearer <token>" \
  --transport streamable \
  --dump-tools ./snapshots/server-a.json

Launch via a shell script

mcpusage --stdio-shell "poetry run python ./scripts/server.py" --format json

CLI Flags (summary)

  • --stdio <cmd...> / --stdio-shell <string>: launch a server via stdio; combine with --stdio-cwd, --stdio-env, and --stdio-echo-stderr (stdout is always silent, stderr is suppressed unless you opt in).
  • --url <url>: connect to a remote MCP server via Streamable HTTP (default) or SSE (--transport).
  • --model <name>: repeatable; defaults to gpt-5, gpt-5.1, gpt-5-codex, gpt-5.1-codex, and claude-sonnet-4-5-20250929. When the flag is omitted the CLI prints totals only; pass one or more models explicitly to get detailed per-tool breakdowns for them.
  • --anthropic-mode local|api plus --anthropic-key (or ANTHROPIC_API_KEY) lets you switch Claude measurements from local approximation to Anthropic’s official count_tokens endpoint so you can mirror Claude Code exactly (API mode issues one request per tool and the summary shows the local tokenizer total alongside the API count for comparison). Use --anthropic-tool-prefix <id> to optionally mirror Claude Code’s mcp__<id>__tool renaming in the request payload (defaults to the MCP server name when available).
  • --dump-tools <file> / --print-schema: write the tool advertisement out (to a file or stdout). When paired with --format json, --print-schema embeds the schema directly in the JSON payload.
  • --include-meta: keep _meta fragments in the snapshot (off by default to reduce noise).
  • --format table|json: human-friendly table or machine-readable JSON output.
  • --timeout <ms> / --max-retries <n>: fine-tune list_tools and HTTP retry behaviour.
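
As a sketch of the renaming that --anthropic-tool-prefix describes, the transformation below applies Claude Code's mcp__<id>__tool naming scheme to a tool list. The helper name and interfaces are illustrative, not the CLI's actual internals:

```typescript
// Illustrative helper mirroring the Claude Code-style renaming applied when
// --anthropic-tool-prefix <id> is set: each tool becomes mcp__<serverId>__<name>.
// The function name and ToolDef shape are hypothetical.
interface ToolDef {
  name: string;
  description?: string;
}

function applyAnthropicToolPrefix(tools: ToolDef[], serverId: string): ToolDef[] {
  return tools.map((tool) => ({
    ...tool,
    name: `mcp__${serverId}__${tool.name}`,
  }));
}

const renamed = applyAnthropicToolPrefix(
  [{ name: "read_file" }, { name: "list_dir" }],
  "filesystem"
);
// prints the prefixed names: mcp__filesystem__read_file, mcp__filesystem__list_dir
console.log(renamed.map((t) => t.name).join(", "));
```

This matters because the prefix itself consumes tokens, so measuring the renamed payload more closely matches what Claude Code actually sends.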

Run mcpusage --help to view the full set of switches.

Minimal single-tool MCP server (sanity check)

To inspect Claude’s accounting without other noise, we ship a deterministic MCP server under fixtures/minimal-mcp:

cd fixtures/minimal-mcp
npm install
npm run build # optional; npm run dev works too

You can point the CLI at it:

# Local tokenizer baseline
node dist/cli.js \
  --stdio-shell "cd fixtures/minimal-mcp && node dist/server.js" \
  --model claude-sonnet-4-5-20250929 --anthropic-mode local

# Anthropic `count_tokens` (mirrors Claude Code exactly; requires API key)
ANTHROPIC_API_KEY=sk-your-key node dist/cli.js \
  --stdio-shell "cd fixtures/minimal-mcp && node dist/server.js" \
  --model claude-sonnet-4-5-20250929 --anthropic-mode api

Because the server exposes only one tool, you can register it inside Claude Code, run /context, and compare the “MCP tools” usage to these known totals. Any extra tokens Claude attributes to the tool reveal the constant prompt overhead that Claude Code bundles into its measurement.

How Token Counting Works

  1. Connect to the MCP server and issue list_tools, following pagination until nextCursor is empty.
  2. Sort the resulting tool definitions by name and convert them to JSON via JSON.stringify.
  3. Select the tokenizer for each requested model. GPT-5 has native @dqbd/tiktoken support; GPT-5.1 and Codex variants currently map to o200k_base until OpenAI publishes official tokenizers.
  4. Encode the serialized JSON and report the token length per model, alongside byte/character counts.
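
The first two steps can be sketched as follows. The ListToolsClient interface here is a stand-in for the MCP SDK client, and the function is a simplified illustration rather than the CLI's real code (which uses @modelcontextprotocol/sdk and @dqbd/tiktoken):

```typescript
// Sketch of the pagination + deterministic-serialization steps above.
// ListToolsClient is a hypothetical stand-in for the MCP SDK client.
interface Tool {
  name: string;
  description?: string;
  inputSchema?: unknown;
  _meta?: unknown;
}

interface ListToolsPage {
  tools: Tool[];
  nextCursor?: string;
}

interface ListToolsClient {
  listTools(cursor?: string): Promise<ListToolsPage>;
}

async function fetchToolSurface(
  client: ListToolsClient,
  includeMeta = false
): Promise<string> {
  // Step 1: follow pagination until nextCursor is empty.
  const tools: Tool[] = [];
  let cursor: string | undefined;
  do {
    const page = await client.listTools(cursor);
    tools.push(...page.tools);
    cursor = page.nextCursor;
  } while (cursor);

  // Step 2: sort by name for determinism, strip _meta unless requested.
  const normalized = tools
    .slice()
    .sort((a, b) => a.name.localeCompare(b.name))
    .map((tool) => {
      const { _meta, ...rest } = tool;
      return includeMeta ? tool : rest;
    });

  return JSON.stringify(normalized);
}
```

Once serialized, the resulting string is what gets fed to each model's tokenizer, so the byte/character counts in the report refer to this exact JSON blob.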

This mirrors the referenced write-up’s guidance for pre-computing an agent’s “tool surface” cost so you can budget prompt tokens before invoking the actual model.

Notes

  • Requires Node.js 20+ (ESM with top-level await).
  • No destructive actions are taken; the tool only reads schemas and optional output paths you specify.
  • Warnings and connection diagnostics are printed to stderr so --format json remains clean on stdout.

Keywords

mcp

Package last updated on 20 Nov 2025