
mcp-obs-sdk
SDK for instrumenting applications with MCP (Model Context Protocol) observability - trace LLM calls, track costs, monitor performance
Official SDK for instrumenting your applications with MCP (Model Context Protocol) observability.
Quick start:
- Bootstrap a tenant (pnpm --filter @mcp-obs/api run bootstrap:tenant) and note the tenant_id + ingestion token.
- Set the environment variables: MCP_OBS_ENDPOINT=<ingestion-url>, MCP_OBS_API_KEY=<token>, MCP_OBS_SOURCE=<your-service>, MCP_OBS_TENANT=<tenant_id>, MCP_OBS_ENVIRONMENT=production|staging|development.
- Install the SDK: pnpm add mcp-obs-sdk (or npm/yarn).
- Wrap calls with MCPTracer.trace or wrapLLMCall; call await client.flush() for instant visibility.
- Watch the trace-events stream (already live on hosted; for self-host, start it via pnpm --filter @mcp-obs/metrics dev alongside the ingestion worker).
npm install mcp-obs-sdk
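For manual setup, the environment variables listed above can live in a .env file. A minimal sketch using those variables (all values are placeholders):
# .env (placeholder values)
MCP_OBS_ENDPOINT=https://your-ingestion-url.com/v1/traces
MCP_OBS_API_KEY=<token>
MCP_OBS_SOURCE=my-service
MCP_OBS_TENANT=<tenant_id>
MCP_OBS_ENVIRONMENT=production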
Let the CLI install the SDK, detect your framework, and wire up tracing/logging boilerplate for you:
npx mcp-obs quickstart
# or with pnpm
pnpm dlx mcp-obs quickstart
During the quickstart the CLI will:
- Install mcp-obs-sdk (using npm / pnpm / yarn automatically).
- Create a .env with endpoint/API key placeholders.
- Run npx mcp-obs health to verify the setup.
Or run the equivalent steps manually:
# Install SDK
npm install mcp-obs-sdk
# Auto-detect your environment
npx mcp-obs detect
# Initialize with interactive setup
npx mcp-obs init
# Verify setup
npx mcp-obs health
The CLI installs the SDK, detects your framework, wires up the tracing/logging boilerplate, and verifies connectivity to your endpoint.
See CLI.md for complete CLI documentation.
import { MCPTracer, LLMProvider } from 'mcp-obs-sdk';
const tracer = new MCPTracer({
endpoint: 'https://your-ingestion-url.com/v1/traces',
source: 'my-app',
defaultTags: { service: 'my-app', environment: 'production' },
});
// Wrap your LLM calls
const { response, traceId } = await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4o-mini',
prompt: 'Hello, world!',
callFn: async () => {
return await openai.chat.completions.create({
model: 'gpt-4o-mini',
messages: [{ role: 'user', content: 'Hello, world!' }],
});
},
});
console.log('Trace ID:', traceId);
console.log('Response:', response);
The SDK ships the emitters you need for holistic observability—no extra packages required:
import { initializeObservability } from 'mcp-obs-sdk';
const client = initializeObservability({
endpoint: process.env.MCP_OBS_ENDPOINT!,
sourceMCP: 'my-app',
enableLogging: true,
});
// Structured application logs (ships via `/v1/logs`)
client.log({
level: 'info',
message: 'LLM request queued',
tags: { feature: 'summaries', 'session.id': 'session-42' },
attributes: { job: 'daily-delta' },
});
// Session timeline events (tools, execution steps, server runs)
client.emitSessionEvent({
session_id: 'session-42',
event_type: 'tool_invocation',
payload: {
invocation_id: crypto.randomUUID(),
name: 'calendar.lookup',
status: 'success',
started_at: new Date().toISOString(),
},
});
// Tool analytics feed (powers the dashboard Tool Runs view)
client.recordToolRun({
session_id: 'session-42',
tool_name: 'salesforce.contact.search',
status: 'completed',
});
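In short-lived scripts and CLIs, flush buffered telemetry before the process exits; the quickstart notes await client.flush() for instant visibility. A minimal sketch:
// Ensure queued traces/logs/session events reach the ingestion
// endpoint before the process exits.
await client.flush();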
✅ initializeObservability wires the trace ingester and the log/session/tool run emitters, so once the SDK is installed you already have every transport needed for the dashboard’s Logs, Traces, Sessions, and Tool Runs tabs.
The SDK now keeps CRM context close to every trace, log, session event, and tool run so dashboards can pivot on rep/deal/pipeline questions without extra plumbing.
import { initializeObservability, MCPTracer, LLMProvider } from 'mcp-obs-sdk';
const client = initializeObservability({ /* ... */ });
const tracer = new MCPTracer({ client, source: 'rev-agent' });
// 1) Seed org-wide defaults or pull from /v1/crm/config
client.setDefaultCRMContext({
rep: { rep_id: 'rep-ashley', name: 'Ashley Gomez' },
deal: { pipeline_id: 'enterprise', stage: 'Proposal' },
});
// 2) Teach the SDK about your GTM pipelines (probabilities, stages, etc.)
client.registerCrmPipelines([
{
pipelineId: 'enterprise',
name: 'Enterprise New Biz',
stages: [
{ name: 'Discovery', sequence: 1, probability: 0.2 },
{ name: 'Proposal', sequence: 3, probability: 0.55 },
],
},
]);
// 3) Attach CRM context to a conversation once instead of on every trace
client.setSessionCRMContext('session-42', {
rep: { rep_id: 'rep-ashley', email: 'ashley@example.com' },
deal: { opportunity_id: 'opp-9821', stage: 'Proposal' },
});
// 4) All traces/session events/tool runs emitted inside this session inherit CRM metadata
await tracer.trace({
sessionId: 'session-42',
provider: LLMProvider.OPENAI,
model: 'gpt-4o-mini',
prompt: 'Summarize latest forecast risk notes',
callFn: () => openai.chat.completions.create({...}),
});
💡 RevOps can update the same defaults and pipeline catalog without code under Settings → CRM in the dashboard (backed by the /v1/crm/config API). The SDK automatically hydrates those definitions at startup if you load them from the API.
🪄 initializeObservability() now attempts to fetch /v1/crm/config on boot (using your SDK API key or Supabase auth) so the defaults/pipelines managed in the dashboard stay synced without manual glue. You can also call obs.refreshCrmConfigFromServer() at runtime to pick up changes immediately.
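A minimal sketch of the runtime refresh mentioned above, assuming the client returned by initializeObservability():
// Pull the latest defaults/pipelines edited under Settings → CRM
// so in-flight sessions pick them up without a restart.
await obs.refreshCrmConfigFromServer();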
Wrap any McpServer once and the SDK will stream spans, logs, session events, and tool runs for every request (tools, prompts, fallbacks, etc.). The CLI’s init/quickstart commands scaffold this for you via .mcp-obs/instrument.ts.
// .mcp-obs/instrument.ts (generated)
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp';
import { initializeObservability, LLMProvider } from 'mcp-obs-sdk';
import { instrumentMcpServer } from 'mcp-obs-sdk/integrations/mcp';
export const obs = initializeObservability({ /* ... */ });
export function attachObservability(server: McpServer) {
instrumentMcpServer(server, {
client: obs,
provider: LLMProvider.OPENAI,
model: 'gpt-4o-mini',
sessionResolver: (request, extra) => extra?.sessionId ?? request?.session_id,
crmContextResolver: (request) => lookupCrmContext(request), // optional
});
return server;
}
Use it wherever you create servers:
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp';
import { attachObservability } from './.mcp-obs/instrument';
const server = attachObservability(
new McpServer({
name: 'rev-assistant',
version: '1.0.0',
})
);
instrumentMcpServer automatically streams spans, logs, session events, and tool runs for every request the server handles (tools, prompts, fallbacks, and so on).
📖 See ENTERPRISE_FEATURES.md for detailed documentation
- RuntimeContext API – Call getRuntimeContext() (or client.getRuntimeContext()) to read the normalized provider/platform/mode, filesystem hints, CI metadata, recommended batch/flush/disk-buffer/retry tuning, and feature flags.
- Runtime-aware ingestion endpoint defaults (e.g. https://<region>.ingest.mcp-obs.com).
- Use registerRuntimeDetector() to plug in proprietary platforms.
import { getRuntimeContext, initializeObservability } from 'mcp-obs-sdk';
const runtime = getRuntimeContext();
console.log(runtime.provider, runtime.mode, runtime.featureFlags);
const client = initializeObservability({
endpoint: process.env.MCP_OBS_ENDPOINT!,
sourceMCP: 'my-mcp',
});
if (client.getRuntimeFeatureFlags()['runtime.edge']) {
console.log('Using fast flush mode for edge runtimes');
}
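registerRuntimeDetector() is mentioned above without a documented signature; the sketch below assumes a hypothetical detector callback that returns a partial runtime context, or undefined to defer to the built-in detectors:
import { registerRuntimeDetector } from 'mcp-obs-sdk';

// Hypothetical detector for a proprietary platform; ACME_PLATFORM and
// the return shape are illustrative assumptions, not the SDK's real API.
registerRuntimeDetector(() => {
  if (process.env.ACME_PLATFORM) {
    return { provider: 'acme-cloud', mode: 'server' };
  }
  return undefined; // fall through to built-in detection
});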
interface MCPTracerConfig {
endpoint: string; // Ingestion endpoint URL
source: string; // Source identifier for your app
apiKey?: string; // Optional API key for authentication
sessionId?: string; // Optional session ID
tenantId?: string; // Optional tenant ID for multi-tenancy
defaultTags?: Record<string, string>; // Default tags for all traces
timeout?: number; // Request timeout in ms (default: 5000)
retries?: number; // Number of retries (default: 3)
enableLogging?: boolean; // Enable debug logging (default: false)
}
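Putting the options together, a sketch of a fully configured tracer (all values are placeholders):
const tracer = new MCPTracer({
  endpoint: process.env.MCP_OBS_ENDPOINT!, // ingestion endpoint URL
  source: 'my-app',
  apiKey: process.env.MCP_OBS_API_KEY, // optional authentication
  tenantId: process.env.MCP_OBS_TENANT, // multi-tenant deployments
  sessionId: 'user-session-123',
  defaultTags: { service: 'my-app', environment: 'production' },
  timeout: 10_000, // raise the 5000 ms default for slow networks
  retries: 5, // default is 3
  enableLogging: process.env.NODE_ENV !== 'production',
});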
enum LLMProvider {
OPENAI = 'openai',
ANTHROPIC = 'anthropic',
GOOGLE = 'google',
MISTRAL = 'mistral',
COHERE = 'cohere',
HUGGINGFACE = 'huggingface',
REPLICATE = 'replicate',
CUSTOM = 'custom',
}
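For models outside the built-in list, LLMProvider.CUSTOM applies. A sketch assuming a self-hosted model client (callMyModel is a placeholder for your own code, not part of the SDK):
const { response } = await tracer.trace({
  provider: LLMProvider.CUSTOM,
  model: 'my-finetuned-model',
  prompt: 'Hello, world!',
  callFn: () => callMyModel('Hello, world!'), // your own client call
});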
const sessionId = 'user-session-123';
await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
prompt: 'First message',
sessionId,
callFn: () => openai.chat.completions.create({...}),
});
await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
prompt: 'Follow-up message',
sessionId, // Same session
callFn: () => openai.chat.completions.create({...}),
});
await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
prompt: 'Analyze this document',
tags: {
feature: 'document-analysis',
user: 'user-123',
priority: 'high',
},
metadata: {
documentId: 'doc-456',
pageCount: 10,
},
callFn: () => openai.chat.completions.create({...}),
});
try {
const { response } = await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
prompt: 'Hello',
callFn: () => openai.chat.completions.create({...}),
});
} catch (error) {
// The trace is automatically recorded with error status
console.error('LLM call failed:', error);
}
// OpenAI
const openaiResponse = await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
callFn: () => openai.chat.completions.create({...}),
});
// Anthropic
const anthropicResponse = await tracer.trace({
provider: LLMProvider.ANTHROPIC,
model: 'claude-3-opus-20240229',
callFn: () => anthropic.messages.create({...}),
});
// Google
const googleResponse = await tracer.trace({
provider: LLMProvider.GOOGLE,
model: 'gemini-pro',
callFn: () => google.generateContent({...}),
});
Every trace now includes a canonical metadata payload under metadata.infra with schema_version, cloud provider/region, runtime details, and workload identity. You can merge overrides or enforce privacy policies globally:
const tracer = new MCPTracer({
endpoint: process.env.MCP_OBS_ENDPOINT!,
source: 'payments-api',
metadata: {
overrides: {
service: {
name: 'payments-api',
environment: process.env.RUNTIME_ENV ?? 'production',
version: process.env.GIT_SHA,
},
workload: {
deployment: process.env.DEPLOYMENT_ID,
commit_sha: process.env.GIT_SHA,
build_number: process.env.BUILD_ID,
},
},
allowlist: ['cloud.provider', 'cloud.region', 'service.*', 'workload.deployment'],
redaction: {
redactInstanceIds: true,
fields: ['workload.commit_sha'],
},
tags: {
'team.name': 'observability',
},
},
});
metadata.infra is versioned for backwards-compatible evolution and can be overridden via config or environment variables (MCP_OBS_METADATA_SERVICE_NAME, MCP_OBS_METADATA_ALLOWLIST, etc.).
import express from 'express';
import { MCPTracer, LLMProvider } from 'mcp-obs-sdk';
const app = express();
app.use(express.json()); // parse JSON bodies so req.body.message is populated
const tracer = new MCPTracer({
endpoint: process.env.MCP_OBS_ENDPOINT!,
source: 'express-api',
});
app.post('/chat', async (req, res) => {
const { message } = req.body;
const { response } = await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
prompt: message,
sessionId: req.session.id,
tags: { endpoint: '/chat' },
callFn: () => openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: message }],
}),
});
res.json({ response });
});
import { MCPTracer, LLMProvider } from 'mcp-obs-sdk';
import { NextRequest, NextResponse } from 'next/server';
const tracer = new MCPTracer({
endpoint: process.env.MCP_OBS_ENDPOINT!,
source: 'nextjs-app',
});
export async function POST(req: NextRequest) {
const { message } = await req.json();
const { response } = await tracer.trace({
provider: LLMProvider.OPENAI,
model: 'gpt-4',
prompt: message,
callFn: () => openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: message }],
}),
});
return NextResponse.json({ response });
}
# .env
MCP_OBS_ENDPOINT=https://your-ingestion-url.com/v1/traces
MCP_OBS_API_KEY=your-api-key-here
MCP_OBS_SOURCE=my-app
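One common way to load these values in development is dotenv (an assumption; any environment loader works):
import 'dotenv/config'; // populates process.env from .env before the SDK reads it
import { MCPTracer } from 'mcp-obs-sdk';

const tracer = new MCPTracer({
  endpoint: process.env.MCP_OBS_ENDPOINT!,
  source: process.env.MCP_OBS_SOURCE!,
  apiKey: process.env.MCP_OBS_API_KEY,
});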
View your traces in the MCP Observability Dashboard.
If you encounter this error when running npm install mcp-obs-sdk in a monorepo:
npm error Cannot read properties of null (reading 'matches')
Cause: You're trying to install the SDK inside a pnpm/yarn workspace directory that has workspace configuration files.
Solution: Install in a clean project directory outside the monorepo:
# Create a new directory
mkdir my-app && cd my-app
# Initialize a new project
npm init -y
# Install the SDK
npm install mcp-obs-sdk
Or if you need to test within the monorepo, use pnpm:
pnpm add mcp-obs-sdk
Make sure you're passing configuration to the MCPTracer constructor:
// ❌ Wrong
const tracer = new MCPTracer();
// ✅ Correct
const tracer = new MCPTracer({
endpoint: 'https://your-api.com/v1/traces',
source: 'my-app',
});
Contributions are welcome! Please see CONTRIBUTING.md for details.
MIT