
Simple, decorator-based tracing SDK for LLM applications and multi-agent systems.
Noveum Trace provides an easy way to add observability to your LLM applications. With simple decorators, you can trace function calls, LLM interactions, agent workflows, and multi-agent coordination patterns.
Install the SDK from PyPI, then add the @trace decorator to any function:

pip install noveum-trace
import noveum_trace

# Initialize the SDK
noveum_trace.init(
    api_key="your-api-key",
    project="my-llm-app"
)

# Trace any function
@noveum_trace.trace
def process_document(document_id: str) -> dict:
    # Your function logic here
    return {"status": "processed", "id": document_id}

# Trace LLM calls with automatic metadata capture
@noveum_trace.trace_llm
def call_openai(prompt: str) -> str:
    import openai

    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content

# Trace agent workflows
@noveum_trace.trace_agent(agent_id="researcher")
def research_task(query: str) -> dict:
    # Agent logic here
    return {"findings": "...", "confidence": 0.95}
For multi-agent systems, decorate each agent with its own agent_id:

import noveum_trace

noveum_trace.init(
    api_key="your-api-key",
    project="multi-agent-system"
)

@noveum_trace.trace_agent(agent_id="orchestrator")
def orchestrate_workflow(task: str) -> dict:
    # Coordinate multiple agents
    research_result = research_agent(task)
    analysis_result = analysis_agent(research_result)
    return synthesis_agent(research_result, analysis_result)

@noveum_trace.trace_agent(agent_id="researcher")
def research_agent(task: str) -> dict:
    # Research implementation
    return {"data": "...", "sources": [...]}

@noveum_trace.trace_agent(agent_id="analyst")
def analysis_agent(data: dict) -> dict:
    # Analysis implementation
    return {"insights": "...", "metrics": {...}}
The package is organized as follows:

noveum_trace/
├── core/           # Core tracing primitives (Trace, Span, Context)
├── decorators/     # Decorator-based API (@trace, @trace_llm, etc.)
├── transport/      # HTTP transport and batch processing
├── integrations/   # Framework integrations (OpenAI, etc.)
├── utils/          # Utilities (exceptions, serialization, etc.)
└── examples/       # Usage examples
The SDK can be configured through environment variables:

export NOVEUM_API_KEY="your-api-key"
export NOVEUM_PROJECT="your-project-name"
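For example, you can read these variables in your application and forward them to init(); a minimal sketch using only the init arguments shown in the quick start:

import os
import noveum_trace

# Read the environment variables shown above and pass them to init().
noveum_trace.init(
    api_key=os.environ["NOVEUM_API_KEY"],
    project=os.environ["NOVEUM_PROJECT"],
)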
import noveum_trace
from noveum_trace.core.config import Config

# Basic configuration
noveum_trace.init(
    api_key="your-api-key",
    project="my-project",
    endpoint="https://api.noveum.ai"
)

# Advanced configuration
config = Config(
    api_key="your-api-key",
    project="my-project",
    endpoint="https://api.noveum.ai"
)
config.transport.batch_size = 10
config.transport.batch_timeout = 5.0
noveum_trace.configure(config)
The SDK provides a decorator for each kind of work you may want to trace:

@noveum_trace.trace
def my_function(arg1: str, arg2: int) -> dict:
    return {"result": f"{arg1}_{arg2}"}

@noveum_trace.trace_llm
def call_llm(prompt: str) -> str:
    # LLM call implementation
    return response

@noveum_trace.trace_agent(agent_id="my_agent")
def agent_function(task: str) -> dict:
    # Agent implementation
    return result

@noveum_trace.trace_tool
def search_web(query: str) -> list:
    # Tool implementation
    return results

@noveum_trace.trace_retrieval
def retrieve_documents(query: str) -> list:
    # Retrieval implementation
    return documents
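These decorators compose; for instance, a retrieval-augmented pipeline might combine them as follows. This is a minimal sketch using only the decorators listed above, with placeholder function bodies:

import noveum_trace

noveum_trace.init(api_key="your-api-key", project="rag-pipeline")

@noveum_trace.trace_retrieval
def retrieve(query: str) -> list:
    # Placeholder retrieval step, e.g. a vector-store lookup
    return ["doc snippet 1", "doc snippet 2"]

@noveum_trace.trace_llm
def answer(query: str, context: list) -> str:
    # Placeholder LLM call that would use the retrieved context
    return f"Answer to {query!r} based on {len(context)} snippets"

@noveum_trace.trace
def ask(query: str) -> str:
    # Top-level entry point: retrieval followed by generation
    return answer(query, retrieve(query))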
Example: tracing OpenAI chat completions.

import noveum_trace
import openai

# Initialize tracing
noveum_trace.init(api_key="your-key", project="openai-app")

@noveum_trace.trace_llm
def chat_with_openai(message: str) -> str:
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": message}]
    )
    return response.choices[0].message.content
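Calling the wrapped function behaves like a plain OpenAI call, while the decorator captures the LLM metadata described in the quick start:

# Requires OPENAI_API_KEY in the environment for the OpenAI client.
print(chat_with_openai("Give me one sentence about distributed tracing."))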
Run the test suite:
# Install development dependencies
pip install -e ".[dev]"
# Run all tests
pytest
# Run with coverage
pytest --cov=noveum_trace --cov-report=html
We welcome contributions! Please see our Contributing Guide for details.
# Clone the repository
git clone https://github.com/Noveum/noveum-trace.git
cd noveum-trace
# Install in development mode
pip install -e ".[dev]"
# Run tests
pytest
# Run examples
python examples/basic_usage.py
python examples/agent_workflow_example.py
Check out the examples directory for complete working examples.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Built by the Noveum Team