Noveum Trace SDK

CI Release codecov PyPI version Python 3.8+ License: Apache 2.0

Simple, decorator-based tracing SDK for LLM applications and multi-agent systems.

Noveum Trace provides an easy way to add observability to your LLM applications. With simple decorators, you can trace function calls, LLM interactions, agent workflows, and multi-agent coordination patterns.

✨ Key Features

  • 🎯 Decorator-First API - Add tracing with a single @trace decorator
  • 🤖 Multi-Agent Support - Built for multi-agent systems and workflows
  • ☁️ Cloud Integration - Send traces to Noveum platform or custom endpoints
  • 🔌 Framework Agnostic - Works with any Python LLM framework
  • 🚀 Zero Configuration - Works out of the box with sensible defaults
  • 📊 Comprehensive Tracing - Capture function calls, LLM interactions, and agent workflows
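
The decorator-first pattern described above can be sketched in plain Python. The following is a hypothetical, minimal tracer, not the SDK's actual implementation: it records one in-memory "span" per call with the function name, duration, and outcome, standing in for what the real transport would ship to the Noveum platform.

```python
import functools
import time

SPANS = []  # in-memory sink standing in for the cloud transport


def trace(func):
    """Minimal sketch of a tracing decorator: record one span per call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        span = {"name": func.__name__, "start": time.time()}
        try:
            result = func(*args, **kwargs)
            span["status"] = "ok"
            return result
        except Exception:
            span["status"] = "error"
            raise
        finally:
            span["duration"] = time.time() - span["start"]
            SPANS.append(span)
    return wrapper


@trace
def process_document(document_id: str) -> dict:
    return {"status": "processed", "id": document_id}
```

Because the wrapper finishes the span in a `finally` block, timing and status are captured even when the traced function raises.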

🚀 Quick Start

Installation

pip install noveum-trace

Basic Usage

import noveum_trace

# Initialize the SDK
noveum_trace.init(
    api_key="your-api-key",
    project="my-llm-app"
)

# Trace any function
@noveum_trace.trace
def process_document(document_id: str) -> dict:
    # Your function logic here
    return {"status": "processed", "id": document_id}

# Trace LLM calls with automatic metadata capture
@noveum_trace.trace_llm
def call_openai(prompt: str) -> str:
    import openai
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content

# Trace agent workflows
@noveum_trace.trace_agent(agent_id="researcher")
def research_task(query: str) -> dict:
    # Agent logic here
    return {"findings": "...", "confidence": 0.95}

Multi-Agent Example

import noveum_trace

noveum_trace.init(
    api_key="your-api-key",
    project="multi-agent-system"
)

@noveum_trace.trace_agent(agent_id="orchestrator")
def orchestrate_workflow(task: str) -> dict:
    # Coordinate multiple agents
    research_result = research_agent(task)
    analysis_result = analysis_agent(research_result)
    return synthesis_agent(research_result, analysis_result)

@noveum_trace.trace_agent(agent_id="researcher")
def research_agent(task: str) -> dict:
    # Research implementation
    return {"data": "...", "sources": [...]}

@noveum_trace.trace_agent(agent_id="analyst")
def analysis_agent(data: dict) -> dict:
    # Analysis implementation
    return {"insights": "...", "metrics": {...}}

@noveum_trace.trace_agent(agent_id="synthesizer")
def synthesis_agent(research: dict, analysis: dict) -> dict:
    # Synthesis implementation
    return {"summary": "...", "sources": research.get("sources", [])}

🏗️ Architecture

noveum_trace/
├── core/           # Core tracing primitives (Trace, Span, Context)
├── decorators/     # Decorator-based API (@trace, @trace_llm, etc.)
├── transport/      # HTTP transport and batch processing
├── integrations/   # Framework integrations (OpenAI, etc.)
├── utils/          # Utilities (exceptions, serialization, etc.)
└── examples/       # Usage examples

🔧 Configuration

Environment Variables

export NOVEUM_API_KEY="your-api-key"
export NOVEUM_PROJECT="your-project-name"
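
A common resolution order for settings like these is: explicit arguments first, then the environment. The helper below is an illustrative sketch (the function name `resolve_config` is hypothetical, not part of the SDK); it only assumes the two variable names shown above.

```python
import os


def resolve_config(api_key=None, project=None):
    """Explicit arguments take precedence; fall back to the environment."""
    return {
        "api_key": api_key or os.environ.get("NOVEUM_API_KEY"),
        "project": project or os.environ.get("NOVEUM_PROJECT"),
    }
```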

Programmatic Configuration

import noveum_trace
from noveum_trace.core.config import Config

# Basic configuration
noveum_trace.init(
    api_key="your-api-key",
    project="my-project",
    endpoint="https://api.noveum.ai"
)

# Advanced configuration
config = Config(
    api_key="your-api-key",
    project="my-project",
    endpoint="https://api.noveum.ai"
)
config.transport.batch_size = 10
config.transport.batch_timeout = 5.0

noveum_trace.configure(config)
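
To make the two transport knobs concrete: `batch_size` caps how many spans accumulate before a send, and `batch_timeout` caps how long a partial batch may wait. The class below is an illustrative sketch of that flush-on-size-or-timeout pattern, not the SDK's actual transport code.

```python
import time


class BatchBuffer:
    """Illustrative batching: flush when batch_size items accumulate
    or batch_timeout seconds have elapsed since the last flush."""

    def __init__(self, batch_size=10, batch_timeout=5.0, sink=None):
        self.batch_size = batch_size
        self.batch_timeout = batch_timeout
        self.sink = sink or (lambda batch: None)  # e.g. an HTTP send
        self.items = []
        self.last_flush = time.monotonic()

    def add(self, item):
        self.items.append(item)
        if (len(self.items) >= self.batch_size
                or time.monotonic() - self.last_flush >= self.batch_timeout):
            self.flush()

    def flush(self):
        if self.items:
            self.sink(list(self.items))
            self.items.clear()
        self.last_flush = time.monotonic()
```

Batching like this trades a little latency for far fewer network round-trips, which is why both thresholds are configurable.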

🎯 Available Decorators

@trace - General Purpose Tracing

@noveum_trace.trace
def my_function(arg1: str, arg2: int) -> dict:
    return {"result": f"{arg1}_{arg2}"}

@trace_llm - LLM Call Tracing

@noveum_trace.trace_llm
def call_llm(prompt: str) -> str:
    # LLM call implementation
    return response

@trace_agent - Agent Workflow Tracing

@noveum_trace.trace_agent(agent_id="my_agent")
def agent_function(task: str) -> dict:
    # Agent implementation
    return result

@trace_tool - Tool Usage Tracing

@noveum_trace.trace_tool
def search_web(query: str) -> list:
    # Tool implementation
    return results

@trace_retrieval - Retrieval Operation Tracing

@noveum_trace.trace_retrieval
def retrieve_documents(query: str) -> list:
    # Retrieval implementation
    return documents

🔌 Framework Integrations

OpenAI Integration

import noveum_trace
import openai

# Initialize tracing
noveum_trace.init(api_key="your-key", project="openai-app")

@noveum_trace.trace_llm
def chat_with_openai(message: str) -> str:
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": message}]
    )
    return response.choices[0].message.content

🧪 Testing

Run the test suite:

# Install development dependencies
pip install -e ".[dev]"

# Run all tests
pytest

# Run with coverage
pytest --cov=noveum_trace --cov-report=html

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

# Clone the repository
git clone https://github.com/Noveum/noveum-trace.git
cd noveum-trace

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Run examples
python examples/basic_usage.py
python examples/agent_workflow_example.py

📖 Examples

Check out the examples directory for complete working examples.

📄 License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

🙋‍♀️ Support

Built by the Noveum Team
