ChatRoutes Python SDK


Official Python SDK for the ChatRoutes API, a conversation management platform with advanced branching capabilities.

⚠️ Beta Release: ChatRoutes is currently in beta. The API may change without backward-compatibility guarantees, so use it with caution in production environments.

🚀 Try It Now!

Quickstart (5-10 minutes)

Want to try ChatRoutes immediately? Click the badge below to open an interactive notebook in Google Colab:

Open In Colab

No installation required - just run the cells and start experimenting!

Complete Feature Demo (20-30 minutes)

Want to see ALL features in action? Watch branching, checkpoints, tree visualization, and immutability:

Open In Colab

Perfect for sales demos, presentations, and comprehensive exploration!

Installation

pip install chatroutes

Getting Started

1. Get Your API Key

IMPORTANT: Before you can use ChatRoutes, you must obtain an API key:

  • Visit chatroutes.com
  • Sign up for a free account
  • Open the Dashboard and navigate to the API section
  • Generate your API key
  • Copy and save your API key securely
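Rather than hard-coding the key from the steps above, you can read it from an environment variable. This is a minimal sketch; the variable name CHATROUTES_API_KEY is a convention assumed here, not something the SDK requires:

```python
import os

def load_api_key(env_var: str = "CHATROUTES_API_KEY") -> str:
    """Read the ChatRoutes API key from the environment, failing fast if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before creating the client")
    return key
```

You would then create the client with `ChatRoutes(api_key=load_api_key())`.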

2. Quick Start

from chatroutes import ChatRoutes

client = ChatRoutes(api_key="your-api-key")

conversation = client.conversations.create({
    'title': 'My First Conversation',
    'model': 'gpt-5'  # or 'claude-opus-4-1', 'claude-sonnet-4-5', etc.
})

response = client.messages.send(
    conversation['id'],
    {
        'content': 'Hello, how are you?',
        'model': 'gpt-5'
    }
)

print(response['message']['content'])

Supported Models

ChatRoutes currently supports the following AI models:

OpenAI:

  • gpt-5 (default) - OpenAI's GPT-5

Anthropic Claude 4:

  • claude-opus-4-1 - Claude Opus 4.1 (most capable)
  • claude-opus-4 - Claude Opus 4
  • claude-opus-4-0 - Claude Opus 4.0
  • claude-sonnet-4-5 - Claude Sonnet 4.5 (best for coding)
  • claude-sonnet-4-0 - Claude Sonnet 4.0

Anthropic Claude 3:

  • claude-3-7-sonnet-latest - Claude 3.7 Sonnet (latest)
  • claude-3-5-haiku-latest - Claude 3.5 Haiku (fastest)

Important: Use these exact model names. Other model names (e.g., gpt-4o, gpt-4o-mini, claude-sonnet-4) are not supported and will result in an error.
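Since unsupported names only fail once the request reaches the API, a client-side guard built from the list above can catch typos earlier. The SDK does not ship this helper; it is a sketch:

```python
# Exact model identifiers ChatRoutes accepts, per the list above.
SUPPORTED_MODELS = {
    "gpt-5",
    "claude-opus-4-1",
    "claude-opus-4",
    "claude-opus-4-0",
    "claude-sonnet-4-5",
    "claude-sonnet-4-0",
    "claude-3-7-sonnet-latest",
    "claude-3-5-haiku-latest",
}

def validate_model(name: str) -> str:
    """Raise early on an unsupported model name instead of waiting for an API error."""
    if name not in SUPPORTED_MODELS:
        raise ValueError(f"Unsupported model: {name!r} (see supported list)")
    return name
```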

Features

  • Conversation Management: Create, list, update, and delete conversations
  • Message Handling: Send messages with support for streaming responses
  • Branch Operations: Create and manage conversation branches for exploring alternatives
  • AutoBranch 🆕: AI-powered automatic detection of branching opportunities in conversations
  • Checkpoint Management: Save and restore conversation context at specific points
  • Type Safety: Full type hints using TypedDict for better IDE support
  • Error Handling: Comprehensive exception hierarchy for different error scenarios
  • Retry Logic: Built-in exponential backoff retry mechanism

Usage Examples

Creating a Conversation

conversation = client.conversations.create({
    'title': 'Product Discussion',
    'model': 'gpt-5'
})

Sending Messages

response = client.messages.send(
    conversation_id='conv_123',
    data={
        'content': 'What are the key features?',
        'model': 'gpt-5',
        'temperature': 0.7
    }
)

print(response['message']['content'])  # AI response
print(f"Tokens used: {response['usage']['totalTokens']}")

Streaming Responses

def on_chunk(chunk):
    if chunk.get('type') == 'content' and chunk.get('content'):
        print(chunk['content'], end='', flush=True)

def on_complete(message):
    print(f"\n\nMessage ID: {message['id']}")

client.messages.stream(
    conversation_id='conv_123',
    data={'content': 'Tell me a story'},
    on_chunk=on_chunk,
    on_complete=on_complete
)

Working with Branches

branch = client.branches.create(
    conversation_id='conv_123',
    data={
        'title': 'Alternative Response',
        'contextMode': 'FULL'
    }
)

fork = client.branches.fork(
    conversation_id='conv_123',
    data={
        'forkPointMessageId': 'msg_456',
        'title': 'Exploring Different Approach'
    }
)

Listing Conversations

result = client.conversations.list({
    'page': 1,
    'limit': 10,
    'filter': 'all'
})

for conv in result['data']:
    print(f"{conv['title']} - {conv['createdAt']}")
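To walk more than one page of results, the fields shown above ('page', 'limit', 'data') are enough to sketch a pager. This assumes a page shorter than the limit marks the end; the real API may also expose totals or cursors, which this sketch does not rely on:

```python
def iter_all_conversations(client, page_size: int = 10):
    """Yield every conversation by requesting pages until a short page is returned."""
    page = 1
    while True:
        result = client.conversations.list({
            'page': page,
            'limit': page_size,
            'filter': 'all',
        })
        batch = result['data']
        yield from batch
        if len(batch) < page_size:
            break  # last page reached
        page += 1
```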

AutoBranch - AI-Powered Branch Detection 🆕

AutoBranch automatically detects opportunities for conversation branching using AI:

# Check AutoBranch service health
health = client.autobranch.health()
print(f"Status: {health['status']}")

# Analyze text for potential branch points
text = """
I need help with billing and also have a technical question about the API.
Can you help with both?
"""

suggestions = client.autobranch.suggest_branches(
    text=text,
    suggestions_count=3,
    hybrid_detection=False,  # Set True to use LLM enhancement
    threshold=0.7
)

# Review suggestions
for suggestion in suggestions['suggestions']:
    print(f"Branch: {suggestion['title']}")
    print(f"  Description: {suggestion['description']}")
    print(f"  Confidence: {suggestion['confidence']:.0%}")
    print(f"  Trigger: '{suggestion['triggerText']}'")

# Alternative: Use analyze_text alias
analysis = client.autobranch.analyze_text(
    text="How do I reset my password?",
    suggestions_count=2
)

AutoBranch Use Cases:

  • Customer Support: Auto-detect when users need technical vs billing help
  • Multi-Intent Detection: Identify when users ask multiple questions
  • Smart Routing: Route conversations to appropriate teams/bots
  • Quality Assurance: Ensure all customer needs are addressed
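The smart-routing use case can be sketched on top of the suggestion payload shown above. The keyword-to-team table below is purely hypothetical, and the substring match is deliberately naive (a production router would want word-level or intent-based matching):

```python
# Hypothetical routing table; these team names are not part of the SDK.
TEAM_KEYWORDS = {
    'billing': 'billing-team',
    'api': 'engineering-team',
    'password': 'support-team',
}

def route_suggestion(suggestion: dict, default: str = 'general-queue') -> str:
    """Map a branch suggestion (using its 'title' field) to a destination team."""
    title = suggestion['title'].lower()
    for keyword, team in TEAM_KEYWORDS.items():
        if keyword in title:
            return team
    return default
```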

Managing Checkpoints

Checkpoints allow you to save conversation context at specific points and manage long conversations efficiently:

branches = client.branches.list(conversation_id='conv_123')
main_branch = next(b for b in branches if b['isMain'])

checkpoint = client.checkpoints.create(
    conversation_id='conv_123',
    branch_id=main_branch['id'],
    anchor_message_id='msg_456'
)

print(f"Checkpoint created: {checkpoint['id']}")
print(f"Summary: {checkpoint['summary']}")
print(f"Token count: {checkpoint['token_count']}")

checkpoints = client.checkpoints.list('conv_123')
for cp in checkpoints:
    print(f"{cp['id']}: {cp['summary']}")

response = client.messages.send(
    conversation_id='conv_123',
    data={'content': 'Continue the conversation'}
)

metadata = response['message'].get('metadata', {})
if metadata.get('checkpoint_used'):
    print("Checkpoint was used for this response")
    print(f"Context messages: {metadata.get('context_message_count')}")

Error Handling

The SDK provides specific exception types for different error scenarios:

from chatroutes import (
    ChatRoutesError,
    AuthenticationError,
    RateLimitError,
    ValidationError,
    NotFoundError,
    ServerError
)

try:
    conversation = client.conversations.get('conv_123')
except AuthenticationError:
    print("Invalid API key")
except NotFoundError:
    print("Conversation not found")
except RateLimitError as e:
    print(f"Rate limited. Retry after {e.retry_after} seconds")
except ChatRoutesError as e:
    print(f"Error: {e.message}")
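If you need retry behavior beyond the client's built-in backoff, the retry_after attribute shown above is enough to sketch a generic wrapper. In real use you would pass RateLimitError as the retryable type; the fallback exponential delay here is an assumption, not SDK behavior:

```python
import time

def call_with_retry(fn, *, retryable=(Exception,), max_attempts=3):
    """Retry fn on retryable errors, honoring a retry_after attribute when present."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable as exc:
            if attempt == max_attempts:
                raise  # out of attempts; surface the last error
            retry_after = getattr(exc, 'retry_after', None)
            time.sleep(retry_after if retry_after is not None else 2 ** attempt)
```

For example: `call_with_retry(lambda: client.conversations.get('conv_123'), retryable=(RateLimitError,))`.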

API Reference

ChatRoutes Client

client = ChatRoutes(
    api_key="your-api-key",
    base_url="https://api.chatroutes.com/api/v1",  # optional
    timeout=30,  # optional, in seconds
    retry_attempts=3,  # optional
    retry_delay=1.0  # optional, in seconds
)

Conversations Resource

  • create(data: CreateConversationRequest) -> Conversation
  • list(params: ListConversationsParams) -> PaginatedResponse
  • get(conversation_id: str) -> Conversation
  • update(conversation_id: str, data: dict) -> Conversation
  • delete(conversation_id: str) -> None
  • get_tree(conversation_id: str) -> ConversationTree
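get_tree returns a ConversationTree of TreeNodes. Assuming each node carries 'title' and 'children' keys (an assumption for illustration; the field names are not documented here), a depth-first walk might look like:

```python
def flatten_tree(node: dict, depth: int = 0) -> list[str]:
    """Render a conversation tree as indented lines, one per node."""
    lines = ['  ' * depth + str(node.get('title') or node.get('id'))]
    for child in node.get('children', []):
        lines.extend(flatten_tree(child, depth + 1))
    return lines
```

You might then call `print("\n".join(flatten_tree(client.conversations.get_tree('conv_123'))))`.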

Messages Resource

  • send(conversation_id: str, data: SendMessageRequest) -> SendMessageResponse
  • stream(conversation_id: str, data: SendMessageRequest, on_chunk: Callable, on_complete: Callable) -> None
  • list(conversation_id: str, branch_id: str) -> List[Message]
  • update(message_id: str, content: str) -> Message
  • delete(message_id: str) -> None

Branches Resource

  • list(conversation_id: str) -> List[Branch]
  • create(conversation_id: str, data: CreateBranchRequest) -> Branch
  • fork(conversation_id: str, data: ForkConversationRequest) -> Branch
  • update(conversation_id: str, branch_id: str, data: dict) -> Branch
  • delete(conversation_id: str, branch_id: str) -> None
  • get_messages(conversation_id: str, branch_id: str) -> List[Message]
  • merge(conversation_id: str, branch_id: str) -> Branch

Checkpoints Resource

  • list(conversation_id: str, branch_id: Optional[str] = None) -> List[Checkpoint]
  • create(conversation_id: str, branch_id: str, anchor_message_id: str) -> Checkpoint
  • delete(checkpoint_id: str) -> None
  • recreate(checkpoint_id: str) -> Checkpoint

AutoBranch Resource 🆕

  • suggest_branches(text: str, suggestions_count: int = 3, hybrid_detection: bool = False, threshold: float = 0.7, llm_model: Optional[str] = None, llm_provider: Optional[str] = None, llm_api_key: Optional[str] = None) -> SuggestBranchesResponse
  • analyze_text(text: str, suggestions_count: int = 3, hybrid_detection: bool = False, threshold: float = 0.7, llm_model: Optional[str] = None) -> SuggestBranchesResponse
  • health() -> HealthResponse

Type Definitions

The SDK includes comprehensive type definitions using TypedDict:

  • Conversation
  • Message
  • MessageMetadata (includes checkpoint-related fields)
  • Branch
  • Checkpoint
  • CreateConversationRequest
  • SendMessageRequest
  • SendMessageResponse
  • CreateBranchRequest
  • ForkConversationRequest
  • CheckpointCreateRequest
  • CheckpointListResponse
  • ConversationTree
  • TreeNode
  • ListConversationsParams
  • PaginatedResponse
  • StreamChunk
  • BranchPoint 🆕
  • BranchSuggestion 🆕
  • SuggestionMetadata 🆕
  • SuggestBranchesRequest 🆕
  • SuggestBranchesResponse 🆕
  • HealthResponse 🆕
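Because these are TypedDicts, plain dicts type-check against them and mypy can verify key access. The local Conversation below mirrors the pattern for illustration only; it is not the SDK's own definition, and its fields are an assumption:

```python
from typing import TypedDict

class Conversation(TypedDict):
    """Illustrative stand-in for the SDK's Conversation TypedDict."""
    id: str
    title: str
    model: str

def summarize(conv: Conversation) -> str:
    """A typed helper: mypy will flag misspelled keys like conv['titel']."""
    return f"{conv['title']} ({conv['model']})"
```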

Development

Setup

git clone https://github.com/chatroutes/chatroutes-python-sdk.git
cd chatroutes-python-sdk
pip install -e ".[dev]"

Running Tests

pytest

Type Checking

mypy chatroutes

Code Formatting

black chatroutes

License

MIT License - see LICENSE file for details
