
trpc.group/trpc-go/trpc-agent-go/examples/runner
This example demonstrates a minimal multi-turn chat interface using the Runner orchestration component. It focuses on core functionality with an in-memory session backend, making it easy to understand and run.
This implementation showcases the essential features for building conversational AI applications:

- Multi-turn conversation with session context preserved across turns
- Tool calling with two working tools (`calculator` and `current_time`)
- Streaming and non-streaming response modes
- Serial and parallel tool execution
- In-memory session storage with no external dependencies
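As a mental model, the orchestration can be sketched as a small in-memory structure that keeps one message history per session ID and routes each turn through an agent function. This is a conceptual sketch only, not the trpc-agent-go API; the real Runner, agent, and session service come from the framework packages.

```go
// Conceptual model only: a tiny "runner" that routes each user turn to an
// agent function and keeps per-session history in memory. The real example
// uses the trpc-agent-go Runner and session packages instead of these types.
package main

import "fmt"

// agentFunc stands in for the LLM-backed agent.
type agentFunc func(history []string, input string) string

// runner keeps one message history per session ID, all in memory.
type runner struct {
	agent    agentFunc
	sessions map[string][]string
}

func newRunner(agent agentFunc) *runner {
	return &runner{agent: agent, sessions: map[string][]string{}}
}

// run appends the user turn to the session, asks the agent, and records the reply.
func (r *runner) run(sessionID, input string) string {
	history := r.sessions[sessionID]
	reply := r.agent(history, input)
	r.sessions[sessionID] = append(history, "user: "+input, "assistant: "+reply)
	return reply
}

func main() {
	echoAgent := func(history []string, input string) string {
		return fmt.Sprintf("you said %q (turn %d)", input, len(history)/2+1)
	}
	r := newRunner(echoAgent)
	fmt.Println(r.run("demo-session", "Hello!"))
	fmt.Println(r.run("demo-session", "Do you remember me?")) // same session, turn 2
}
```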
The example reads the following environment variables:

| Variable | Description | Default Value |
|---|---|---|
| `OPENAI_API_KEY` | API key for the OpenAI model | (empty) |
| `OPENAI_BASE_URL` | Base URL for the OpenAI model API endpoint | `https://api.openai.com/v1` |
| `ANTHROPIC_AUTH_TOKEN` | API key for the Anthropic model | (empty) |
| `ANTHROPIC_BASE_URL` | Base URL for the Anthropic model API endpoint | `https://api.anthropic.com` |
The following command line arguments are supported:

| Argument | Description | Default Value |
|---|---|---|
| `-model` | Name of the model to use | `deepseek-chat` |
| `-variant` | Variant to use when calling the OpenAI provider | `openai` |
| `-streaming` | Enable streaming mode for responses | `true` |
| `-enable-parallel` | Enable parallel tool execution (faster performance) | `false` |
Run the basic example:

```bash
cd examples/runner
export OPENAI_API_KEY="your-api-key-here"
go run .
```

Run with a different model:

```bash
export OPENAI_API_KEY="your-api-key"
go run . -model gpt-4o
```

Run with the DeepSeek variant of the OpenAI provider:

```bash
export OPENAI_API_KEY="your-api-key"
go run . -variant deepseek
```
Choose between streaming and non-streaming responses:
```bash
# Default streaming mode (real-time character output)
go run .

# Non-streaming mode (complete response at once)
go run . -streaming=false
```
When to use each mode:

- **Streaming** (`-streaming=true`, default): best for interactive chat where you want responses to appear in real time, giving immediate feedback and a better user experience (see the sketch after this list).
- **Non-streaming** (`-streaming=false`): better for automated scripts, batch processing, or when you need the complete response before processing it further.
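Conceptually, the difference is whether the caller prints chunks as they arrive or collects the whole reply first. The sketch below uses a plain Go channel as a stand-in for the model's streamed output; it is not the framework's event API.

```go
// Conceptual illustration only: a plain channel stands in for the model's
// streamed response; the real example consumes framework events instead.
package main

import (
	"fmt"
	"strings"
)

// respond emits the reply as a sequence of chunks and then closes the channel.
func respond() <-chan string {
	ch := make(chan string)
	go func() {
		defer close(ch)
		for _, chunk := range []string{"Hello", ", ", "world", "!"} {
			ch <- chunk
		}
	}()
	return ch
}

func main() {
	streaming := true

	if streaming {
		// Streaming: print each chunk as soon as it arrives.
		for chunk := range respond() {
			fmt.Print(chunk)
		}
		fmt.Println()
	} else {
		// Non-streaming: collect everything, then handle the full reply at once.
		var b strings.Builder
		for chunk := range respond() {
			b.WriteString(chunk)
		}
		fmt.Println(b.String())
	}
}
```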
Control how multiple tools are executed when the AI makes multiple tool calls:

```bash
# Default serial tool execution (safe and compatible)
go run .

# Parallel tool execution (faster performance)
go run . -enable-parallel=true
```
When to use each mode:

- **Serial execution** (default): safe and broadly compatible; tools run one at a time in the order they were requested.
- **Parallel execution** (`-enable-parallel=true`): faster when the model issues multiple independent tool calls, since they execute concurrently (see the sketch after this list).
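To make the serial-versus-parallel trade-off concrete, here is a generic Go sketch using goroutines and a WaitGroup. It illustrates the idea only and is not the framework's actual scheduling code.

```go
// Generic illustration of serial vs. parallel tool execution.
// This is not the framework's implementation, just the underlying idea.
package main

import (
	"fmt"
	"sync"
	"time"
)

// slowTool simulates a tool call that takes some time to complete.
func slowTool(name string) string {
	time.Sleep(100 * time.Millisecond)
	return name + " done"
}

func main() {
	calls := []string{"calculator", "current_time"}

	// Serial execution: one tool at a time, in request order.
	for _, c := range calls {
		fmt.Println(slowTool(c))
	}

	// Parallel execution: independent calls run concurrently.
	var wg sync.WaitGroup
	results := make([]string, len(calls))
	for i, c := range calls {
		wg.Add(1)
		go func(i int, c string) {
			defer wg.Done()
			results[i] = slowTool(c)
		}(i, c)
	}
	wg.Wait()
	fmt.Println(results)
}
```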
To see all available command line options:
```bash
go run . --help
```
Output:
```text
Usage of ./runner:
  -enable-parallel
        Enable parallel tool execution (default: false, serial execution)
  -model string
        Name of the model to use (default "deepseek-chat")
  -variant string
        Name of the variant to use when calling the OpenAI provider (default "openai")
  -streaming
        Enable streaming mode for responses (default true)
```
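These options map onto standard library flag definitions. The sketch below shows how flags with these names and defaults could be declared with Go's `flag` package; the variable names are illustrative and not necessarily those used in the example.

```go
// Sketch of declaring the documented flags with the standard flag package.
// Variable names are illustrative; names and defaults match the table above.
package main

import (
	"flag"
	"fmt"
)

func main() {
	modelName := flag.String("model", "deepseek-chat", "Name of the model to use")
	variant := flag.String("variant", "openai", "Name of the variant to use when calling the OpenAI provider")
	streaming := flag.Bool("streaming", true, "Enable streaming mode for responses")
	enableParallel := flag.Bool("enable-parallel", false, "Enable parallel tool execution (default: false, serial execution)")
	flag.Parse()

	fmt.Println(*modelName, *variant, *streaming, *enableParallel)
}
```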
The example includes two working tools:
- `calculator`
- `current_time`

When you ask for calculations or time information, you'll see:
```text
🔧 Tool calls initiated:
   • calculator (ID: call_abc123)
     Args: {"operation":"multiply","a":25,"b":4}

🔄 Executing tools...

✅ Tool response (ID: call_abc123): {"operation":"multiply","a":25,"b":4,"result":100}

🤖 Assistant: I calculated 25 × 4 = 100 for you.
```
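The calculator behavior implied by that exchange can be sketched as a plain Go function taking an operation and two operands. The set of operations beyond `multiply` is an assumption here, and the real example registers the tool through the framework's tool API rather than calling it directly.

```go
// Plain-Go sketch of the calculator behavior shown above. The supported
// operations other than "multiply" are assumptions for illustration.
package main

import (
	"errors"
	"fmt"
)

// calcArgs mirrors the JSON arguments seen in the tool call output.
type calcArgs struct {
	Operation string  `json:"operation"`
	A         float64 `json:"a"`
	B         float64 `json:"b"`
}

func calculate(args calcArgs) (float64, error) {
	switch args.Operation {
	case "add":
		return args.A + args.B, nil
	case "subtract":
		return args.A - args.B, nil
	case "multiply":
		return args.A * args.B, nil
	case "divide":
		if args.B == 0 {
			return 0, errors.New("division by zero")
		}
		return args.A / args.B, nil
	default:
		return 0, fmt.Errorf("unsupported operation %q", args.Operation)
	}
}

func main() {
	result, err := calculate(calcArgs{Operation: "multiply", A: 25, B: 4})
	if err != nil {
		panic(err)
	}
	fmt.Println(result) // 100
}
```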
The interface is simple and intuitive:
```text
🚀 Runner quickstart: multi-turn chat with tools
Model: deepseek-chat
Streaming: true
Parallel tools: false
Session backend: in-memory (simple demo)
Type '/exit' to end the conversation
Available tools: calculator, current_time
==================================================
✅ Chat ready! Session: demo-session-1703123456

👤 You: Hello! How are you today?
🤖 Assistant: Hello! I'm doing well, thank you for asking. I'm here and ready to help you with whatever you need. How are you doing today?

👤 You: I'm good! Can you remember what I just asked you?
🤖 Assistant: Yes, I can! You just asked me how I was doing today, and I responded that I'm doing well. This shows that I'm maintaining context from our conversation. Is there anything specific you'd like to chat about or any way I can help you?

👤 You: /exit
👋 Goodbye!
```
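The read-eval loop behind this interface can be sketched in plain Go: print a prompt, read a line, stop on `/exit`, and otherwise hand the message to the agent. The actual call into the Runner is elided below because it depends on the framework API.

```go
// Plain-Go sketch of the interactive loop: read user input, exit on
// "/exit", otherwise pass the message to the agent (call elided here).
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	scanner := bufio.NewScanner(os.Stdin)
	for {
		fmt.Print("👤 You: ")
		if !scanner.Scan() {
			break // stdin closed
		}
		input := strings.TrimSpace(scanner.Text())
		if input == "" {
			continue
		}
		if input == "/exit" {
			fmt.Println("👋 Goodbye!")
			return
		}
		// Here the real example sends `input` to the Runner and prints
		// the assistant's (possibly streamed) reply.
		fmt.Println("🤖 Assistant: (reply would appear here)")
	}
}
```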
This example uses in-memory session storage for simplicity: conversation history lives only in the running process, nothing is persisted, and sessions are lost when the program exits.
For production use with persistent session storage (Redis, PostgreSQL, MySQL), see the examples/session/ directory, which demonstrates advanced session management features, including:

- the `/use <id>` command
- the `/sessions` command
- the `/new` command