trpc.group/trpc-go/trpc-agent-go/examples/graph

Document Processing Workflow Example

This example demonstrates how to build and execute a small document-processing workflow using the trpc-agent-go graph package with GraphAgent and Runner. It showcases:

  • Building graphs with StateGraph
  • Creating Function, LLM, and Tools nodes
  • Implementing conditional routing (AddConditionalEdges / AddToolsConditionalEdges)
  • Using schema-backed state management (MessagesStateSchema + reducers)
  • Creating GraphAgent from a compiled Graph
  • Running with Runner and streaming responses
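
The pattern behind those bullets, as a minimal construction sketch; aside from AddConditionalEdges and MessagesStateSchema, which the list above names, the builder methods and node names below are assumptions for illustration rather than the package's documented API:

// Construction sketch. preprocessDocument, formatOutput and routeByComplexity
// are placeholder functions defined later in the example.
import "trpc.group/trpc-go/trpc-agent-go/graph"

func buildWorkflowGraph() (*graph.Graph, error) {
    // Message-oriented schema with reducers (see State Model below).
    schema := graph.MessagesStateSchema()

    sg := graph.NewStateGraph(schema) // assumed constructor name
    sg.AddNode("preprocess", preprocessDocument)
    sg.AddNode("format_output", formatOutput)
    // LLM and Tools nodes are added through their own builder methods and
    // wired up the same way.
    sg.AddConditionalEdges("analyze", routeByComplexity, map[string]string{
        "simple":  "simple_process",
        "complex": "summarize",
    })
    sg.SetEntryPoint("preprocess")

    return sg.Compile()
}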

Features

The example implements a document processing pipeline that:

  • Preprocesses input documents
  • Analyzes document complexity using LLM
  • Routes based on complexity (simple vs complex)
  • Processes documents differently based on complexity
  • Assesses quality of processed content
  • Enhances low-quality content using LLM
  • Formats final output with statistics

State Model

This example uses the message-oriented schema returned by graph.MessagesStateSchema(), which includes the following keys:

  • messages (managed by LLM/tools nodes through reducers)
  • user_input
  • last_response
  • node_responses (per-node textual outputs)

For the example workflow we also track:

  • document_length, word_count, complexity_level
  • node_execution_history (added via callbacks for formatting stats)
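
A function node fills those extra keys by returning a partial state update that the schema's reducers merge in. A sketch; the NodeFunc shape and the graph.State map type are assumptions here:

// Preprocess node sketch: reads user_input, writes the custom keys above.
import (
    "context"
    "strings"

    "trpc.group/trpc-go/trpc-agent-go/graph"
)

func preprocessDocument(ctx context.Context, state graph.State) (any, error) {
    input, _ := state["user_input"].(string)
    doc := strings.TrimSpace(input)
    // Returned keys are merged into the overall state by the reducers.
    return graph.State{
        "document_length": len(doc),
        "word_count":      len(strings.Fields(doc)),
    }, nil
}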

Architecture

The workflow uses a Graph + GraphAgent + Runner architecture:

  • Graph: nodes and edges, built and compiled from a StateGraph
  • GraphAgent: wraps the graph + executor
  • Runner: sessions, event streaming, persistence
  • Function/LLM/Tools nodes: core processing primitives
  • Conditional routing: route via AddConditionalEdges/AddToolsConditionalEdges
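
Wiring those layers together looks roughly like the following. The graphagent and runner constructors, the Run signature, and the event handling are assumptions based on the layering above, not quoted from the packages:

// Execution sketch: wrap the compiled graph in a GraphAgent, then drive it
// through a Runner session and stream the resulting events.
// Assumed imports: fmt plus the graphagent, runner and model packages
// under trpc.group/trpc-go/trpc-agent-go.
ga, err := graphagent.New("document-workflow", g) // g is the compiled graph
if err != nil {
    return err
}
r := runner.NewRunner("document-workflow-app", ga)

events, err := r.Run(ctx, "user-1", "session-1",
    model.NewUserMessage("Please process this document: ..."))
if err != nil {
    return err
}
for ev := range events {
    fmt.Println(ev) // print streamed node/LLM output as it arrives
}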

Usage

Run with default examples:

go run .

Run in interactive mode:

go run . -interactive

Use a different model:

go run . -model "gpt-4"
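
Both switches can be read with the standard flag package; a sketch (the default model value and the helper names are illustrative):

// Command-line flags behind the commands above.
import "flag"

var (
    interactive = flag.Bool("interactive", false, "run in interactive mode")
    modelName   = flag.String("model", "gpt-4o-mini", "name of the model to use")
)

func main() {
    flag.Parse()
    if *interactive {
        runInteractiveMode(*modelName) // hypothetical helper
        return
    }
    runDefaultExamples(*modelName) // hypothetical helper
}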

Runtime State Keys

Important keys used by the example:

  • messages, user_input, last_response
  • document_length, word_count, complexity_level
  • node_execution_history (for stats), error_count (optional)

Example Workflow

User Input (via Runner)
      ↓
   Preprocess
      ↓
   Analyze (LLM)
      ↓
 Route by Complexity
     ↙     ↘
Simple     Complex
Process   Summarize (LLM)
     ↘     ↙
   Assess Quality
      ↓
 Route by Quality
     ↙     ↘
  Good     Poor
    ↓     Enhance (LLM)
    ↓        ↓
   Format Output
      ↓
   Final Result
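
Each routing point in the diagram is an ordinary function that inspects the state and returns a branch label, which AddConditionalEdges maps to the next node. A sketch for the complexity route; the signature is assumed:

// Routing sketch for the "Route by Complexity" step.
func routeByComplexity(ctx context.Context, state graph.State) (string, error) {
    if level, _ := state["complexity_level"].(string); level == "complex" {
        return "complex", nil // continue to Summarize (LLM)
    }
    return "simple", nil // continue to Simple Process
}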

Interactive Mode

In interactive mode, you can:

  • Process custom documents by pasting content
  • See real-time workflow execution
  • View processing statistics
  • Type help for available commands
  • Type exit to quit
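
Under the hood this is a plain read-eval loop over stdin; a sketch, where processDocument stands in for the call that runs the workflow:

// Interactive loop sketch using only the standard library.
// Assumed imports: bufio, fmt, os, strings.
scanner := bufio.NewScanner(os.Stdin)
for {
    fmt.Print("> ")
    if !scanner.Scan() {
        break
    }
    switch line := strings.TrimSpace(scanner.Text()); line {
    case "exit":
        return
    case "help":
        fmt.Println("Paste document content to process it; type 'exit' to quit.")
    default:
        processDocument(ctx, line) // hypothetical: runs the workflow on the pasted text
    }
}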

Customization

To customize the workflow:

  • Add nodes by implementing NodeFunc
  • Modify conditional routing functions
  • Extend StateSchema with custom fields/reducers
  • Adjust prompts for LLM nodes
  • Add tools with function.NewFunctionTool
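
For the last point, a tool is a typed Go function handed to the Tools node; a sketch, where the option names and the exact NewFunctionTool signature are assumptions:

// Word-count tool sketch for a Tools node.
// Assumed imports: context, strings, and the tool/function package.
type countWordsReq struct {
    Text string `json:"text"`
}
type countWordsRsp struct {
    Words int `json:"words"`
}

wordCountTool := function.NewFunctionTool(
    func(ctx context.Context, req countWordsReq) (countWordsRsp, error) {
        return countWordsRsp{Words: len(strings.Fields(req.Text))}, nil
    },
    function.WithName("count_words"),
    function.WithDescription("Counts the words in a piece of text."),
)
// Register wordCountTool on the Tools node when building the graph and route
// to it with AddToolsConditionalEdges.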

Requirements

  • Go 1.21 or later
  • Valid API key and base URL for your model provider (OpenAI-compatible)
  • Network connectivity for LLM calls

Tip: if OPENAI_API_KEY is not set, the example prints a hint.
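
That check is a plain environment-variable lookup; a sketch, with illustrative wording for the hint:

// Assumed imports: fmt, os.
if os.Getenv("OPENAI_API_KEY") == "" {
    fmt.Println("Hint: OPENAI_API_KEY is not set; LLM nodes will not be able to call the model.")
}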
