# DSPy-Go

## What is DSPy-Go?

DSPy-Go is a native Go implementation of the DSPy framework, bringing systematic prompt engineering and automated reasoning capabilities to Go applications. It provides a flexible, idiomatic framework for building reliable and effective Large Language Model (LLM) applications through composable modules and workflows.
## Key Features
- **Modular Architecture**: Build complex LLM applications by composing simple, reusable components
- **Systematic Prompt Engineering**: Optimize prompts automatically based on examples and feedback
- **Flexible Workflows**: Chain, branch, and orchestrate LLM operations with powerful workflow abstractions
- **Multiple LLM Providers**: Support for Anthropic Claude, Google Gemini, Ollama, and LlamaCPP
- **Advanced Reasoning Patterns**: Implement chain-of-thought, ReAct, and other reasoning techniques
## Installation

```bash
go get github.com/selfreliantob/dspy-go
```
## Quick Start

Here's a simple example to get you started with DSPy-Go:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/selfreliantob/dspy-go/pkg/config"
	"github.com/selfreliantob/dspy-go/pkg/core"
	"github.com/selfreliantob/dspy-go/pkg/llms"
	"github.com/selfreliantob/dspy-go/pkg/modules"
)

func main() {
	// Register the built-in LLM providers and configure a default model.
	llms.EnsureFactory()
	if err := config.ConfigureDefaultLLM("your-api-key", core.ModelAnthropicSonnet); err != nil {
		log.Fatalf("Failed to configure LLM: %v", err)
	}

	// A signature declares the module's input and output fields.
	signature := core.NewSignature(
		[]core.InputField{{Field: core.Field{Name: "question"}}},
		[]core.OutputField{{Field: core.Field{Name: "answer"}}},
	)

	// Wrap the signature in a chain-of-thought module.
	cot := modules.NewChainOfThought(signature)

	// A program ties named modules together with an execution function.
	program := core.NewProgram(
		map[string]core.Module{"cot": cot},
		func(ctx context.Context, inputs map[string]interface{}) (map[string]interface{}, error) {
			return cot.Process(ctx, inputs)
		},
	)

	result, err := program.Execute(context.Background(), map[string]interface{}{
		"question": "What is the capital of France?",
	})
	if err != nil {
		log.Fatalf("Error executing program: %v", err)
	}

	fmt.Printf("Answer: %s\n", result["answer"])
}
```
## Core Concepts
DSPy-Go is built around several key concepts that work together to create powerful LLM applications:
### Signatures
Signatures define the input and output fields for modules, creating a clear contract for what a module expects and produces.
```go
signature := core.NewSignature(
	[]core.InputField{
		{Field: core.Field{Name: "document", Description: "The document to summarize"}},
	},
	[]core.OutputField{
		{Field: core.Field{Name: "summary", Description: "A concise summary of the document"}},
		{Field: core.Field{Name: "key_points", Description: "The main points from the document"}},
	},
)
```
Signatures can include field descriptions that enhance prompt clarity and improve LLM performance.
### Modules

Modules are the building blocks of DSPy-Go programs. They encapsulate specific functionality and can be composed into complex pipelines. Key modules include:
#### Predict
The simplest module that makes direct predictions using an LLM.
```go
predict := modules.NewPredict(signature)
result, err := predict.Process(ctx, map[string]interface{}{
	"document": "Long document text here...",
})
```
#### ChainOfThought
Implements chain-of-thought reasoning, which guides the LLM to break down complex problems into intermediate steps.
```go
cot := modules.NewChainOfThought(signature)
result, err := cot.Process(ctx, map[string]interface{}{
	"question": "Solve 25 × 16 step by step.",
})
```
#### ReAct
Implements the Reasoning and Acting paradigm, allowing LLMs to use tools to solve problems.
```go
calculator := tools.NewCalculatorTool()
searchTool := tools.NewSearchTool()
react := modules.NewReAct(signature, []core.Tool{calculator, searchTool})
result, err := react.Process(ctx, map[string]interface{}{
	"question": "What is the population of France divided by 1000?",
})
```
### Programs
Programs combine modules into executable workflows. They define how inputs flow through the system and how outputs are produced.
```go
program := core.NewProgram(
	map[string]core.Module{
		"retriever": retriever,
		"generator": generator,
	},
	func(ctx context.Context, inputs map[string]interface{}) (map[string]interface{}, error) {
		// First retrieve relevant documents...
		retrieverResult, err := retriever.Process(ctx, inputs)
		if err != nil {
			return nil, err
		}

		// ...then generate an answer from the question plus the retrieved documents.
		generatorInputs := map[string]interface{}{
			"question":  inputs["question"],
			"documents": retrieverResult["documents"],
		}
		return generator.Process(ctx, generatorInputs)
	},
)
```
### Optimizers
Optimizers help improve the performance of your DSPy-Go programs by automatically tuning prompts and module parameters.
#### BootstrapFewShot
Automatically selects high-quality examples for few-shot learning.
```go
// Create a dataset of labeled examples.
dataset := datasets.NewInMemoryDataset()
dataset.AddExample(map[string]interface{}{
	"question": "What is the capital of France?",
	"answer":   "The capital of France is Paris.",
})

// Optimize the module against an exact-match metric.
optimizer := optimizers.NewBootstrapFewShot(dataset, metrics.NewExactMatchMetric("answer"))
optimizedModule, err := optimizer.Optimize(ctx, originalModule)
```
#### MIPRO and Copro
More advanced optimizers for multi-step interactive prompt optimization (MIPRO) and collaborative prompt optimization (Copro).
```go
mipro := optimizers.NewMIPRO(dataset, metrics.NewRougeMetric("answer"))
optimizedModule, err := mipro.Optimize(ctx, originalModule)
```
## Agents and Workflows
DSPy-Go provides powerful abstractions for building more complex agent systems.
### Memory
Different memory implementations for tracking conversation history.
```go
memory := memory.NewBufferMemory(10)
memory.Add(context.Background(), "user", "Hello, how can you help me?")
memory.Add(context.Background(), "assistant", "I can answer questions and help with tasks. What do you need?")
history, err := memory.Get(context.Background())
```
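`NewBufferMemory(10)` suggests a buffer that retains only the most recent messages, evicting the oldest once the capacity is reached. A minimal standalone sketch of that idea (plain Go, not the library's actual implementation):

```go
package main

import "fmt"

// message mirrors the (role, content) pairs passed to memory.Add above.
type message struct{ role, content string }

// bufferMemory keeps only the most recent max messages — the usual behavior
// of a fixed-size conversation buffer. This is an illustrative sketch, not
// DSPy-Go's internal implementation.
type bufferMemory struct {
	max      int
	messages []message
}

func (b *bufferMemory) Add(role, content string) {
	b.messages = append(b.messages, message{role, content})
	if len(b.messages) > b.max {
		// Drop the oldest entries so only max messages remain.
		b.messages = b.messages[len(b.messages)-b.max:]
	}
}

func main() {
	m := &bufferMemory{max: 2}
	m.Add("user", "one")
	m.Add("assistant", "two")
	m.Add("user", "three") // the oldest message ("one") is evicted
	for _, msg := range m.messages {
		fmt.Println(msg.role, msg.content)
	}
}
```

A larger capacity keeps more context in the prompt at the cost of tokens; the right size depends on your model's context window.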
### Workflows

#### Chain Workflow
Sequential execution of steps:
```go
workflow := workflows.NewChainWorkflow(store)
workflow.AddStep(&workflows.Step{
	ID:     "step1",
	Module: modules.NewPredict(signature1),
})
workflow.AddStep(&workflows.Step{
	ID:     "step2",
	Module: modules.NewPredict(signature2),
})
result, err := workflow.Execute(ctx, inputs)
```
#### Configurable Retry Logic
Each workflow step can be configured with retry logic:
```go
step := &workflows.Step{
	ID:     "retry_example",
	Module: myModule,
	// Retry up to 3 times, doubling the wait after each failed attempt.
	RetryConfig: &workflows.RetryConfig{
		MaxAttempts:       3,
		BackoffMultiplier: 2.0,
		InitialBackoff:    time.Second,
	},
	// Only execute the step when this predicate holds for the current state.
	Condition: func(state map[string]interface{}) bool {
		return someCondition(state)
	},
}
```
### Orchestrator
Flexible task decomposition and execution:
```go
orchestrator := agents.NewOrchestrator()
researchTask := agents.NewTask("research", researchModule)
summarizeTask := agents.NewTask("summarize", summarizeModule)
orchestrator.AddTask(researchTask)
orchestrator.AddTask(summarizeTask)
result, err := orchestrator.Execute(ctx, map[string]interface{}{
	"topic": "Climate change impacts",
})
```
## Working with Different LLM Providers
DSPy-Go supports multiple LLM providers out of the box:
```go
// Anthropic Claude
llm, err := llms.NewAnthropicLLM("api-key", core.ModelAnthropicSonnet)

// Google Gemini
llm, err = llms.NewGeminiLLM("api-key", "gemini-pro")

// Ollama (local)
llm, err = llms.NewOllamaLLM("http://localhost:11434", "ollama:llama2")

// LlamaCPP (local)
llm, err = llms.NewLlamacppLLM("http://localhost:8080")

// Use the chosen provider as the default, or attach it to a single module.
llms.SetDefaultLLM(llm)
myModule.SetLLM(llm)
```
## Advanced Features

### Tracing and Logging
DSPy-Go includes detailed tracing and structured logging for debugging and optimization:
```go
// Attach execution state to the context so module calls are traced.
ctx = core.WithExecutionState(context.Background())

// Configure structured logging.
logger := logging.NewLogger(logging.Config{
	Severity: logging.DEBUG,
	Outputs:  []logging.Output{logging.NewConsoleOutput(true)},
})
logging.SetLogger(logger)

// Inspect the recorded steps after execution.
executionState := core.GetExecutionState(ctx)
steps := executionState.GetSteps("moduleId")
for _, step := range steps {
	fmt.Printf("Step: %s, Duration: %s\n", step.Name, step.Duration)
	fmt.Printf("Prompt: %s\n", step.Prompt)
	fmt.Printf("Response: %s\n", step.Response)
}
```
### Custom Tools
You can extend ReAct modules with custom tools:
```go
type WeatherTool struct{}

func (t *WeatherTool) GetName() string {
	return "weather"
}

func (t *WeatherTool) GetDescription() string {
	return "Get the current weather for a location"
}

// CanHandle tells the ReAct module which actions this tool accepts.
func (t *WeatherTool) CanHandle(action string) bool {
	return strings.HasPrefix(action, "weather(")
}

func (t *WeatherTool) Execute(ctx context.Context, action string) (string, error) {
	// parseLocation and fetchWeather are application-specific helpers
	// you would implement yourself.
	location := parseLocation(action)
	weather, err := fetchWeather(location)
	if err != nil {
		return "", err
	}
	return fmt.Sprintf("Weather in %s: %s, %d°C", location, weather.Condition, weather.Temperature), nil
}

react := modules.NewReAct(signature, []core.Tool{&WeatherTool{}})
```
### Streaming Support
Process LLM outputs incrementally as they're generated:
```go
handler := func(chunk string) {
	fmt.Print(chunk)
}
module.SetStreamingHandler(handler)
result, err := module.Process(ctx, inputs)
```
## Examples

Check the examples directory for complete implementations.
## Documentation

For more detailed documentation, see the package documentation.
## License
DSPy-Go is released under the MIT License. See the LICENSE file for details.