Conexus - The Agentic Context Engine
Version: 0.0.5 (Phase 5 - Integration & Documentation)
Status: Active Development
Go Version: 1.23.4

Overview
Conexus is an agentic context engine that turns Large Language Models (LLMs) into expert engineering assistants. It provides a multi-agent system for analyzing codebases, with built-in validation, profiling, and workflow orchestration.
Key Features
- Multi-Agent Architecture: Specialized agents for locating and analyzing code
- MCP Integration: First-class Model Context Protocol support for AI assistants
- Evidence-Backed Validation: 100% evidence traceability for all agent outputs
- Performance Profiling: Real-time metrics and bottleneck detection
- Workflow Orchestration: Complex multi-agent workflows with state management
- AGENT_OUTPUT_V1: Standardized JSON schema for agent communication
- Comprehensive Testing: 53+ integration tests with real-world validation
Quick Start
Prerequisites
- Node.js 18+ or Bun (for npm/bunx installation)
- Git
Installation
Option 1: NPM/Bunx (Recommended - Pre-built Binaries)
# Install globally
npm install -g @agentic-conexus/mcp
# Or run on demand without installing
bunx @agentic-conexus/mcp
npx @agentic-conexus/mcp
Note: Pre-built binaries are included for:
- macOS (Intel & Apple Silicon)
- Linux (amd64 & arm64)
- Windows (amd64)
Option 2: From Source (For Development)
git clone https://github.com/ferg-cod3s/conexus.git
cd conexus
go build -o conexus ./cmd/conexus
go test ./...
Basic Usage
# Run the MCP server over stdio with defaults
npx @agentic-conexus/mcp
# Custom database path and debug logging
CONEXUS_DB_PATH=./data/db.sqlite CONEXUS_LOG_LEVEL=debug npx @agentic-conexus/mcp
# HTTP mode on port 3000
CONEXUS_PORT=3000 npx @agentic-conexus/mcp
MCP Integration
Conexus provides first-class support for the Model Context Protocol (MCP), enabling seamless integration with AI assistants like Claude Desktop and Cursor.
Why Use Conexus with AI Assistants?
- Intelligent Context Retrieval: Search your codebase using natural language
- Precise Results: Vector similarity search + filtering for relevant findings
- Real-time Indexing: Keep your code context fresh and up-to-date
- Built-in Tools: 4 powerful MCP tools for code understanding
Quick MCP Setup (<5 minutes)
Option 1: NPM/Bunx (Recommended for MCP clients)
npm install -g @ferg-cod3s/conexus
bunx @ferg-cod3s/conexus
Configure in your MCP client (OpenCode, Claude Desktop, etc.):
{
  "mcpServers": {
    "conexus": {
      "command": "bunx",
      "args": ["@ferg-cod3s/conexus"],
      "env": {
        "CONEXUS_DB_PATH": "/path/to/your/project/.conexus/db.sqlite"
      }
    }
  }
}
Option 2: Go Install (For development)
go install github.com/ferg-cod3s/conexus/cmd/conexus@latest
# Run in stdio mode (default)
conexus
# Or in HTTP mode on port 3000
CONEXUS_PORT=3000 conexus
Configure for stdio mode (recommended for MCP):
{
  "mcpServers": {
    "conexus": {
      "command": "conexus",
      "env": {
        "CONEXUS_DB_PATH": "/path/to/your/project/.conexus/db.sqlite"
      }
    }
  }
}
Test the integration:
In your MCP client (OpenCode, Claude Desktop, etc.):
You: "Search for HTTP handler functions in this codebase"
AI Assistant: [Uses context.search tool]
Found 5 HTTP handlers:
- HandleRequest in internal/server/handler.go:42-68
- HandleHealth in internal/server/health.go:15-22
...
Environment Variables:
- CONEXUS_DB_PATH: Path to SQLite database (default: ~/.conexus/db.sqlite)
- CONEXUS_LOG_LEVEL: Log level: debug, info, warn, error (default: info)
- CONEXUS_PORT: Run in HTTP mode instead of stdio (for development)
Available MCP Tools
| Tool | Status | Description |
|------|--------|-------------|
| context.search | Fully Implemented | Search code with filters (type, language, file patterns) |
| context.get_related_info | Fully Implemented | Get related files, functions, and context |
| context.index_control | Partial | Indexing operations (status available, reindex planned) |
| context.connector_management | Partial | Data source management (list available, CRUD planned) |
Example Queries
Code Understanding:
"Show me all database query functions"
"Find the authentication middleware implementation"
"What functions handle user registration?"
Bug Investigation:
"Search for error handling in the payment module"
"Find all functions that access the user database"
"Show panic or fatal calls in the codebase"
Feature Development:
"Locate API endpoint handlers"
"Find all struct definitions related to orders"
"Search for configuration loading functions"
Project-Specific Installation
For using Conexus with specific projects, you can configure it to work with your existing codebase structure:
1. Per-Project MCP Server Configuration
Create a project-specific MCP configuration:
{
  "mcpServers": {
    "conexus-myproject": {
      "command": "conexus",
      "args": ["mcp", "--root", "/path/to/your/project"],
      "env": {
        "CONEXUS_LOG_LEVEL": "info",
        "CONEXUS_CONFIG": "/path/to/your/project/conexus.yml"
      }
    }
  }
}
2. Project Configuration File
Create a conexus.yml file in your project root:
project:
  name: "my-project"
  description: "Web application backend"
codebase:
  root: "."
  include_patterns:
    - "**/*.go"
    - "**/*.js"
    - "**/*.ts"
    - "**/*.py"
  exclude_patterns:
    - "**/node_modules/**"
    - "**/vendor/**"
    - "**/dist/**"
    - "**/.git/**"
search:
  max_results: 50
  similarity_threshold: 0.7
  enable_fts: true
indexing:
  auto_reindex: true
  reindex_interval: "1h"
  chunk_size: 1000
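The `similarity_threshold` setting gates vector-search hits by cosine similarity. A minimal sketch of that check (the function here is illustrative, not a Conexus API):

```go
package main

import (
	"fmt"
	"math"
)

// cosine returns the cosine similarity of two equal-length vectors:
// dot(a, b) / (|a| * |b|), in [-1, 1] for nonzero inputs.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	query := []float64{0.2, 0.8, 0.1}  // query embedding (toy values)
	chunk := []float64{0.25, 0.75, 0.05} // indexed chunk embedding
	const threshold = 0.7 // mirrors search.similarity_threshold above
	sim := cosine(query, chunk)
	fmt.Printf("similarity=%.3f keep=%v\n", sim, sim >= threshold)
}
```

Results scoring below the threshold are dropped, which is why raising it yields fewer but more relevant hits.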
3. Docker Integration for Teams
For team environments, use Docker to ensure consistent configuration:
version: '3.8'
services:
  conexus:
    image: conexus:latest
    container_name: conexus-myproject
    restart: unless-stopped
    ports:
      - "3000:3000"
    volumes:
      - ./:/workspace:ro
      - ./data:/data
    environment:
      - CONEXUS_ROOT_PATH=/workspace
      - CONEXUS_LOG_LEVEL=info
      - CONEXUS_CONFIG=/workspace/conexus.yml
    working_dir: /workspace
docker-compose -f docker-compose.conexus.yml up -d
curl http://localhost:3000/health
4. Project Type Examples
Node.js Project:
codebase:
  include_patterns:
    - "**/*.js"
    - "**/*.ts"
    - "**/*.json"
    - "**/*.md"
  exclude_patterns:
    - "**/node_modules/**"
    - "**/coverage/**"
    - "**/dist/**"
Python Project:
codebase:
  include_patterns:
    - "**/*.py"
    - "**/*.md"
    - "**/requirements*.txt"
    - "**/pyproject.toml"
  exclude_patterns:
    - "**/__pycache__/**"
    - "**/venv/**"
    - "**/env/**"
    - "**/.pytest_cache/**"
Go Project:
codebase:
  include_patterns:
    - "**/*.go"
    - "**/go.mod"
    - "**/go.sum"
    - "**/*.md"
  exclude_patterns:
    - "**/vendor/**"
Monorepo:
codebase:
  include_patterns:
    - "packages/**/*.ts"
    - "packages/**/*.js"
    - "apps/**/*.ts"
    - "apps/**/*.js"
  exclude_patterns:
    - "**/node_modules/**"
    - "**/dist/**"
    - "**/build/**"
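These patterns use `**` to match across directory boundaries. A sketch of how such include/exclude lists can be evaluated against a path (this is an illustration of the semantics; Conexus's actual matcher may differ):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// globToRegexp converts a simple glob pattern to a regexp:
// "**/" matches zero or more directories, "**" matches anything,
// and "*" matches anything except a path separator.
func globToRegexp(pattern string) *regexp.Regexp {
	var sb strings.Builder
	sb.WriteString("^")
	i := 0
	for i < len(pattern) {
		switch {
		case strings.HasPrefix(pattern[i:], "**/"):
			sb.WriteString("(?:.*/)?")
			i += 3
		case strings.HasPrefix(pattern[i:], "**"):
			sb.WriteString(".*")
			i += 2
		case pattern[i] == '*':
			sb.WriteString("[^/]*")
			i++
		default:
			sb.WriteString(regexp.QuoteMeta(string(pattern[i])))
			i++
		}
	}
	sb.WriteString("$")
	return regexp.MustCompile(sb.String())
}

// included reports whether path passes the exclude list and
// matches at least one include pattern (excludes win).
func included(path string, include, exclude []string) bool {
	for _, p := range exclude {
		if globToRegexp(p).MatchString(path) {
			return false
		}
	}
	for _, p := range include {
		if globToRegexp(p).MatchString(path) {
			return true
		}
	}
	return false
}

func main() {
	include := []string{"packages/**/*.ts", "apps/**/*.ts"}
	exclude := []string{"**/node_modules/**", "**/dist/**"}
	for _, p := range []string{
		"packages/api/server.ts",
		"packages/api/node_modules/left-pad/index.ts",
	} {
		fmt.Printf("%-45s included=%v\n", p, included(p, include, exclude))
	}
}
```

Note that excludes are checked first, so `**/node_modules/**` filters a file out even when an include pattern would otherwise match it.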
5. Claude Desktop Project Templates
Create reusable templates for different project types:
{
  "mcpServers": {
    "conexus-nodejs": {
      "command": "conexus",
      "args": ["mcp", "--root", "$PROJECT_ROOT"],
      "env": {
        "CONEXUS_CONFIG": "$PROJECT_ROOT/.conexus/nodejs.yml"
      }
    },
    "conexus-python": {
      "command": "conexus",
      "args": ["mcp", "--root", "$PROJECT_ROOT"],
      "env": {
        "CONEXUS_CONFIG": "$PROJECT_ROOT/.conexus/python.yml"
      }
    }
  }
}
Advanced Configuration
For production deployments, custom embedding providers, and advanced search optimization, see the MCP Integration Guide.
Topics covered:
- Custom embedding providers (OpenAI, Anthropic, Ollama, Cohere)
- Vector store backends (SQLite, PostgreSQL, memory)
- Search optimization strategies
- Security configuration (RBAC, API keys, audit logging)
- Troubleshooting common issues
- Multiple instance support (monorepos)
Architecture
High-Level Overview
+-----------------------------------------------------+
|                    Orchestrator                     |
|  +------------+  +------------+  +------------+     |
|  |   Intent   |  |  Workflow  |  |   State    |     |
|  |   Parser   |  |   Engine   |  |  Manager   |     |
|  +------------+  +------------+  +------------+     |
+--------------------------+--------------------------+
                           |
           +---------------+---------------+
           |               |               |
     +-----v-----+   +-----v-----+   +-----v-----+
     |  Locator  |   | Analyzer  |   |  Future   |
     |   Agent   |   |   Agent   |   |  Agents   |
     +-----+-----+   +-----+-----+   +-----------+
           |               |
           +-------+-------+
                   |
       +-----------v-----------+
       |   Validation Layer    |
       |  +---------+ +------+ |
       |  |Evidence | |Schema| |
       |  |Validator| |Valid.| |
       |  +---------+ +------+ |
       +-----------+-----------+
                   |
       +-----------v-----------+
       |    Profiling Layer    |
       |  +---------+ +------+ |
       |  |Collector| |Report| |
       |  +---------+ +------+ |
       +-----------------------+
Core Components
| Component | Description | Status |
|-----------|-------------|--------|
| Orchestrator | Workflow engine, intent parsing, state management | Complete |
| Locator Agent | Find files/functions matching patterns | Complete |
| Analyzer Agent | Extract control flow and data dependencies | Complete |
| Evidence Validator | Verify 100% evidence backing | Complete |
| Schema Validator | Validate AGENT_OUTPUT_V1 format | Complete |
| Profiler | Performance metrics and reporting | Complete |
| Integration Framework | End-to-end testing harness | Complete |
Testing
Test Suite Overview
Conexus has 53 integration tests covering real-world scenarios:
# Run the full suite
go test ./...
# Integration tests only
go test ./internal/testing/integration
# Verbose output
go test -v ./internal/testing/integration
# With coverage
go test -cover ./...
# A single test
go test -run TestLocatorAnalyzerIntegration ./internal/testing/integration
Test Categories
| Category | Tests | Description |
|----------|-------|-------------|
| Framework Tests | 13 | Core test infrastructure |
| Duration Tests | 7 | Performance regression detection |
| E2E Fixture Tests | 4 | Workflow execution with test fixtures |
| Advanced Workflows | 7 | Complex multi-step scenarios |
| Coordination Tests | 5 | Multi-agent communication |
| Real-World Tests | 5 | Actual Conexus source code analysis |
Performance Benchmarks
- Full Test Suite: <1 second
- Single Agent Execution: <50ms
- Multi-Agent Workflow: <100ms
- Real Codebase Analysis: <100ms per file
Documentation
User Guides
Architecture Documentation
Development Resources
Configuration
Agent Configuration
Conexus agents use environment variables for configuration:
export CONEXUS_LOG_LEVEL=debug
export CONEXUS_PROFILE_INTERVAL=100
export CONEXUS_VALIDATE_EVIDENCE=true
export CONEXUS_CACHE_DIR=~/.cache/conexus
Validation Configuration
export CONEXUS_REQUIRE_FULL_EVIDENCE=true
export CONEXUS_SCHEMA_MODE=strict
export CONEXUS_MAX_VALIDATION_ERRORS=10
AGENT_OUTPUT_V1 Schema
All agents produce standardized output following the AGENT_OUTPUT_V1 schema:
{
  "schema_version": "AGENT_OUTPUT_V1",
  "task_description": "Locate all HTTP handler functions",
  "result_summary": "Found 5 HTTP handlers in 3 files",
  "confidence_score": 0.95,
  "items": [
    {
      "type": "function",
      "name": "HandleRequest",
      "file_path": "/internal/server/handler.go",
      "line_start": 42,
      "line_end": 68,
      "evidence_file_path": "/internal/server/handler.go",
      "evidence_line_start": 42,
      "evidence_line_end": 68,
      "classification": "primary",
      "explanation": "HTTP handler implementing request processing logic"
    }
  ],
  "files_examined": ["/internal/server/handler.go"],
  "metadata": {
    "agent_name": "locator",
    "execution_time_ms": 45,
    "timestamp": "2025-01-15T10:30:00Z"
  }
}
Key Requirements:
- 100% Evidence Backing: Every item must have valid file/line references
- Schema Compliance: All required fields must be present
- Confidence Score: Between 0.0 and 1.0
- Structured Items: Typed items with classification
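These requirements can be checked mechanically. The sketch below decodes an AGENT_OUTPUT_V1 document into Go structs and validates the confidence score and evidence fields; the struct tags follow the sample JSON above, but this is an illustration, not Conexus's internal validator.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Item mirrors one entry of the AGENT_OUTPUT_V1 "items" array.
type Item struct {
	Type              string `json:"type"`
	Name              string `json:"name"`
	FilePath          string `json:"file_path"`
	LineStart         int    `json:"line_start"`
	LineEnd           int    `json:"line_end"`
	EvidenceFilePath  string `json:"evidence_file_path"`
	EvidenceLineStart int    `json:"evidence_line_start"`
	EvidenceLineEnd   int    `json:"evidence_line_end"`
}

// AgentOutput covers the fields the key requirements refer to.
type AgentOutput struct {
	SchemaVersion   string  `json:"schema_version"`
	ConfidenceScore float64 `json:"confidence_score"`
	Items           []Item  `json:"items"`
}

// validate enforces the key requirements listed above.
func validate(out AgentOutput) error {
	if out.SchemaVersion != "AGENT_OUTPUT_V1" {
		return fmt.Errorf("unexpected schema_version %q", out.SchemaVersion)
	}
	if out.ConfidenceScore < 0.0 || out.ConfidenceScore > 1.0 {
		return fmt.Errorf("confidence_score out of range: %v", out.ConfidenceScore)
	}
	for _, it := range out.Items {
		if it.EvidenceFilePath == "" || it.EvidenceLineStart <= 0 {
			return fmt.Errorf("item %q lacks evidence backing", it.Name)
		}
	}
	return nil
}

func main() {
	doc := `{"schema_version":"AGENT_OUTPUT_V1","confidence_score":0.95,
	  "items":[{"type":"function","name":"HandleRequest",
	  "file_path":"/internal/server/handler.go","line_start":42,"line_end":68,
	  "evidence_file_path":"/internal/server/handler.go",
	  "evidence_line_start":42,"evidence_line_end":68}]}`
	var out AgentOutput
	if err := json.Unmarshal([]byte(doc), &out); err != nil {
		panic(err)
	}
	if err := validate(out); err != nil {
		panic(err)
	}
	fmt.Println("valid AGENT_OUTPUT_V1 document")
}
```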
See API Reference for complete schema documentation.
Workflow Integration
Overview
Conexus provides a powerful workflow integration system that combines validation, profiling, and quality gates into coordinated multi-agent workflows.
Basic Orchestrator Usage
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/ferg-cod3s/conexus/internal/orchestrator"
	"github.com/ferg-cod3s/conexus/internal/process"
	"github.com/ferg-cod3s/conexus/internal/tool"
	"github.com/ferg-cod3s/conexus/internal/validation/evidence"
)

func main() {
	config := orchestrator.OrchestratorConfig{
		ProcessManager:    process.NewManager(),
		ToolExecutor:      tool.NewExecutor(),
		EvidenceValidator: evidence.NewValidator(false),
		QualityGates:      orchestrator.DefaultQualityGates(),
		EnableProfiling:   true,
	}
	orch := orchestrator.NewWithConfig(config)

	ctx := context.Background()
	// permissions comes from your application's access-control setup
	result, err := orch.HandleRequest(ctx, "find all HTTP handlers", permissions)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("Completed in %v\n", result.Duration)
	fmt.Printf("Evidence coverage: %.1f%%\n", result.Profile.EvidenceCoverage)
}
Quality Gate Presets
Conexus provides three quality gate configurations:
1. Default Quality Gates (Balanced)
config := orchestrator.OrchestratorConfig{
	QualityGates: orchestrator.DefaultQualityGates(),
}
- 100% evidence backing required
- 5-minute max workflow time
- 1-minute max agent execution time
- Blocks on validation failures
2. Relaxed Quality Gates (Development)
config := orchestrator.OrchestratorConfig{
	QualityGates: orchestrator.RelaxedQualityGates(),
}
- 80% evidence coverage minimum
- 10-minute max workflow time
- Allows up to 5 unbacked claims
3. Strict Quality Gates (Production)
config := orchestrator.OrchestratorConfig{
	QualityGates: orchestrator.StrictQualityGates(),
}
- 100% evidence backing enforced
- 2-minute max workflow time
- 30-second max agent execution time
- Blocks on all failures (validation + performance)
Custom Quality Gates
config := orchestrator.OrchestratorConfig{
	QualityGates: &orchestrator.QualityGateConfig{
		RequireEvidenceBacking:    true,
		MinEvidenceCoverage:       95.0,
		AllowUnbackedClaims:       2,
		MaxExecutionTime:          3 * time.Minute,
		MaxAgentExecutionTime:     30 * time.Second,
		BlockOnValidationFailure:  true,
		BlockOnPerformanceFailure: false,
	},
}
Profiling Integration
Enable automatic profiling to capture performance metrics:
config := orchestrator.OrchestratorConfig{
	EnableProfiling: true,
}
result, _ := orch.ExecuteWorkflow(ctx, workflow, permissions)
profile := result.Profile
fmt.Printf("Total duration: %v\n", profile.TotalDuration)
fmt.Printf("Agent time: %v\n", profile.AgentExecutionTime)
fmt.Printf("Validation time: %v\n", profile.ValidationTime)
fmt.Printf("Profiling overhead: %.2f%%\n", profile.ProfilingOverheadPercent)
Validation Integration
Evidence validation is automatically integrated:
validator := evidence.NewValidator(true)
// or: validator := evidence.NewValidator(false)
config := orchestrator.OrchestratorConfig{
	EvidenceValidator: validator,
}
Workflow Reports
Generate comprehensive workflow reports:
result, _ := orch.ExecuteWorkflow(ctx, workflow, permissions)
report := orchestrator.GenerateWorkflowReport(result)
fmt.Println(report.ExecutionSummary)
fmt.Println(report.ValidationReport)
fmt.Println(report.PerformanceReport)
Example report output:
=== Workflow Execution Report ===
Execution Summary:
  Duration: 127ms
  Agents Executed: 2
  Status: Success
Validation Report:
  Evidence Coverage: 100.0%
  Backed Claims: 15
  Unbacked Claims: 0
  Status: Passed
Performance Report:
  Agent Execution: 85ms (66.9%)
  Validation: 12ms (9.4%)
  Profiling Overhead: 1.2%
  Status: Within Limits
Best Practices
- Use Default Gates for Most Cases: Balanced performance and quality
- Enable Profiling in Development: Identify bottlenecks early
- Strict Mode for Production: Maximum confidence in production workflows
- Monitor Profiling Overhead: Keep under 10% for production systems
- Review Validation Reports: Ensure evidence backing meets standards
See Testing Strategy for workflow testing patterns.
Docker Deployment
Quick Start with Docker
# Pull and run the published image
docker pull conexus:latest
docker run -d -p 8080:8080 --name conexus conexus:latest
# Or build locally and run
docker build -t conexus:latest .
docker run -d -p 8080:8080 --name conexus conexus:latest
# Verify the container is healthy
curl http://localhost:8080/health
Docker Compose (Recommended)
Production deployment:
# Start in the background
docker compose up -d
# Follow logs
docker compose logs -f
# Stop and remove containers
docker compose down
# Rebuild and restart
docker compose up -d --build
Development deployment:
docker compose -f docker-compose.yml -f docker-compose.dev.yml up
docker compose -f docker-compose.yml -f docker-compose.dev.yml logs -f
Configuration
Environment Variables:
CONEXUS_HOST=0.0.0.0
CONEXUS_PORT=8080
CONEXUS_DB_PATH=/data/conexus.db
CONEXUS_ROOT_PATH=/data/codebase
CONEXUS_LOG_LEVEL=info
CONEXUS_LOG_FORMAT=json
CONEXUS_EMBEDDING_PROVIDER=openai
CONEXUS_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_API_KEY=sk-...
Volume Mounts:
volumes:
  - ./data:/data
  - /path/to/your/code:/data/codebase:ro
  - ./config.yml:/app/config.yml:ro
Docker Image Details
Multi-stage build:
- Builder: golang:1.24-alpine (CGO enabled for SQLite)
- Runtime: alpine:3.19 (minimal base, ca-certificates + sqlite-libs)
Image specifications:
- Size: ~19.5MB (optimized with multi-stage build)
- User: Non-root conexus:1000
- Port: 8080 (HTTP + MCP over JSON-RPC 2.0)
- Health Check: GET /health every 30s
Security features:
- Non-root execution (UID 1000)
- Static binary (no dynamic linking)
- Minimal attack surface (Alpine base)
- Read-only config option
- Health check monitoring
MCP Server Endpoints
Once running, the service exposes:
HTTP Endpoints:
# Health check
curl http://localhost:8080/health
# Root endpoint
curl http://localhost:8080/
# MCP over JSON-RPC 2.0
curl -X POST http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
MCP Tools:
- context.search - Comprehensive search with filters
- context.get_related_info - File/ticket context retrieval
- context.index_control - Indexing operations
- context.connector_management - Data source management
Production Deployment
With Docker Compose:
services:
  conexus:
    image: conexus:latest
    restart: always
    environment:
      - CONEXUS_LOG_LEVEL=info
      - CONEXUS_LOG_FORMAT=json
    volumes:
      - conexus-data:/data
      - /mnt/codebase:/data/codebase:ro
    ports:
      - "8080:8080"
    healthcheck:
      test: ["CMD", "wget", "--spider", "-q", "http://localhost:8080/health"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 10s
volumes:
  conexus-data:
    driver: local
Deploy:
docker compose -f docker-compose.prod.yml up -d
Monitoring
Check health:
docker compose ps
docker inspect conexus | jq '.[0].State.Health'
docker compose logs -f
curl http://localhost:8080/health
Troubleshooting:
docker compose logs --tail=100
docker compose exec conexus sh
docker compose exec conexus ls -la /data/
docker compose restart
Building from Source
docker build -t conexus:custom .
docker build --build-arg GO_VERSION=1.24 -t conexus:custom .
docker build -t conexus:v0.1.0 -t conexus:latest .
docker tag conexus:latest registry.example.com/conexus:latest
docker push registry.example.com/conexus:latest
Docker Best Practices
- Use Docker Compose for orchestration
- Mount volumes for data persistence
- Configure environment variables for secrets
- Enable health checks for monitoring
- Use named volumes in production
- Check logs regularly with docker compose logs
- Back up the database in /data regularly
- Limit resources with Docker resource constraints if needed
Development Workflow
Project Structure
conexus/
├── cmd/conexus/          # Main entry point
├── internal/
│   ├── agent/            # Agent implementations
│   │   ├── locator/      # File/function locator
│   │   └── analyzer/     # Code analyzer
│   ├── orchestrator/     # Workflow orchestration
│   │   ├── intent/       # Intent parsing
│   │   ├── workflow/     # Workflow engine
│   │   ├── state/        # State management
│   │   └── escalation/   # Error handling
│   ├── validation/       # Validation systems
│   │   ├── evidence/     # Evidence validation
│   │   └── schema/       # Schema validation
│   ├── profiling/        # Performance profiling
│   ├── protocol/         # JSON-RPC protocol
│   └── testing/          # Integration testing
├── pkg/schema/           # Public schemas
├── tests/fixtures/       # Test fixtures
└── docs/                 # Documentation
Adding a New Agent
See Contributing Guide for details.
Current Status
Phase 5 Progress (95% Complete)
- Task 5.1: Integration Testing Framework (53 tests passing, complete)
- Task 5.2: Documentation Updates (in progress)
- Task 5.3: Workflow Integration (pending)
- Task 5.4: Protocol Tests (optional)
Test Results
- All 53 integration tests passing
- Execution time: <1 second
- Evidence validation: 100%
- Schema compliance: 100%
- Real-world analysis: 5 scenarios validated
See PHASE5-STATUS.md for detailed status.
Roadmap
Phase 6: Optimization (Planned)
- Advanced caching strategies
- Parallel agent execution
- Performance optimization
- Memory usage reduction
Phase 7: Production Readiness (Planned)
- CLI enhancements
- Configuration management
- Deployment automation
- Monitoring dashboards
Future Agents (Planned)
- Pattern recognition agent
- Thoughts analyzer agent
- Dependency analyzer agent
- Security audit agent
Contributing
We welcome contributions! Please see:
Quick Contribution Checklist
License
This project is licensed under the MIT License - see LICENSE for details.
Acknowledgments
Support & Contact
Related Projects
- MCP - Model Context Protocol specification
- Claude Code - AI-powered development assistant
- OpenCode - Open-source AI coding tools
Built with ❤️ by the Conexus team