multiagent-consensus — A framework for running multi-agent consensus processes using multiple LLMs
Source: npm · Latest version: 0.1.0 · Maintainers: 1
Multiagent Consensus


A framework for running multi-agent consensus processes using multiple Large Language Models (LLMs). This library enables a "jury" of AI models to debate and reach consensus on queries, providing more robust and balanced responses.

Features

  • 🤖 Multiple LLM Support: Compatible with various LLM providers through the Vercel AI SDK
  • 🔄 Configurable Consensus Methods: Choose from majority, supermajority (75%), or unanimous agreement
  • 🧠 Multi-round Debates: Models can debate in multiple rounds to refine their thinking
  • 📊 Detailed Results: Get comprehensive metadata including confidence scores and processing time
  • 🧪 Flexible Output: Customize output format (text, JSON) and content detail
  • 🛠️ Highly Configurable: Set bias, system prompts, and customize debate parameters

Installation

npm install multiagent-consensus

Or with yarn:

yarn add multiagent-consensus

Basic Usage

import { ConsensusEngine } from 'multiagent-consensus';

// Create a consensus engine with your configuration
const engine = new ConsensusEngine({
  models: ['claude-3-haiku', 'gpt-4', 'palm-2'],
  consensusMethod: 'majority', // 'majority', 'supermajority', or 'unanimous'
  maxRounds: 2, // Maximum number of debate rounds
  output: {
    includeHistory: true, // Include debate history in result
    includeMetadata: true, // Include metadata in result
    format: 'text', // 'text' or 'json'
  },
});

// Run a consensus process
async function getConsensus() {
  const result = await engine.run('What is the best approach to solve climate change?');

  console.log('Final Answer:', result.answer);

  if (result.history) {
    console.log('Debate History:');
    result.history.forEach((round, i) => {
      console.log(`\nRound ${i + 1}:`);
      round.responses.forEach(response => {
        console.log(`${response.model}: ${response.response}`);
      });
    });
  }

  console.log('\nMetadata:', result.metadata);
}

getConsensus();

API Reference

ConsensusEngine

The main class for running consensus processes.

constructor(config: ConsensusConfig)

ConsensusConfig

| Parameter | Type | Description | Default |
| --- | --- | --- | --- |
| `models` | `string[]` | Array of model identifiers to use | Required |
| `consensusMethod` | `'majority' \| 'supermajority' \| 'unanimous'` | Method to determine consensus | `'majority'` |
| `maxRounds` | `number` | Maximum number of debate rounds | `3` |
| `output` | `OutputConfig` | Output configuration | See below |

OutputConfig

| Parameter | Type | Description | Default |
| --- | --- | --- | --- |
| `includeHistory` | `boolean` | Include full debate history | `false` |
| `includeMetadata` | `boolean` | Include metadata in result | `true` |
| `format` | `'text' \| 'json'` | Output format | `'text'` |

ConsensusResult

The result of a consensus process.

| Property | Type | Description |
| --- | --- | --- |
| `answer` | `string` | The final consensus answer |
| `models` | `string[]` | Models that participated in the debate |
| `metadata` | `ResultMetadata` | Information about the process |
| `history?` | `RoundData[]` | Debate history (if `includeHistory` is `true`) |

ResultMetadata

| Property | Type | Description |
| --- | --- | --- |
| `totalTokens` | `number` | Total tokens used across all models and rounds |
| `processingTimeMs` | `number` | Total processing time in milliseconds |
| `rounds` | `number` | Number of debate rounds conducted |
| `consensusMethod` | `string` | Method used to determine consensus |
| `confidenceScores` | `Record<string, number>` | Self-reported confidence per model |
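For instance, the per-model confidence scores can be aggregated into a single number. The helper below is a sketch, not part of the package API; it only assumes the `Record<string, number>` shape documented above.

```typescript
// Hypothetical helper: average the self-reported confidence scores
// from ResultMetadata.confidenceScores.
function averageConfidence(scores: Record<string, number>): number {
  const values = Object.values(scores);
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}

// Example with made-up scores for two models:
averageConfidence({ 'gpt-4': 0.75, 'claude-3-haiku': 0.25 }); // → 0.5
```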

Consensus Methods

Majority

Requires more than 50% of the models to agree on an answer. This is the most lenient consensus method and works well for straightforward queries.

Supermajority

Requires at least 75% of the models to agree. This provides a more stringent threshold for consensus, useful for complex or controversial topics.

Unanimous

Requires all models to agree completely. This is the strictest form of consensus and may require multiple debate rounds to achieve.
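The three thresholds can be sketched as a vote count over the models' answers. This is an illustrative standalone function, not the library's internal implementation (which must also decide when two free-text answers "agree"); it assumes answers have already been normalized to comparable strings.

```typescript
type ConsensusMethod = 'majority' | 'supermajority' | 'unanimous';

// Hypothetical sketch: does the most common answer meet the
// threshold for the chosen consensus method?
function hasConsensus(answers: string[], method: ConsensusMethod): boolean {
  const counts = new Map<string, number>();
  for (const a of answers) counts.set(a, (counts.get(a) ?? 0) + 1);
  const top = Math.max(...Array.from(counts.values()));
  const share = top / answers.length;
  switch (method) {
    case 'majority':      return share > 0.5;   // more than half agree
    case 'supermajority': return share >= 0.75; // at least 75% agree
    case 'unanimous':     return share === 1;   // every model agrees
  }
}

// With 2 of 3 models agreeing, majority passes but supermajority does not:
hasConsensus(['A', 'A', 'B'], 'majority');      // → true
hasConsensus(['A', 'A', 'B'], 'supermajority'); // → false
```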

Setting Up Environment Variables

To use this package with LLM providers, you'll need to set environment variables for their API keys:

OPENAI_API_KEY=your_openai_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
COHERE_API_KEY=your_cohere_key_here

We recommend using dotenv for local development:

// In your application's entry point
import 'dotenv/config';
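Since a missing key typically surfaces only as a provider error mid-run, it can help to fail fast at startup. The check below is a sketch using Node's `process.env`; the key names match the providers listed above, and the `requiredKeys` list is an assumption you'd trim to the providers you actually use.

```typescript
// Hypothetical fail-fast check for provider API keys before
// constructing the engine. Adjust requiredKeys to your providers.
const requiredKeys = ['OPENAI_API_KEY', 'ANTHROPIC_API_KEY', 'COHERE_API_KEY'];

function missingKeys(env: Record<string, string | undefined>): string[] {
  return requiredKeys.filter(key => !env[key]);
}

const missing = missingKeys(process.env);
if (missing.length > 0) {
  console.warn(`Missing API keys: ${missing.join(', ')}`);
}
```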

Examples

Custom Debate with Specific System Prompts

const engine = new ConsensusEngine({
  models: ['claude-3-sonnet', 'gpt-4', 'gpt-3.5-turbo'],
  consensusMethod: 'supermajority',
  maxRounds: 3,
  modelConfig: {
    'claude-3-sonnet': {
      systemPrompt: 'You are a scientific expert focused on evidence-based reasoning.',
      temperature: 0.5,
    },
    'gpt-4': {
      systemPrompt: 'You are a philosophical thinker who considers ethical implications.',
      temperature: 0.7,
    },
    'gpt-3.5-turbo': {
      systemPrompt: 'You are a practical problem-solver focusing on realistic solutions.',
      temperature: 0.3,
    },
  },
  output: {
    includeHistory: true,
    format: 'json',
  },
});

Using Bias Presets

const engine = new ConsensusEngine({
  models: ['claude-3-opus', 'gpt-4', 'llama-3'],
  consensusMethod: 'majority',
  biasPresets: ['centrist', 'progressive', 'conservative'],
  output: {
    includeHistory: true,
  },
});

Running the Examples

The package includes a JavaScript example to demonstrate functionality.

As a Package Consumer

When you've installed the published package as a dependency in your project:

# Install the package
npm install multiagent-consensus

# Copy the example file to your project
# Run the JavaScript example
node simple-consensus.js

As a Package Developer

When developing the package itself:

# From the package directory
npm run build         # Build the package first - this creates the dist directory
npm run example       # Run the JavaScript example

The build step is crucial as it compiles the TypeScript source files into JavaScript in the dist directory. The example imports code from this directory, so if you make changes to the source files, you'll need to rebuild the package before running the example again.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Keywords

llm


Package last updated on 28 Mar 2025
