Package: @dainprotocol/llm (npm)
Version: 0.1.1 · Maintainers: 6

@dainprotocol/llm

LLM adapters for the Dain Protocol Agent SDK.

Overview

This package provides unified adapters for multiple LLM providers, so you can switch between models and providers while keeping a single, consistent interface.

Features

  • Provider-agnostic interface: Write code once, use with any LLM
  • Direct SDK adapters: Native support for Anthropic, OpenAI, and Vercel AI SDK
  • Streaming support: Built-in token streaming for all adapters
  • Tool calling: Unified tool/function calling across providers
  • TypeScript-first: Full type safety and IntelliSense support
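
To illustrate what "provider-agnostic" means in practice, here is a minimal sketch: application code depends only on the adapter interface, so any provider can be swapped in. The `askAny` helper and the inline `echoAdapter` stub are illustrative, not part of the package, and the types are simplified versions of the interfaces documented below.

```typescript
// Simplified shapes mirroring the package's documented interface (assumed).
interface LLMMessage { role: 'system' | 'user' | 'assistant' | 'tool'; content: string; }
interface LLMConfig { model: string; }
interface LLMResponse { content: string; }

interface LLMAdapter {
  provider: string;
  generate(messages: LLMMessage[], config: LLMConfig): Promise<LLMResponse>;
}

// Application code depends only on LLMAdapter, never on a concrete provider.
async function askAny(llm: LLMAdapter, question: string, model: string): Promise<string> {
  const response = await llm.generate([{ role: 'user', content: question }], { model });
  return response.content;
}

// A stub adapter standing in for createAnthropicAdapter / createOpenAIAdapter.
const echoAdapter: LLMAdapter = {
  provider: 'echo',
  async generate(messages, config) {
    return { content: `[${config.model}] ${messages[messages.length - 1].content}` };
  },
};

const answer = await askAny(echoAdapter, 'Hello!', 'echo-1');
console.log(answer); // → "[echo-1] Hello!"
```

Swapping `echoAdapter` for a real adapter changes nothing in `askAny`.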

Installation

pnpm add @dainprotocol/llm

Supported Providers

Anthropic (Claude)

Direct integration with Anthropic's Claude models.

import { createAnthropicAdapter } from '@dainprotocol/llm';

const llm = createAnthropicAdapter({
  apiKey: process.env.ANTHROPIC_API_KEY!,
  defaultModel: 'claude-3-5-sonnet-20241022',
});

const response = await llm.generate(
  [{ role: 'user', content: 'Hello!' }],
  { model: 'claude-3-5-sonnet-20241022' }
);

OpenAI (GPT)

Direct integration with OpenAI's GPT models.

import { createOpenAIAdapter } from '@dainprotocol/llm';

const llm = createOpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY!,
  defaultModel: 'gpt-4-turbo-preview',
});

const response = await llm.generate(
  [{ role: 'user', content: 'Hello!' }],
  { model: 'gpt-4-turbo-preview' }
);

Vercel AI SDK

Universal adapter supporting multiple providers through Vercel AI SDK.

import { createVercelAdapter } from '@dainprotocol/llm';
import { anthropic } from '@ai-sdk/anthropic';
import { openai } from '@ai-sdk/openai';

// Use with Anthropic
const claudeLLM = createVercelAdapter(anthropic('claude-3-5-sonnet-20241022'));

// Use with OpenAI
const gptLLM = createVercelAdapter(openai('gpt-4-turbo'));

const response = await claudeLLM.generate(
  [{ role: 'user', content: 'Hello!' }],
  { model: 'claude-3-5-sonnet-20241022' }
);

API Reference

LLMAdapter Interface

All adapters implement the LLMAdapter interface:

interface LLMAdapter {
  provider: string;

  generate(
    messages: LLMMessage[],
    config: LLMConfig,
    signal?: AbortSignal
  ): Promise<LLMResponse>;

  stream(
    messages: LLMMessage[],
    config: LLMConfig,
    signal?: AbortSignal
  ): AsyncGenerator<LLMStreamChunk>;

  normalizeMessage(message: any): LLMMessage;
  denormalizeMessage(message: LLMMessage): any;
}
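
One way to see how the four methods fit together is a toy adapter that satisfies the interface. The types below are reproduced from the definition above; the `LLMResponse` and `LLMStreamChunk` shapes are simplified assumptions, and the echo behaviour is purely illustrative.

```typescript
// Simplified shapes assumed from the documented interface.
interface LLMMessage { role: 'system' | 'user' | 'assistant' | 'tool'; content: string; }
interface LLMConfig { model: string; }
interface LLMResponse { content: string; finishReason?: string; }
type LLMStreamChunk =
  | { type: 'content'; content: string }
  | { type: 'done'; finishReason: string };

interface LLMAdapter {
  provider: string;
  generate(messages: LLMMessage[], config: LLMConfig, signal?: AbortSignal): Promise<LLMResponse>;
  stream(messages: LLMMessage[], config: LLMConfig, signal?: AbortSignal): AsyncGenerator<LLMStreamChunk>;
  normalizeMessage(message: any): LLMMessage;
  denormalizeMessage(message: LLMMessage): any;
}

class EchoAdapter implements LLMAdapter {
  provider = 'echo';

  // "Generates" by echoing the last message back.
  async generate(messages: LLMMessage[], _config: LLMConfig): Promise<LLMResponse> {
    const last = messages[messages.length - 1];
    return { content: last.content, finishReason: 'stop' };
  }

  // Streams the reply one word at a time, then signals completion.
  async *stream(messages: LLMMessage[], config: LLMConfig): AsyncGenerator<LLMStreamChunk> {
    const { content } = await this.generate(messages, config);
    for (const word of content.split(' ')) {
      yield { type: 'content', content: word + ' ' };
    }
    yield { type: 'done', finishReason: 'stop' };
  }

  // Converts a provider-specific message into the unified format.
  normalizeMessage(message: any): LLMMessage {
    return { role: message.role ?? 'user', content: String(message.content ?? '') };
  }

  // Converts a unified message back into the provider's wire format.
  denormalizeMessage(message: LLMMessage): any {
    return { role: message.role, content: message.content };
  }
}
```

The normalize/denormalize pair is what lets the adapters translate between the unified message format (see below) and each provider's native wire format.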

Streaming

All adapters support streaming:

for await (const chunk of llm.stream(messages, config)) {
  if (chunk.type === 'content') {
    process.stdout.write(chunk.content);
  } else if (chunk.type === 'tool_call') {
    console.log('Tool call:', chunk.toolCall);
  } else if (chunk.type === 'done') {
    console.log('Finish reason:', chunk.finishReason);
  }
}

Tool Calling

Define tools and let the LLM use them:

const response = await llm.generate(
  [{ role: 'user', content: 'What is 25 * 4?' }],
  {
    model: 'claude-3-5-sonnet-20241022',
    tools: [
      {
        name: 'calculator',
        description: 'Perform mathematical calculations',
        parameters: {
          type: 'object',
          properties: {
            expression: { type: 'string' }
          },
          required: ['expression']
        }
      }
    ]
  }
);

if (response.toolCalls) {
  for (const toolCall of response.toolCalls) {
    console.log(`Tool: ${toolCall.name}`, toolCall.arguments);
  }
}
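
The example above stops at inspecting `toolCalls`; the typical next step is to execute the tool and append the result to the conversation as a `role: 'tool'` message before calling `generate` again. The sketch below follows the unified message format documented further down; the `id` field on the tool call is an assumption (the docs show `toolCallId` on result messages, implying calls carry an id), and `runCalculator` is a stand-in for real tool execution.

```typescript
interface ToolCall { id: string; name: string; arguments: Record<string, unknown>; }
interface LLMMessage {
  role: 'system' | 'user' | 'assistant' | 'tool';
  content: string;
  name?: string;
  toolCallId?: string;
  toolCalls?: ToolCall[];
}

// Toy "calculator" tool: handles only "a * b" to avoid eval().
function runCalculator(args: Record<string, unknown>): string {
  const [a, b] = String(args.expression).split('*').map((s) => Number(s.trim()));
  return String(a * b);
}

// Build the role: 'tool' messages to append before the next generate() call.
function toolResultMessages(toolCalls: ToolCall[]): LLMMessage[] {
  return toolCalls.map((call) => ({
    role: 'tool' as const,
    name: call.name,
    toolCallId: call.id,
    content: runCalculator(call.arguments),
  }));
}

const results = toolResultMessages([
  { id: 'call_1', name: 'calculator', arguments: { expression: '25 * 4' } },
]);
console.log(results[0].content); // → "100"
```

Appending these messages (after the assistant message that contained the tool calls) closes the loop, so the model can produce its final answer.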

Configuration Options

interface LLMConfig {
  model: string;
  temperature?: number;           // 0-1, default varies by provider
  maxTokens?: number;             // Maximum tokens to generate
  topP?: number;                  // Nucleus sampling parameter
  frequencyPenalty?: number;      // OpenAI only
  presencePenalty?: number;       // OpenAI only
  stopSequences?: string[];       // Stop generation at these sequences
  tools?: LLMTool[];              // Available tools
  toolChoice?: 'auto' | 'required' | 'none' | { type: 'tool'; name: string };
}
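
A filled-in config might look like the following; the model name and parameter values are illustrative, and the type is reproduced from the interface above with the tool fields omitted for brevity.

```typescript
// Reproduced from the documented LLMConfig (tool fields omitted for brevity).
interface LLMConfig {
  model: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  stopSequences?: string[];
  toolChoice?: 'auto' | 'required' | 'none' | { type: 'tool'; name: string };
}

const config: LLMConfig = {
  model: 'claude-3-5-sonnet-20241022',
  temperature: 0.2,          // lower temperature for more deterministic output
  maxTokens: 1024,           // cap the length of the generated reply
  stopSequences: ['\n\n'],   // stop at the first blank line
  toolChoice: 'auto',        // let the model decide whether to call a tool
};
```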

Message Format

All adapters use a unified message format:

interface LLMMessage {
  role: 'system' | 'user' | 'assistant' | 'tool';
  content: string;
  name?: string;                  // Tool name for tool messages
  toolCallId?: string;            // ID for tool result messages
  toolCalls?: ToolCall[];         // Tool calls from assistant
}
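
Putting the format together, a full conversation turn including a tool round trip might look like this (the `ToolCall` shape and the `call_1` id are illustrative assumptions; the message fields follow the interface above):

```typescript
interface ToolCall { id: string; name: string; arguments: Record<string, unknown>; }
interface LLMMessage {
  role: 'system' | 'user' | 'assistant' | 'tool';
  content: string;
  name?: string;
  toolCallId?: string;
  toolCalls?: ToolCall[];
}

// One complete turn: system prompt, user question, assistant tool call,
// tool result, and the assistant's final answer.
const conversation: LLMMessage[] = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is 25 * 4?' },
  {
    role: 'assistant',
    content: '',
    toolCalls: [{ id: 'call_1', name: 'calculator', arguments: { expression: '25 * 4' } }],
  },
  { role: 'tool', name: 'calculator', toolCallId: 'call_1', content: '100' },
  { role: 'assistant', content: '25 * 4 is 100.' },
];
```

Because every adapter consumes this same array shape, the conversation can be replayed against any provider unchanged.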

License

MIT

Keywords

llm

Package last updated on 14 Oct 2025
