LLM Proxy

llm-proxy is a TypeScript library that provides a unified interface for interacting with multiple large language model (LLM) providers, such as OpenAI and Anthropic. The library simplifies cross-provider communication by standardizing input and output formats, allowing users to call different providers with consistent request and response structures. This proxy library also supports both streaming and non-streaming responses.

Features

  • Unified Interface: Send chat completion requests in a consistent format, regardless of the underlying LLM provider.

  • Automatic Provider Detection: The library determines the appropriate provider (OpenAI, Anthropic, etc.) based on the model specified in the request.

  • Streamed and Non-Streamed Responses: Separate functions handle streaming and non-streaming responses, giving flexibility in response handling.

  • Modular Design: Includes distinct middleware and service layers for handling provider-specific logic and request formatting.
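The separate streamed and non-streamed entry points described above can be sketched as follows. This is an illustration only: the function names and the chunk shape are assumptions, not llm-proxy's actual exports, and the provider calls are stubbed.

```typescript
// Sketch only: names and shapes below are assumptions, not llm-proxy's API.
interface UnifiedChunk {
  content: string;
  done: boolean;
}

// Non-streamed style: resolve once with the full text (provider call stubbed).
async function generateResponse(prompt: string): Promise<string> {
  return `echo: ${prompt}`; // stand-in for a real provider round-trip
}

// Streamed style: yield unified chunks as they arrive (stubbed).
async function* generateStream(prompt: string): AsyncGenerator<UnifiedChunk> {
  for (const word of prompt.split(" ")) {
    yield { content: word, done: false };
  }
  yield { content: "", done: true }; // terminal chunk signals completion
}
```

Keeping the two styles as separate functions (rather than one function with a `stream` flag) gives each call site a precise return type: a `Promise<string>` versus an `AsyncGenerator`.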

Installation

Install llm-proxy via npm:

npm install llm-proxy

Usage

Usage description goes here.
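The package's actual API is not documented on this page, so the following is only a sketch of what a unified-format call could look like. `generateLLMResponse`, `UnifiedRequest`, and `UnifiedResponse` are hypothetical names, and the provider round-trip is stubbed.

```typescript
// Hypothetical sketch: the real llm-proxy exports and request shape are
// not shown on this page, so every name here is an assumption.
interface UnifiedRequest {
  model: string; // e.g. "gpt-4o" or "claude-3-opus"; drives provider detection
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

interface UnifiedResponse {
  model: string;
  content: string;
}

// Stand-in for the proxy's entry point; a real call would dispatch to the
// provider inferred from `request.model`.
async function generateLLMResponse(request: UnifiedRequest): Promise<UnifiedResponse> {
  return { model: request.model, content: "stubbed reply" };
}

async function main(): Promise<void> {
  const response = await generateLLMResponse({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(response.content);
}
main();
```

The point of the sketch is the shape of the call: the same request and response structure regardless of which provider the model name resolves to.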

Theory: How It Works

Workflow Overview

  • User Request: The user sends a chat completion request in the unified format, which is passed to llm-proxy.

  • Middleware Layer:
      • ProviderFinder: Identifies the intended provider (e.g., OpenAI, Anthropic) based on the model specified in the request.
      • InputFormatAdapter: Transforms the request from the unified format into the format expected by the identified provider.

  • Service Layer:
      • ClientService: A general service interface that routes the request to the correct provider-specific service.
      • Provider-Specific Services: For example, AwsBedrockAnthropicService and OpenAIService handle the actual API communication with Anthropic via AWS Bedrock or with OpenAI directly.

  • Response Handling:
      • OutputFormatAdapter: Transforms the provider-specific response back into the unified format.

  • Return Response: The final response is returned to the user in the unified format.
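The workflow above can be sketched end to end. The functions below are simplified stand-ins for the library's middleware and service components, not its real implementation; the payload shapes are invented for illustration.

```typescript
// Illustrative sketch of the described pipeline; all names and payload
// shapes are assumptions, not the library's actual exports.
type Provider = "openai" | "anthropic";

// Middleware: pick a provider from the model name.
function providerFinder(model: string): Provider {
  return model.toLowerCase().includes("claude") ? "anthropic" : "openai";
}

// Middleware: adapt the unified request to a provider-specific shape.
function inputFormatAdapter(provider: Provider, prompt: string): object {
  return provider === "anthropic"
    ? { anthropic_prompt: prompt }
    : { messages: [{ role: "user", content: prompt }] };
}

// Service layer: route to a provider-specific service (API call stubbed).
function clientService(provider: Provider, payload: object): { provider: Provider; raw: string } {
  return { provider, raw: JSON.stringify(payload) };
}

// Middleware: convert the provider-specific response back to the unified format.
function outputFormatAdapter(response: { provider: Provider; raw: string }): { content: string } {
  return { content: response.raw };
}

// The full flow: request in unified format -> unified response out.
function proxy(model: string, prompt: string): { content: string } {
  const provider = providerFinder(model);
  const payload = inputFormatAdapter(provider, prompt);
  const raw = clientService(provider, payload);
  return outputFormatAdapter(raw);
}
```

Each stage only knows its neighbors' formats, which is what lets new providers slot in behind the same unified interface.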

Detailed Components

  • Middleware Layer:
      • ProviderFinder: Determines which provider to use based on the model in the request (e.g., "Claude" indicates Anthropic).
      • InputFormatAdapter: Adapts the request from the unified format to the provider's specific format.
      • OutputFormatAdapter: Converts the provider-specific response into the unified format.

  • Service Layer:
      • ClientService: A high-level service that selects the appropriate provider service.
      • AwsBedrockAnthropicService: Handles requests and responses for Anthropic models via AWS Bedrock.
      • OpenAIService: Manages requests and responses for OpenAI models.
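As an illustration of the OutputFormatAdapter idea, the sketch below normalizes two simplified provider response shapes into one unified result. The shapes are loosely modeled on OpenAI's and Anthropic's payloads but are deliberately reduced; the unified result type is an assumption, not llm-proxy's actual format.

```typescript
// Sketch: simplified provider payloads normalized into one assumed format.
interface UnifiedResult {
  content: string;
  stopReason: string;
}

// Reduced stand-in for an OpenAI-style chat completion response.
interface OpenAIShape {
  choices: { message: { content: string }; finish_reason: string }[];
}

// Reduced stand-in for an Anthropic-style messages response.
interface AnthropicShape {
  content: { text: string }[];
  stop_reason: string;
}

function fromOpenAI(r: OpenAIShape): UnifiedResult {
  return { content: r.choices[0].message.content, stopReason: r.choices[0].finish_reason };
}

function fromAnthropic(r: AnthropicShape): UnifiedResult {
  return { content: r.content[0].text, stopReason: r.stop_reason };
}
```

Callers then branch on provider once, at the adapter boundary, instead of threading provider-specific shapes through the rest of the application.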

Architecture Diagram

Below is a flow diagram illustrating how llm-proxy processes requests.

(Figure: LLM Proxy flow diagram)

Contributing

Contributions are welcome! Please follow the standard GitHub flow for submitting issues and pull requests.

License

This project is licensed under the MIT License.

Package last updated on 24 Mar 2025