
An LLM Proxy that allows the user to interact with different language models from different providers using unified request and response formats.
llm-proxy is a TypeScript library that provides a unified interface for interacting with multiple large language model (LLM) providers, such as OpenAI and Anthropic. The library simplifies cross-provider communication by standardizing input and output formats, allowing users to call different providers with consistent request and response structures. It also supports both streaming and non-streaming responses.
Unified Interface: Send chat completion requests in a consistent format, regardless of the underlying LLM provider.
Automatic Provider Detection: The library determines the appropriate provider (OpenAI, Anthropic, etc.) based on the model specified in the request.
Streamed and Non-Streamed Responses: Separate functions handle streaming and non-streaming responses, giving flexibility in response handling.
Modular Design: Includes distinct middleware and service layers for handling provider-specific logic and request formatting.
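To illustrate the automatic provider detection feature, here is a minimal, self-contained sketch of how a model name might be mapped to a provider. The function name and model prefixes are assumptions for illustration, not the library's actual internals.

```typescript
// Hypothetical sketch of model-based provider detection.
type Provider = "openai" | "anthropic";

function detectProvider(model: string): Provider {
  // OpenAI models typically start with "gpt-" or "o1"; Anthropic models with "claude-".
  if (model.startsWith("gpt-") || model.startsWith("o1")) return "openai";
  if (model.startsWith("claude-")) return "anthropic";
  throw new Error(`Cannot determine provider for model: ${model}`);
}
```

With this approach, callers never specify a provider explicitly; the model string in the request is enough to route it.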
Install llm-proxy via npm:
npm install llm-proxy
Usage
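Pending official usage documentation, here is a minimal sketch of what a unified, OpenAI-style chat request might look like. The interface and field names are assumptions based on the description above, not the package's verified API; consult the repository for the exact types.

```typescript
// Hypothetical unified chat request shape (assumed, OpenAI-style).
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface UnifiedChatRequest {
  model: string;          // provider is inferred from this, e.g. "gpt-4o" or "claude-3-opus-..."
  messages: ChatMessage[];
  stream?: boolean;       // opt in to a streamed response
}

// The same request shape would be used regardless of the target provider.
const request: UnifiedChatRequest = {
  model: "claude-3-opus-20240229",
  messages: [{ role: "user", content: "Summarize this repository." }],
};
```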
Workflow Overview
User Request: The user sends a chat completion request in a unified format. The request is passed to the llm-proxy.
Middleware Layer: Detects the target provider from the requested model and adapts the unified request into that provider's format before forwarding it to the provider-specific service.
Response Handling: The OutputFormatAdapter transforms the provider-specific response back into the unified format.
Return Response: The final response is returned to the user in the unified format.
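The response-handling step above can be sketched as an adapter that maps a provider-specific payload into the unified shape. The types below are deliberately simplified assumptions (a trimmed Anthropic-style response and a made-up unified shape), not the library's actual OutputFormatAdapter.

```typescript
// Simplified Anthropic-style response body (assumed shape).
interface AnthropicResponse {
  model: string;
  content: { type: "text"; text: string }[];
}

// Hypothetical unified response shape returned to the caller.
interface UnifiedResponse {
  model: string;
  content: string;
}

// Flatten Anthropic's content blocks into a single string, matching the
// unified format regardless of which provider produced the response.
function adaptAnthropicResponse(res: AnthropicResponse): UnifiedResponse {
  return {
    model: res.model,
    content: res.content.map((block) => block.text).join(""),
  };
}
```

One adapter per provider keeps provider-specific quirks out of the caller's code: the caller only ever sees the unified shape.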
Below is a flow diagram illustrating how llm-proxy processes requests.
Contributions are welcome! Please follow the standard GitHub flow for submitting issues and pull requests.
This project is licensed under the MIT License.