openai-http-proxy

LLM inference proxy server

Version 0.0.2 · PyPI · 1 maintainer

LM-Proxy

LM-Proxy is an OpenAI-compatible HTTP proxy server for running inference on various LLMs. It can work with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, and more.
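Because the proxy speaks the OpenAI wire format, any OpenAI-compatible client can target it simply by overriding the base URL. A minimal sketch of what such a request looks like, using only the standard library; the host, port, and `/v1` path prefix here are assumptions for illustration, not documented defaults of LM-Proxy:

```python
import json
from urllib.request import Request

# Hypothetical proxy endpoint; adjust to wherever your LM-Proxy instance listens.
PROXY_BASE = "http://localhost:8000/v1"

def build_chat_request(model: str, prompt: str) -> Request:
    """Build an OpenAI-style chat-completions request aimed at the proxy."""
    payload = {
        "model": model,  # the proxy decides which backend serves this model
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        f"{PROXY_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("gpt-4o-mini", "Hello!")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
```

The same request shape works with the official `openai` Python client by passing `base_url=PROXY_BASE` when constructing the client.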

Development Status: still in early development; bookmark it and check back later.

✨ Features

  • @todo

🚀 Quickstart

# Install LM-Proxy via pip
pip install openai-http-proxy

🤝 Contributing

We ❤️ contributions! See CONTRIBUTING.md.

📝 License

Licensed under the MIT License.

© 2022—2025 Vitalii Stepanenko
