👾 OpenLLM Client


OpenLLM Client: a library for interacting with an OpenLLM HTTP/gRPC server, or any BentoML server.
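
The client is published on PyPI as openllm-client and is also pulled in as a dependency of the main openllm package. A typical standalone install (assuming pip) looks like:

pip install openllm-client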

📖 Introduction

With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI applications.

To learn more about OpenLLM, see OpenLLM's README.md.

This package holds the underlying client implementation for OpenLLM. If you are already using OpenLLM, the client is available as openllm.client.

import openllm

# Connect to a running OpenLLM server (assumed here to be serving at http://localhost:3000).
client = openllm.client.HTTPClient('http://localhost:3000')
client.query('Explain to me the difference between "further" and "farther"')
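
The package also exposes an asynchronous client for non-blocking use. The sketch below is a minimal example, assuming openllm.client exposes an AsyncHTTPClient that mirrors HTTPClient's constructor and query interface and that a server is running at http://localhost:3000; the client API has shifted across 0.x releases, so verify against your installed version.

import asyncio
import openllm

async def main():
    # Assumption: AsyncHTTPClient mirrors HTTPClient but returns awaitables.
    client = openllm.client.AsyncHTTPClient('http://localhost:3000')
    result = await client.query('Explain to me the difference between "further" and "farther"')
    print(result)

asyncio.run(main())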

(Demo GIFs: OpenLLM intro; agent integration.)

📔 Citation

If you use OpenLLM in your research, please cite it as follows:

@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}
