# lollms_client

Welcome to the lollms_client repository! This library is built by ParisNeo and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on PyPI and distributed under the Apache 2.0 License.
## Installation

To install the library from PyPI using pip, run:

```bash
pip install lollms-client
```
## Usage

To use the lollms_client, first import the necessary classes and create a client instance:

```python
from lollms_client import LollmsClient, ELF_GENERATION_FORMAT

# Default client, targeting a local lollms server
lc = LollmsClient()

# Connect to a lollms server at another address
lc = LollmsClient("http://some.other.server:9600")

# Use an Ollama backend
lc = LollmsClient(model_name="mistral-nemo:latest", default_generation_mode=ELF_GENERATION_FORMAT.OLLAMA)

# Use an OpenAI backend
lc = LollmsClient(model_name="gpt-3.5-turbo-0125", default_generation_mode=ELF_GENERATION_FORMAT.OPENAI)
```
## Text Generation

Use the `generate()` method to generate text from the lollms API:

```python
response = lc.generate(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
```
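When streaming is enabled, generated text arrives in chunks rather than as one string. The sketch below shows a chunk-accumulating callback; note that the exact streaming parameter name and callback signature are assumptions not confirmed by this README, so check the library's source before relying on them.

```python
# Sketch of a callback that accumulates streamed chunks into a full response.
# The callback signature lollms_client expects is an assumption here.
chunks = []

def on_chunk(chunk: str) -> bool:
    """Collect each streamed chunk; return True to keep streaming."""
    chunks.append(chunk)
    return True

# With a real client this might look like (hypothetical parameter name):
# lc.generate(prompt="Once upon a time", stream=True, streaming_callback=on_chunk)

# Simulated stream, to show how the callback assembles the final text:
for piece in ["Once ", "upon ", "a ", "time"]:
    on_chunk(piece)

full_text = "".join(chunks)
print(full_text)  # Once upon a time
```

Returning a boolean from the callback gives the caller a natural way to cancel a long generation early.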
## List Mounted Personalities (lollms only)

List the personalities mounted on the lollms server with the `listMountedPersonalities()` method:

```python
response = lc.listMountedPersonalities()
print(response)
```
## List Models

List the models available on the server with the `listModels()` method:

```python
response = lc.listModels()
print(response)
```
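Assuming `listModels()` returns a list of model-name strings (an assumption; the README does not document the return shape), a small helper can narrow the list to a family of models:

```python
def filter_models(models, substring):
    """Return model names containing `substring`, case-insensitively.

    Assumes `models` is a list of name strings, e.g. the (assumed)
    return value of lc.listModels().
    """
    needle = substring.lower()
    return [m for m in models if needle in m.lower()]

# Example with a hypothetical model list:
models = ["mistral-nemo:latest", "gpt-3.5-turbo-0125", "llama3:8b"]
print(filter_models(models, "mistral"))  # ['mistral-nemo:latest']
```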
## Complete Example

```python
from lollms_client import LollmsClient

lc = LollmsClient()

response = lc.generate(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)

response = lc.listMountedPersonalities()
print(response)

response = lc.listModels()
print(response)
```
Feel free to contribute to the project by submitting issues or pull requests. Follow ParisNeo on GitHub, Twitter, Discord, the subreddit, and Instagram for updates and news.
Happy coding!