lollms-client

A client library for LoLLMs generate endpoint

  • 0.8.2
  • PyPI

Maintainers
1

lollms_client


Welcome to the lollms_client repository! This library is built by ParisNeo and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on PyPI and distributed under the Apache 2.0 License.

Installation

To install the library from PyPI using pip, run:

pip install lollms-client

Usage

To use the lollms_client, first import the necessary classes:

from lollms_client import LollmsClient, ELF_GENERATION_FORMAT

# Initialize the LollmsClient instance. By default it uses the local lollms
# service at http://localhost:9600
lc = LollmsClient()
# You can also point it at a different host and port
lc = LollmsClient("http://some.other.server:9600")
# You can also use a local or remote ollama server
lc = LollmsClient(model_name="mistral-nemo:latest", default_generation_mode=ELF_GENERATION_FORMAT.OLLAMA)
# You can also use a local or remote openai server (either set your key as an
# environment variable or pass it here)
lc = LollmsClient(model_name="gpt-3.5-turbo-0125", default_generation_mode=ELF_GENERATION_FORMAT.OPENAI)
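Since the host is just a constructor argument, one common pattern is to resolve it from the environment. This is a hypothetical sketch, not part of lollms_client; the LOLLMS_HOST variable name is an assumption, and the fallback is the default localhost service shown above.

```python
import os

def lollms_host():
    # Hypothetical helper (not part of lollms_client): read the server URL
    # from an environment variable, falling back to the default localhost
    # service that LollmsClient() uses.
    return os.environ.get("LOLLMS_HOST", "http://localhost:9600")

# Usage: lc = LollmsClient(lollms_host())
```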

Text Generation

Use the generate() method to generate text through the lollms API.

response = lc.generate(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
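Calls to a remote generation endpoint can fail transiently, so it is common to retry them. A minimal sketch of such a wrapper follows; it is a hypothetical helper, not part of lollms_client, and simply wraps any callable with generate()'s keyword signature (such as lc.generate).

```python
import time

def generate_with_retry(gen_fn, prompt, retries=3, delay=1.0):
    # Hypothetical convenience wrapper, not part of lollms_client.
    # Calls gen_fn (e.g. lc.generate) up to `retries` times, sleeping
    # between attempts, and re-raises the last error if all attempts fail.
    last_err = None
    for _ in range(retries):
        try:
            return gen_fn(prompt=prompt, stream=False, temperature=0.5)
        except Exception as err:  # a real wrapper would narrow this
            last_err = err
            time.sleep(delay)
    raise last_err

# Usage: response = generate_with_retry(lc.generate, "Once upon a time")
```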

List Mounted Personalities (only on lollms)

List mounted personalities of the lollms API with the listMountedPersonalities() method.

response = lc.listMountedPersonalities()
print(response)

List Models

List available models of the lollms API with the listModels() method.

response = lc.listModels()
print(response)

Complete Example

from lollms_client import LollmsClient

# Initialize the LollmsClient instance
lc = LollmsClient()

# Generate Text
response = lc.generate(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)

# List Mounted Personalities
response = lc.listMountedPersonalities()
print(response)

# List Models
response = lc.listModels()
print(response)
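The three calls above can also be bundled into one pass over the service. The helper below is a hypothetical sketch, not part of lollms_client; it works with any object exposing the same three methods, which also makes it easy to test with a stub.

```python
def summarize_service(client):
    # Hypothetical helper (not part of lollms_client): run the three common
    # queries shown above against any client exposing generate(),
    # listMountedPersonalities(), and listModels().
    return {
        "sample": client.generate(prompt="Once upon a time", stream=False, temperature=0.5),
        "personalities": client.listMountedPersonalities(),
        "models": client.listModels(),
    }

# Usage: print(summarize_service(LollmsClient()))
```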

Feel free to contribute to the project by submitting issues or pull requests. Follow ParisNeo on GitHub, Twitter, Discord, Reddit, and Instagram for updates and news.

Happy coding!
