Durable-LLM

Durable-LLM is a Ruby gem providing a unified interface for interacting with multiple Large Language Model APIs. It simplifies the integration of AI capabilities into Ruby applications by offering a consistent way to access various LLM providers.

Installation

Add this line to your application's Gemfile:

gem 'durable-llm'

And then execute:

$ bundle install

Or install it yourself as:

$ gem install durable-llm

Usage

Here's a basic example of how to use Durable-LLM:

require 'durable-llm'

client = Durable::Llm::Client.new(:openai, api_key: 'your-api-key')

response = client.completion(
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Hello, how are you?' }]
)

puts response.choices.first.message.content

Features

  • Unified interface for multiple LLM providers
  • Consistent input/output format across different models
  • Error handling and retries
  • Streaming support (see the sketch after this list)
  • Customizable timeout and request options
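
The streaming interface isn't documented in this README, so the sketch below is illustrative only. It assumes a ruby-openai-style stream: proc passed alongside the other completion parameters (the streaming code is adapted from ruby-openai, per the Acknowledgements); the actual method name, parameter, and chunk shape in Durable-LLM may differ.

client = Durable::Llm::Client.new(:openai, api_key: 'your-api-key')

# Hypothetical streaming call: each chunk is handed to the proc as it arrives.
client.completion(
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Tell me a short story.' }],
  stream: proc { |chunk| print chunk } # the chunk shape is an assumption
)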

Supported Providers

  • OpenAI
  • Anthropic
  • Grok
  • Huggingface

Configuration

You can configure Durable-LLM globally or on a per-request basis:

Durable::Llm.configure do |config|
  config.default_provider   = :openai
  config.openai.api_key     = 'your-openai-api-key'
  config.anthropic.api_key  = 'your-anthropic-api-key'
  # Add other provider configurations as needed
end
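
For per-request (strictly, per-client) configuration, options can instead be passed directly when the client is constructed, as in the Usage example above. Only api_key appears in this README; any other per-client options (timeouts, base URLs, and so on) are assumptions to verify against the gem's documentation:

# Per-client configuration: the API key is supplied at construction time,
# taking the place of (or overriding) the global configuration above.
client = Durable::Llm::Client.new(:openai, api_key: 'your-openai-api-key')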

Error Handling

Durable-LLM provides a unified error handling system:

begin
  response = client.completion(model: 'gpt-3.5-turbo', messages: [...])
rescue Durable::Llm::APIError => e
  puts "API Error: #{e.message}"
rescue Durable::Llm::RateLimitError => e
  puts "Rate Limit Exceeded: #{e.message}"
end
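
The feature list mentions retries, but the built-in retry behavior isn't described in this README. If you need explicit control, a small wrapper can retry on the same error classes shown above; this sketch uses only Durable::Llm::RateLimitError and plain Ruby:

# Retry a completion a few times on rate-limit errors, backing off between attempts.
def completion_with_retry(client, attempts: 3, **params)
  tries = 0
  begin
    client.completion(**params)
  rescue Durable::Llm::RateLimitError
    tries += 1
    raise if tries >= attempts
    sleep(2**tries) # simple exponential backoff
    retry
  end
end

response = completion_with_retry(
  client,
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Hello, how are you?' }]
)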

Acknowledgements

Thank you to the lite-llm and llm.datasette.io projects for their hard work, which was invaluable to this project. The dllm command-line tool is patterned after the llm tool, though not as full-featured (yet).

The streaming JSONL code is from the ruby-openai repo; many thanks for their hard work.

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/durableprogramming/durable-llm.

License

The gem is available as open source under the terms of the MIT License.
