# Ollama Ruby Library
The Ollama Ruby library provides the easiest way to integrate your Ruby project with Ollama.
## Installation
Install the gem and add to the application's Gemfile by executing:
```shell
bundle add ollama-rb
```
If bundler is not being used to manage dependencies, install the gem by executing:
```shell
gem install ollama-rb
```
## Usage

### Create a Client

```ruby
require "ollama"

# Connects to the local Ollama server by default
ollama = Ollama::Client.new

# Or point at a specific server
ollama = Ollama::Client.new(base_url: "http://localhost:11434")
```
### Generate a chat completion

```ruby
response = ollama.chat.create(
  model: "llama2",
  messages: [
    { role: "user", content: "Why is the sky blue?" }
  ]
)

response.ok?    # => true when the request succeeded
response.result # parsed response body
```
#### Streaming response

```ruby
response = ollama.chat.create(
  model: "llama2",
  messages: [
    { role: "user", content: "Why is the sky blue?" }
  ]
) do |chunk|
  puts chunk
end
```
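When streaming, the server sends the answer as a series of small JSON chunks rather than one response. As a minimal sketch of what that looks like (the chunk strings below are made-up examples assumed to mirror Ollama's `/api/chat` streaming schema, not captured output from the gem), the fragments can be reassembled with plain Ruby:

```ruby
require "json"

# Hypothetical NDJSON chunks, one JSON object per line; the final
# chunk carries "done" => true.
raw_chunks = [
  '{"model":"llama2","message":{"role":"assistant","content":"The sky "},"done":false}',
  '{"model":"llama2","message":{"role":"assistant","content":"scatters blue light."},"done":true}'
]

# Reassemble the full assistant message from the streamed fragments.
full_message = raw_chunks
  .map { |line| JSON.parse(line).dig("message", "content") }
  .join

puts full_message # => "The sky scatters blue light."
```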
### Generate a completion

```ruby
response = ollama.completion.create(model: "llama2", prompt: "hello!")
response.result
```
#### Streaming response

```ruby
ollama.completion.create(model: "llama2", prompt: "hello!") do |chunk|
  puts chunk
end
```
### Create a Model

```ruby
ollama.models.create(name: "mario", modelfile: "FROM llama2\nSYSTEM You are mario from Super Mario Bros.") do |chunk|
  puts chunk
end
```
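For anything beyond a couple of instructions, the modelfile string is easier to write with a heredoc. This is a plain-Ruby sketch of building the same string as above (note that a heredoc adds a trailing newline, which Ollama's modelfile parser tolerates):

```ruby
# Squiggly heredoc strips leading indentation, so the modelfile
# can be laid out readably inside the Ruby source.
modelfile = <<~MODELFILE
  FROM llama2
  SYSTEM You are mario from Super Mario Bros.
MODELFILE

modelfile # => "FROM llama2\nSYSTEM You are mario from Super Mario Bros.\n"
```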
### List Local Models

```ruby
response = ollama.models.list
response.result
```
### Show Model Information

```ruby
response = ollama.models.show("mario")
response.result
# =>
{"license"=>
  "LLAMA 2 COMMUNITY LICENSE AGREEMENT\t\nLlama 2 Version Release Date: July 18, 2023\n\n\"Agreement\" means the terms and conditions for use, reproduction, distribution and \nmodification of the Llama Materials set forth herein.\n\n\"...",
 "modelfile"=>
  "# Modelfile generated by \"ollama show\"\n# To build a new Modelfile based on this one, replace the FROM line with:\n# FROM mario:latest\n\nFROM llama2:latest\nTEMPLATE \"\"\"[INST] <<SYS>>{{ .System }}<</SYS>>\n\n{{ .Prompt }} [/INST]\n\"\"\"\nSYSTEM \"\"\"You are mario from Super Mario Bros.\"\"\"\nPARAMETER stop \"[INST]\"\nPARAMETER stop \"[/INST]\"\nPARAMETER stop \"<<SYS>>\"\nPARAMETER stop \"<</SYS>>\"",
 "parameters"=>
  "stop \"[INST]\"\nstop \"[/INST]\"\nstop \"<<SYS>>\"\nstop \"<</SYS>>\"",
 "template"=>"[INST] <<SYS>>{{ .System }}<</SYS>>\n\n{{ .Prompt }} [/INST]\n",
 "system"=>"You are mario from Super Mario Bros.",
 "details"=>{"parent_model"=>"llama2:latest", "format"=>"gguf", "family"=>"llama", "families"=>["llama"], "parameter_size"=>"7B", "quantization_level"=>"Q4_0"}}
```
### Copy a Model

```ruby
response = ollama.models.copy(source: "llama2")
response = ollama.models.copy(source: "llama2", destination: "llama2-backup")
response.ok?

# Copying a model that does not exist returns an error response
response = ollama.models.copy(source: "non-existence")
response.ok?
response.result
```
### Delete a Model

```ruby
response = ollama.models.delete("llama2-backup")
response.ok?

# Deleting a model that does not exist returns an error response
response = ollama.models.delete("non-existence")
response.ok?
response.result
```
### Pull a Model

```ruby
ollama.models.pull(name: "tinyllama") do |chunk|
  puts chunk
end
```
### Push a Model

You need to create an account at https://ollama.ai and add your Public Key at https://ollama.ai/settings/keys before you can push models to your namespace.

```ruby
ollama.models.copy(source: "mario", destination: "your-namespace/mario")
ollama.models.push(name: "your-namespace/mario") do |chunk|
  puts chunk
end
```
### Generate Embeddings

```ruby
response = ollama.embeddings.create(model: "llama2", prompt: "Hello!")
response.result
# =>
{"embedding"=>
  [1.3464512825012207,
   -1.0983257293701172,
   ...
   -2.2046988010406494, 0.3163630962371826]}
```
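Embeddings are most often used for similarity search. As a plain-Ruby sketch (the two vectors below are tiny made-up examples, not real model output), cosine similarity between two embedding arrays can be computed without any extra dependencies:

```ruby
# Cosine similarity: dot product divided by the product of magnitudes.
# Returns 1.0 for parallel vectors, 0.0 for orthogonal ones.
def cosine_similarity(a, b)
  dot    = a.zip(b).sum { |x, y| x * y }
  norm_a = Math.sqrt(a.sum { |x| x * x })
  norm_b = Math.sqrt(b.sum { |x| x * x })
  dot / (norm_a * norm_b)
end

v1 = [1.0, 2.0, 3.0]
v2 = [2.0, 4.0, 6.0]

cosine_similarity(v1, v2) # => 1.0 (v2 is a scalar multiple of v1)
```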
## Development

After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake test` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to rubygems.org.
## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/songjiz/ollama-rb.
## License

The gem is available as open source under the terms of the MIT License.