OmniAI::Mistral
A Mistral implementation of the OmniAI APIs.
Installation
gem install omniai-mistral
Usage
Client
A client is set up as follows if ENV['MISTRAL_API_KEY'] exists:
client = OmniAI::Mistral::Client.new
A client may also be passed the following options:
api_key (required - default is ENV['MISTRAL_API_KEY'])
host (optional)
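For example, a client with an explicit key and host might be constructed as follows (the values shown are placeholders, not real credentials):

```ruby
# Construct a client with explicit options instead of relying on
# ENV['MISTRAL_API_KEY']. Both values below are placeholders.
client = OmniAI::Mistral::Client.new(
  api_key: 'sk-...',
  host: '...'
)
```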
Configuration
Global configuration is supported for the following options:
OmniAI::Mistral.configure do |config|
config.api_key = 'sk-...'
config.host = '...'
end
Chat
A chat completion is generated by passing in prompts using any of a variety of formats:
completion = client.chat('Tell me a joke!')
completion.text
completion = client.chat do |prompt|
prompt.system('You are a helpful assistant.')
prompt.user('What is the capital of Canada?')
end
completion.text
Model
model takes an optional string (default is mistral-medium-latest):
completion = client.chat('Provide code for fibonacci', model: OmniAI::Mistral::Chat::Model::CODESTRAL)
completion.text
Mistral API Reference model
Temperature
temperature takes an optional float between 0.0 and 1.0 (default is 0.7):
completion = client.chat('Pick a number between 1 and 5', temperature: 1.0)
completion.text
Mistral API Reference temperature
Stream
stream takes an optional proc to stream responses in real-time chunks instead of waiting for a complete response:
stream = proc do |chunk|
print(chunk.text)
end
client.chat('Be poetic.', stream:)
Mistral API Reference stream
Format
format takes an optional symbol (:json) that sets the response_format to json_object:
completion = client.chat(format: :json) do |prompt|
prompt.system(OmniAI::Chat::JSON_PROMPT)
prompt.user('What is the name of the drummer for the Beatles?')
end
JSON.parse(completion.text)
Mistral API Reference response_format
When using JSON mode you MUST also instruct the model to produce JSON yourself via a system or user message.
Embed
Text can be converted into a vector embedding for similarity comparisons via:
response = client.embed('The quick brown fox jumps over a lazy dog.')
response.embedding
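Embeddings are typically compared with cosine similarity. A minimal sketch in plain Ruby (the cosine_similarity helper is illustrative, not part of the gem):

```ruby
# Cosine similarity between two embedding vectors. This helper is not
# part of OmniAI; it only illustrates how returned embeddings can be compared.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

# Hypothetical usage with two embed calls:
# a = client.embed('The quick brown fox jumps over a lazy dog.').embedding
# b = client.embed('A fast auburn fox leaps over a sleepy dog.').embedding
# cosine_similarity(a, b) # closer to 1.0 means more similar
```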