Vercel AI SDK - Mistral Provider
The Mistral provider contains language model support for the Mistral chat API. It creates language model objects that can be used with the generateText, streamText, generateObject, and streamObject AI functions.
Setup
The Mistral provider is available in the @ai-sdk/mistral module. You can install it with:
npm i @ai-sdk/mistral
Provider Instance
You can import createMistral from @ai-sdk/mistral and create a provider instance with various settings:
import { createMistral } from '@ai-sdk/mistral';

const mistral = createMistral({
  // optional: use a different URL prefix for API calls, e.g. to reach a proxy
  baseURL: '',
  // optional: defaults to the MISTRAL_API_KEY environment variable
  apiKey: '',
  // optional: custom headers to send with every request
  headers: {
    'custom-header': 'value',
  },
});
The AI SDK also provides a shorthand mistral import with a Mistral provider instance that uses the default settings:
import { mistral } from '@ai-sdk/mistral';
Models
You can create models that call the Mistral chat API using a provider instance. The first argument is the model id, e.g. mistral-large-latest.
Some Mistral chat models support tool calls.
const model = mistral('mistral-large-latest');
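As a sketch of how such a model object is used, here is a minimal generateText call. It assumes the ai package is installed alongside @ai-sdk/mistral, that MISTRAL_API_KEY is set in the environment, and it runs in a context that allows top-level await; the prompt text is illustrative.

```typescript
import { mistral } from '@ai-sdk/mistral';
import { generateText } from 'ai';

// Create a model object and pass it to an AI function.
// The API key is read from the MISTRAL_API_KEY environment variable.
const { text } = await generateText({
  model: mistral('mistral-large-latest'),
  prompt: 'Write a one-sentence summary of the Mistral chat API.',
});

console.log(text);
```

The same model object works with streamText, generateObject, and streamObject.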
Mistral chat models also support additional model settings that are not part of the standard call settings.
You can pass them as an options argument:
const model = mistral('mistral-large-latest', {
safePrompt: true,
});