Vercel AI SDK - OpenAI Provider
The OpenAI provider contains language model support for the OpenAI chat and completion APIs. It creates language model objects that can be used with the generateText, streamText, generateObject, and streamObject AI functions.
Setup
The OpenAI provider is available in the @ai-sdk/openai module. You can install it with:

npm i @ai-sdk/openai
Provider Instance
You can import OpenAI from @ai-sdk/openai and initialize a provider instance with various settings:
import { OpenAI } from '@ai-sdk/openai';

const openai = new OpenAI({
  baseUrl: '', // optional custom base URL, e.g. for proxies
  apiKey: '', // optional API key; defaults to the OPENAI_API_KEY environment variable
  organization: '', // optional OpenAI organization id
});
The AI SDK also provides a shorthand openai import with an OpenAI provider instance that uses the default settings:
import { openai } from '@ai-sdk/openai';
Chat Models
You can create models that call the OpenAI chat API using the .chat() factory method. The first argument is the model id, e.g. gpt-4. The OpenAI chat models support tool calls, and some have multi-modal capabilities.
const model = openai.chat('gpt-3.5-turbo');
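A chat model created this way can be passed to any of the AI functions. A minimal sketch using generateText, assuming the OPENAI_API_KEY environment variable is set (the prompt is illustrative):

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// The shorthand openai instance reads OPENAI_API_KEY from the environment.
const { text } = await generateText({
  model: openai.chat('gpt-3.5-turbo'),
  prompt: 'Invent a new holiday and describe its traditions.',
});

console.log(text);
```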
OpenAI chat models also support some model-specific settings that are not part of the standard call settings. You can pass them as an options argument:
const model = openai.chat('gpt-3.5-turbo', {
  logitBias: {
    '50256': -100,
  },
  user: 'test-user',
});
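These settings work the same way when streaming. A hedged sketch with streamText that prints text deltas as they arrive (the prompt is illustrative):

```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

const { textStream } = await streamText({
  model: openai.chat('gpt-3.5-turbo', { user: 'test-user' }),
  prompt: 'Write a haiku about the sea.',
});

// Print each streamed text delta as it arrives.
for await (const delta of textStream) {
  process.stdout.write(delta);
}
```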
Completion Models
You can create models that call the OpenAI completions API using the .completion() factory method. The first argument is the model id. Currently only gpt-3.5-turbo-instruct is supported.
const model = openai.completion('gpt-3.5-turbo-instruct');
OpenAI completion models also support some model-specific settings that are not part of the standard call settings. You can pass them as an options argument:
const model = openai.completion('gpt-3.5-turbo-instruct', {
  echo: true,
  logitBias: {
    '50256': -100,
  },
  suffix: 'some text',
  user: 'test-user',
});
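Completion models plug into the same AI functions as chat models. A minimal sketch with generateText, again assuming the OPENAI_API_KEY environment variable is set (the prompt is illustrative):

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const { text } = await generateText({
  model: openai.completion('gpt-3.5-turbo-instruct'),
  prompt: 'The first rule of writing documentation is',
});

console.log(text);
```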