
@openrouter/ai-sdk-provider
The [OpenRouter](https://openrouter.ai/) provider for the [Vercel AI SDK](https://sdk.vercel.ai/docs) gives access to over 300 large language models on the OpenRouter chat and completion APIs.
Install the latest release:

```bash
# For pnpm
pnpm add @openrouter/ai-sdk-provider

# For npm
npm install @openrouter/ai-sdk-provider

# For yarn
yarn add @openrouter/ai-sdk-provider
```

For projects still on AI SDK v4, install the `ai-sdk-v4` tagged release instead:

```bash
# For pnpm
pnpm add @openrouter/ai-sdk-provider@ai-sdk-v4

# For npm
npm install @openrouter/ai-sdk-provider@ai-sdk-v4

# For yarn
yarn add @openrouter/ai-sdk-provider@ai-sdk-v4
```
You can import the default provider instance `openrouter` from `@openrouter/ai-sdk-provider`:

```ts
import { openrouter } from '@openrouter/ai-sdk-provider';
```
For example, you can generate text with `generateText`:

```ts
import { openrouter } from '@openrouter/ai-sdk-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: openrouter('openai/gpt-4o'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
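If you need to set the API key or other options explicitly, create your own provider instance with `createOpenRouter`, as the examples below do. A minimal sketch, assuming the key is supplied via a hypothetical `OPENROUTER_API_KEY` environment variable:

```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// The environment variable name here is illustrative; use whatever
// key management your project already has.
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});
```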
The set of supported models changes constantly as OpenRouter adds new models (and deprecates old ones). You can find the latest list of models supported by OpenRouter at [openrouter.ai/models](https://openrouter.ai/models).
You can also find the models that support tool calling by filtering that list for tool support. (Note: the list may contain models that are not compatible with the AI SDK.)
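Because tool support varies by model, it can help to see how tools are wired up through the AI SDK. A minimal sketch, assuming AI SDK v5's `tool` helper and `inputSchema` field (AI SDK v4 uses `parameters` instead), a tool-capable model, and a hypothetical weather lookup:

```ts
import { openrouter } from '@openrouter/ai-sdk-provider';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const result = await generateText({
  // Pick a model that supports tool calling.
  model: openrouter('openai/gpt-4o'),
  tools: {
    getWeather: tool({
      description: 'Get the current weather for a city',
      inputSchema: z.object({ city: z.string() }), // `parameters` in AI SDK v4
      // Hypothetical lookup; replace with a real data source.
      execute: async ({ city }) => ({ city, temperatureC: 21 }),
    }),
  },
  prompt: 'What is the weather in Berlin right now?',
});

// Tool results from the model's first step; enable multi-step calls
// (stopWhen / maxSteps) if you want a final text answer as well.
console.log(result.toolResults);
```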
There are three ways to pass extra body parameters to OpenRouter:
Via the `providerOptions.openrouter` property:
```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
const model = openrouter('anthropic/claude-3.7-sonnet:thinking');

await streamText({
  model,
  messages: [{ role: 'user', content: 'Hello' }],
  providerOptions: {
    openrouter: {
      reasoning: {
        max_tokens: 10,
      },
    },
  },
});
```
Via the `extraBody` property in the model settings:
```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
const model = openrouter('anthropic/claude-3.7-sonnet:thinking', {
  extraBody: {
    reasoning: {
      max_tokens: 10,
    },
  },
});

await streamText({
  model,
  messages: [{ role: 'user', content: 'Hello' }],
});
```
Via the `extraBody` property in the model factory:
```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({
  apiKey: 'your-api-key',
  extraBody: {
    reasoning: {
      max_tokens: 10,
    },
  },
});
const model = openrouter('anthropic/claude-3.7-sonnet:thinking');

await streamText({
  model,
  messages: [{ role: 'user', content: 'Hello' }],
});
```
You can include Anthropic-specific options, such as prompt-caching `cacheControl` directives, directly in your messages when using functions like `streamText`. The OpenRouter provider automatically converts these messages to the correct format internally.
```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
const model = openrouter('anthropic/<supported-caching-model>');

await streamText({
  model,
  messages: [
    {
      role: 'system',
      content:
        'You are a podcast summary assistant. You are detail-oriented and critical about the content.',
    },
    {
      role: 'user',
      content: [
        {
          type: 'text',
          text: 'Given the text body below:',
        },
        {
          type: 'text',
          text: `<LARGE BODY OF TEXT>`,
          providerOptions: {
            openrouter: {
              cacheControl: { type: 'ephemeral' },
            },
          },
        },
        {
          type: 'text',
          text: 'List the speakers?',
        },
      ],
    },
  ],
});
```
The provider supports OpenRouter usage accounting, which allows you to track token usage details directly in your API responses, without making additional API calls.
```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { generateText } from 'ai';

const openrouter = createOpenRouter({ apiKey: 'your-api-key' });

// Enable usage accounting
const model = openrouter('openai/gpt-3.5-turbo', {
  usage: {
    include: true,
  },
});

// Access usage accounting data
const result = await generateText({
  model,
  prompt: 'Hello, how are you today?',
});

// Provider-specific usage details (available in providerMetadata)
if (result.providerMetadata?.openrouter?.usage) {
  console.log('Cost:', result.providerMetadata.openrouter.usage.cost);
  console.log(
    'Total Tokens:',
    result.providerMetadata.openrouter.usage.totalTokens,
  );
}
```
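Usage accounting works with streaming as well. A minimal sketch, assuming the same `usage: { include: true }` model setting and that the provider metadata is read from `streamText`'s `onFinish` callback:

```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
const model = openrouter('openai/gpt-3.5-turbo', {
  usage: { include: true },
});

const result = await streamText({
  model,
  prompt: 'Hello, how are you today?',
  // Provider metadata is delivered once the stream has finished.
  onFinish: ({ providerMetadata }) => {
    const usage = providerMetadata?.openrouter?.usage;
    if (usage) {
      console.log('Cost:', usage.cost);
    }
  },
});

// Consume the stream so the request runs to completion.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```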