@ai-sdk/mistral


  • Version: 0.0.9
  • Weekly downloads: 46K (decreased by 20.49%)
  • Maintainers: 2

Vercel AI SDK - Mistral Provider

The Mistral provider for the Vercel AI SDK contains language model support for the Mistral chat API. It creates language model objects that can be used with the generateText, streamText, generateObject, and streamObject AI functions.

Setup

The Mistral provider is available in the @ai-sdk/mistral module. You can install it with

npm i @ai-sdk/mistral

Provider Instance

You can import the default provider instance mistral from @ai-sdk/mistral:

import { mistral } from '@ai-sdk/mistral';

If you need a customized setup, you can import createMistral from @ai-sdk/mistral and create a provider instance with your settings:

import { createMistral } from '@ai-sdk/mistral';

const mistral = createMistral({
  // custom settings
});

You can use the following optional settings to customize the Mistral provider instance:

  • baseURL string

    Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is https://api.mistral.ai/v1.

  • apiKey string

    API key that is sent using the Authorization header. It defaults to the MISTRAL_API_KEY environment variable.

  • headers Record<string,string>

    Custom headers to include in the requests.
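
For example, a customized provider instance that routes calls through a proxy and adds a custom header might look like the following sketch (the proxy URL, environment variable, and header values are illustrative placeholders, not part of the package):

import { createMistral } from '@ai-sdk/mistral';

// Hypothetical setup: all values below are placeholders.
const mistral = createMistral({
  // send requests through an internal proxy instead of https://api.mistral.ai/v1
  baseURL: 'https://my-proxy.example.com/v1',
  // read the key from a custom environment variable;
  // if omitted, the MISTRAL_API_KEY environment variable is used
  apiKey: process.env.MY_MISTRAL_API_KEY,
  // extra headers sent with every request
  headers: {
    'x-request-source': 'docs-example',
  },
});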

Models

You can create models that call the Mistral chat API using a provider instance. The first argument is the model id, e.g. mistral-large-latest. Some Mistral chat models support tool calls.

const model = mistral('mistral-large-latest');
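
As a usage sketch (assuming the ai package, which exports the generateText function, is installed alongside this provider), the resulting model object can be passed directly to the AI functions mentioned above:

import { generateText } from 'ai';
import { mistral } from '@ai-sdk/mistral';

const model = mistral('mistral-large-latest');

// generate a single completion for a plain text prompt
const { text } = await generateText({
  model,
  prompt: 'Write a one-sentence summary of the Mistral chat API.',
});

console.log(text);

The same model object works with streamText, generateObject, and streamObject.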

Mistral chat models also support additional model settings that are not part of the standard call settings. You can pass them as an options argument:

const model = mistral('mistral-large-latest', {
  safePrompt: true, // optional safety prompt injection
});

The following optional settings are available for Mistral models:

  • safePrompt boolean

    Whether to inject a safety prompt before all conversations.

    Defaults to false.
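
As noted above, some Mistral chat models support tool calls. The sketch below shows one hedged way to pass a tool to generateText; it assumes a version of the ai package in which a tool is a plain object with a description, a zod parameters schema, and an execute callback (the weather tool itself is a made-up example):

import { generateText } from 'ai';
import { mistral } from '@ai-sdk/mistral';
import { z } from 'zod';

const { text, toolResults } = await generateText({
  model: mistral('mistral-large-latest'),
  tools: {
    // hypothetical tool: returns a hard-coded temperature for any city
    getWeather: {
      description: 'Get the current temperature for a city.',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, temperatureC: 21 }),
    },
  },
  prompt: 'What is the weather in Paris right now?',
});

console.log(toolResults);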

Package last updated on 08 May 2024
