

ollama-ai-provider

Vercel AI Provider for running Large Language Models locally using Ollama

Note: This module is under development and may contain errors and frequent incompatible changes.

Installation

The Ollama provider is available in the ollama-ai-provider module. You can install it with:

npm i ollama-ai-provider

Provider Instance

You can import the default provider instance ollama from ollama-ai-provider:

import { ollama } from 'ollama-ai-provider';

If you need a customized setup, you can import createOllama from ollama-ai-provider and create a provider instance with your settings:

import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({
  // custom settings
});

You can use the following optional settings to customize the Ollama provider instance:

  • baseURL string

    Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is http://localhost:11434/api.

  • headers Record<string,string>

    Custom headers to include in the requests.

Models

The first argument is the model id, e.g. phi3.

const model = ollama('phi3');

Tested models and capabilities

This provider can generate and stream both text and objects. It does not support image input or function calling (tools). Object generation may fail depending on the model and the schema used.

It has been verified to work with at least the following models:

| Model      | Image input | Object generation  | Tool usage | Tool streaming |
| ---------- | ----------- | ------------------ | ---------- | -------------- |
| llama2     | :x:         | :white_check_mark: | :x:        | :x:            |
| llama3     | :x:         | :white_check_mark: | :x:        | :x:            |
| llava      | :x:         | :white_check_mark: | :x:        | :x:            |
| mistral    | :x:         | :white_check_mark: | :x:        | :x:            |
| mixtral    | :x:         | :white_check_mark: | :x:        | :x:            |
| openhermes | :x:         | :white_check_mark: | :x:        | :x:            |
| phi3       | :x:         | :white_check_mark: | :x:        | :x:            |


Package last updated on 06 May 2024
