ollama-ai-provider
Vercel AI Provider for running Large Language Models locally using Ollama
Note: This module is under active development and may contain errors and frequent breaking changes.
The Ollama provider is available in the `ollama-ai-provider` module. You can install it with:

```bash
npm i ollama-ai-provider
```
You can import the default provider instance `ollama` from `ollama-ai-provider`:

```ts
import { ollama } from 'ollama-ai-provider';
```
If you need a customized setup, you can import `createOllama` from `ollama-ai-provider` and create a provider instance with your settings:

```ts
import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({
  // custom settings
});
```
You can use the following optional settings to customize the Ollama provider instance:

- **baseURL** _string_

  Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is `http://localhost:11434/api`.

- **headers** _Record&lt;string,string&gt;_

  Custom headers to include in the requests.
You create a model by calling the provider instance. The first argument is the model id, e.g. `phi3`:

```ts
const model = ollama('phi3');
```
This provider can generate and stream both text and objects. It does not support image input or function calling (tools). Object generation may fail depending on the model and the schema used.

It has been verified to work with at least the following models:
| Model | Image input | Object generation | Tool usage | Tool streaming |
| --- | --- | --- | --- | --- |
| llama2 | :x: | :white_check_mark: | :x: | :x: |
| llama3 | :x: | :white_check_mark: | :x: | :x: |
| llava | :x: | :white_check_mark: | :x: | :x: |
| mistral | :x: | :white_check_mark: | :x: | :x: |
| mixtral | :x: | :white_check_mark: | :x: | :x: |
| openhermes | :x: | :white_check_mark: | :x: | :x: |
| phi3 | :x: | :white_check_mark: | :x: | :x: |
FAQs
Vercel AI Provider for running LLMs locally using Ollama
The npm package ollama-ai-provider receives a total of 30,825 weekly downloads, so its popularity is classified as popular. It has demonstrated a healthy version release cadence and project activity: the last version was released less than a year ago. It has 0 open source maintainers collaborating on the project.