ollama-test

An experimental library for the Ollama API

  • Version: 0.4.4 (unpublished, latest)
  • Weekly downloads: 0
  • Maintainers: 1

ollama

Interface with an ollama instance over HTTP.

Table of Contents

  • Install
  • Usage
  • API
  • Building
  • Testing

Install

npm i ollama

Usage

import { Ollama } from "ollama";

const ollama = new Ollama();

for await (const token of ollama.generate("llama2", "What is a llama?")) {
	process.stdout.write(token);
}

API

The API aims to mirror the HTTP API for Ollama.

Ollama

new Ollama(config);
  • config <Object> The configuration object for Ollama.
    • address <string> The Ollama API address. Default: "http://localhost:11434".

Create a new API handler for ollama.

generate

ollama.generate(model, prompt, [options]);
  • model <string> The name of the model to use for the prompt.
  • prompt <string> The prompt to give the model.
  • options <GenerateOptions> Optional options to pass to the model.
    • parameters <ModelParameters> Model Parameters.
    • context <number[]> Context returned from previous calls.
    • template <string> Override the default template.
    • system <string> Override the default system string.
  • Returns: <AsyncGenerator<string, GenerateResult>> A generator that outputs the tokens as strings.

Generate a response for a given prompt with a provided model. The final response object will include statistics and additional data from the request.
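
Note that `for await` (as in the Usage section) discards a generator's return value, so retrieving the final GenerateResult takes a manual loop over `next()`. A minimal sketch, using a stubbed generator in place of a live `ollama.generate` call; the token strings and the `context` field on the result object are invented for illustration:

```javascript
// Stub standing in for ollama.generate("llama2", "What is a llama?"):
// yields tokens as strings, then returns a GenerateResult-like object.
async function* generateStub() {
  yield "A llama ";
  yield "is a camelid.";
  return { context: [1, 2, 3] };
}

// Drain the generator: accumulate tokens, then capture the final
// return value, which a plain `for await` loop would discard.
async function collect(gen) {
  let text = "";
  while (true) {
    const { value, done } = await gen.next();
    if (done) return { text, result: value };
    text += value;
  }
}

collect(generateStub()).then(({ text, result }) => {
  console.log(text);           // "A llama is a camelid."
  console.log(result.context); // pass back as options.context to continue
});
```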

create

ollama.create(name, path);
  • name <string> The name of the model.
  • path <string> The path to the Modelfile.
  • Returns: <AsyncGenerator<CreateStatus>> A generator that outputs the status of creation.

Create a model from a Modelfile.

tags

ollama.tags();
  • Returns: <Promise<Tag[]>> A list of tags.

List models that are available locally.
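
A sketch of listing local models. The client is stubbed here rather than constructed from the library, and the `name` field on Tag is an assumption for illustration, not a documented shape:

```javascript
// Stub standing in for an Ollama client; tags() resolves to a list
// of Tag objects (a hypothetical `name` field is used below).
const ollamaStub = {
  async tags() {
    return [{ name: "llama2:latest" }, { name: "codellama:7b" }];
  },
};

(async () => {
  const tags = await ollamaStub.tags();
  for (const tag of tags) {
    console.log(tag.name); // one locally available model per line
  }
})();
```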

copy

ollama.copy(source, destination);
  • source <string> The name of the model to copy.
  • destination <string> The name of the copied model.
  • Returns: <Promise<void>>

Copy a model. Creates a model with another name from an existing model.

delete

ollama.delete(model);
  • model <string> The name of the model to delete.
  • Returns: <Promise<void>>

Delete a model and its data.

pull

ollama.pull(name);
  • name <string> The name of the model to download.
  • Returns: <AsyncGenerator<PullResult>> A generator that outputs the status of the download.

Download a model from the model registry. Cancelled pulls resume from where they left off, and multiple calls to pull will share the same download progress.
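
Because pull streams status updates, progress can be reported as it arrives. A sketch of the consuming loop, with a stubbed generator standing in for `ollama.pull("llama2")`; the `status` field on PullResult is a hypothetical placeholder, not a documented shape:

```javascript
// Stub standing in for ollama.pull("llama2"): yields PullResult-like
// objects with a hypothetical `status` field.
async function* pullStub() {
  yield { status: "downloading" };
  yield { status: "verifying" };
  yield { status: "done" };
}

(async () => {
  const seen = [];
  for await (const update of pullStub()) {
    seen.push(update.status); // report progress as each update streams in
  }
  console.log(seen.join(" -> ")); // "downloading -> verifying -> done"
})();
```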

embeddings

ollama.embeddings(model, prompt, [parameters]);
  • model <string> The name of the model to generate embeddings for.
  • prompt <string> The prompt to generate embeddings with.
  • parameters <ModelParameters> Model Parameters.
  • Returns: <Promise<number[]>> The embeddings.

Generate embeddings from a model.
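
Since embeddings are plain number arrays, downstream similarity math needs no library support. A sketch comparing two embeddings with cosine similarity; the fixed vectors stand in for real `ollama.embeddings` output, and the helper is not part of the library:

```javascript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// In practice the vectors would come from e.g.:
//   const a = await ollama.embeddings("llama2", "first text");
// Fixed vectors are used here for illustration.
const a = [1, 0, 1];
const b = [1, 0, 1];
console.log(cosineSimilarity(a, b)); // close to 1 for similar vectors
```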

Building

To build the project files, run:

npm run build

Testing

To lint files:

npm run lint

Package last updated on 09 Jan 2024
