SingleStoreAI

A module that enhances the @singlestore/client package with AI functionality, allowing you to integrate advanced AI features like embeddings and chat completions.

Table of Contents

  • Installation
  • Usage Examples
      • Create an Instance
      • Generate Embeddings
      • Create a Chat Completion
      • Stream Chat Completions
      • Develop a Chat Completion Tool
      • Custom Chat Completions
      • Custom Embeddings

Installation

To install the @singlestore/ai package, run the following command:

npm install @singlestore/ai

Usage Examples

Create an Instance

First, create an instance of the AI class using your OpenAI API key.

import { AI } from "@singlestore/ai";

const ai = new AI({ openAIApiKey: "<OPENAI_API_KEY>" });
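If you prefer not to hard-code the key, it can be read from an environment variable instead. A minimal sketch, assuming the key is stored in OPENAI_API_KEY:

import { AI } from "@singlestore/ai";

// Read the key from the environment (assumes OPENAI_API_KEY is set).
const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  throw new Error("OPENAI_API_KEY is not set");
}

const ai = new AI({ openAIApiKey: apiKey });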

Generate Embeddings

Generate embeddings for a given input text using the create method.

const input = "Hi!";
const embeddings = await ai.embeddings.create(input);
console.log(embeddings);
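The create signature shown later in the Custom Embeddings section accepts input: string | string[], which suggests multiple inputs can be embedded in one call. A minimal sketch under that assumption:

// Sketch assuming create also accepts an array of inputs, matching the
// create(input: string | string[]) signature shown below.
const inputs = ["What is SingleStore?", "How do vector embeddings work?"];
const batch = await ai.embeddings.create(inputs);
console.log(batch.length); // expected: one embedding per input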

Create a Chat Completion

Create a chat completion for a given prompt using the create method.

const prompt = "Hi, how are you?";
const chatCompletion = await ai.chatCompletions.create({
  prompt,
  model: "gpt-4o",
  systemRole: "You are a helpful assistant",
});
console.log(chatCompletion);

Stream Chat Completions

Stream chat completions to handle responses in real time.

import type { OnChatCompletionChunk } from "@singlestore/ai";

const prompt = "Hi, how are you?";

const stream = await ai.chatCompletions.create({
  prompt,
  model: "gpt-4o",
  systemRole: "You are a helpful assistant",
  stream: true,
});

const onChunk: OnChatCompletionChunk = (chunk) => {
  console.log("onChunk:", chunk);
};

const chatCompletion = await ai.chatCompletions.handleStream(stream, onChunk);
console.log(chatCompletion);
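Instead of logging each chunk, the callback can collect chunks for later use (progress display, logging, and so on). A minimal sketch of a collecting handler that makes no assumptions about the chunk shape; pass it to handleStream in place of onChunk above:

const chunks: unknown[] = [];

const collectChunk: OnChatCompletionChunk = (chunk) => {
  // Keep every chunk as it arrives; inspect or render it later as needed.
  chunks.push(chunk);
};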

Develop a Chat Completion Tool

Create a custom chat completion tool to handle specific tasks.

import { AI, ChatCompletionTool } from "@singlestore/ai";
import { z } from "zod";

const findCityInfoTool = new ChatCompletionTool({
  name: "find_city_info",
  description: "Useful for finding and displaying information about a city.",
  params: z.object({ name: z.string().describe("The city name") }),
  call: async (params) => {
    const info = `${params.name} is known as a great city!`;
    return { name: "find_city_info", params, value: JSON.stringify(info) };
  },
});

const ai = new AI({
  openAIApiKey: "<OPENAI_API_KEY>",
  chatCompletionTools: [findCityInfoTool],
});

const chatCompletion = await ai.chatCompletions.create({ prompt: "Find info about Vancouver." });
console.log(chatCompletion);
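Because the call handler is async, a tool can also fetch external data before returning its value. A sketch that reuses the constructor shape above; the endpoint URL is hypothetical and only for illustration:

import { ChatCompletionTool } from "@singlestore/ai";
import { z } from "zod";

// Hypothetical endpoint used purely for illustration.
const WEATHER_API_URL = "https://example.com/weather";

const findWeatherTool = new ChatCompletionTool({
  name: "find_weather",
  description: "Useful for finding the current weather in a city.",
  params: z.object({ city: z.string().describe("The city name") }),
  call: async (params) => {
    // Fetch external data inside the tool before returning its value.
    const response = await fetch(`${WEATHER_API_URL}?city=${encodeURIComponent(params.city)}`);
    const weather = await response.json();
    return { name: "find_weather", params, value: JSON.stringify(weather) };
  },
});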

Custom Chat Completions

Extend the ChatCompletions class to use a custom LLM for creating chat completions.

import {
  AI,
  type AnyChatCompletionTool,
  ChatCompletions,
  type CreateChatCompletionParams,
  type CreateChatCompletionResult,
} from "@singlestore/ai";

class CustomChatCompletions<
  TChatCompletionTool extends AnyChatCompletionTool[] | undefined,
> extends ChatCompletions<TChatCompletionTool> {
  constructor() {
    super();
  }

  getModels(): Promise<string[]> | string[] {
    // Your implementation
    return [];
  }

  async create<TStream extends boolean | undefined>(
    params: CreateChatCompletionParams<TStream, TChatCompletionTool>,
  ): Promise<CreateChatCompletionResult<TStream>> {
    // Your implementation
    return {} as CreateChatCompletionResult<TStream>;
  }
}

const ai = new AI({
  openAIApiKey: "<OPENAI_API_KEY>",
  chatCompletions: new CustomChatCompletions(),
});
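Once the custom class is passed to the AI constructor, chat completion requests go through it in place of the default implementation. A short usage sketch, assuming the same params shape as the earlier examples:

// Handled by CustomChatCompletions rather than the default implementation.
const chatCompletion = await ai.chatCompletions.create({ prompt: "Hi, how are you?" });
console.log(chatCompletion);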

Custom Embeddings

Create a custom embeddings class to use a custom LLM for creating embeddings.

import { AI, type CreateEmbeddingsParams, type Embedding, Embeddings } from "@singlestore/ai";

class CustomEmbeddings extends Embeddings {
  constructor() {
    super();
  }

  getModels(): string[] {
    // Your implementation
    return [];
  }

  async create(input: string | string[], params?: CreateEmbeddingsParams): Promise<Embedding[]> {
    // Your implementation
    return [];
  }
}

const ai = new AI({
  openAIApiKey: "<OPENAI_API_KEY>",
  embeddings: new CustomEmbeddings(),
});
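With the custom embeddings class registered, embedding requests are routed through it in the same way. A short usage sketch:

// Handled by CustomEmbeddings rather than the default implementation.
const embeddings = await ai.embeddings.create("Hi!");
console.log(embeddings);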
