OpenPipe Node API Library


This library wraps OpenAI API calls made from TypeScript or JavaScript and logs additional data to the configured OPENPIPE_BASE_URL for further processing.

It is fully compatible with OpenAI's SDK and logs both streaming and non-streaming requests and responses.

Installation

npm install --save openpipe
# or
yarn add openpipe

Import

ESM

// import OpenAI from "openai"
import OpenAI from "openpipe/openai";

CJS

// const OpenAI = require("openai")
const OpenAI = require("openpipe/openai").default;

Usage

  1. Create a project at https://app.openpipe.ai
  2. Find your project's API key at https://app.openpipe.ai/settings
  3. Configure the OpenPipe client as shown below.
// import OpenAI from "openai"
import OpenAI from "openpipe/openai";

// Fully compatible with original OpenAI initialization
const openai = new OpenAI({
  apiKey: "my api key", // defaults to process.env["OPENAI_API_KEY"]
  // openpipe key is optional
  openpipe: {
    apiKey: "my api key", // defaults to process.env["OPENPIPE_API_KEY"]
    baseUrl: "my url", // defaults to process.env["OPENPIPE_BASE_URL"] or https://app.openpipe.ai/api/v1 if not set
  },
});

async function main() {
  // Allows optional openpipe object
  const completion = await openai.chat.completions.create({
    messages: [{ role: "user", content: "Say this is a test" }],
    model: "gpt-3.5-turbo",
    // optional
    openpipe: {
      // Add custom searchable tags
      tags: {
        prompt_id: "extract_user_intent",
        any_key: "any_value",
      },
      logRequest: true, // Enable/disable data collection. Defaults to true.
    },
  });

  console.log(completion.choices);
}

main();
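
The SDK also logs streaming requests and responses. Below is a minimal streaming sketch; it assumes the wrapper accepts the same stream: true flag and optional openpipe options as the non-streaming call above, and the tag value is purely illustrative.

// import OpenAI from "openai"
import OpenAI from "openpipe/openai";

const openai = new OpenAI({
  apiKey: "my api key", // defaults to process.env["OPENAI_API_KEY"]
  openpipe: {
    apiKey: "my api key", // defaults to process.env["OPENPIPE_API_KEY"]
  },
});

async function streamExample() {
  // stream: true returns an async iterable of chunks, mirroring the OpenAI SDK
  const stream = await openai.chat.completions.create({
    messages: [{ role: "user", content: "Say this is a test" }],
    model: "gpt-3.5-turbo",
    stream: true,
    // optional, same shape as in the non-streaming example
    openpipe: {
      tags: { prompt_id: "streaming_example" }, // illustrative tag
    },
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

streamExample();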

FAQ

How do I report calls to my self-hosted instance?

Start an instance by following the instructions in the Running Locally guide. Once it's running, point OPENPIPE_BASE_URL at your self-hosted instance.
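
For example, a minimal sketch of that configuration (the localhost URL is a placeholder for wherever your instance runs, and the /api/v1 path mirrors the hosted default):

// import OpenAI from "openai"
import OpenAI from "openpipe/openai";

const openai = new OpenAI({
  apiKey: "my api key", // defaults to process.env["OPENAI_API_KEY"]
  openpipe: {
    apiKey: "my api key", // defaults to process.env["OPENPIPE_API_KEY"]
    // Placeholder: point this at your self-hosted instance (or set OPENPIPE_BASE_URL instead)
    baseUrl: "http://localhost:3000/api/v1",
  },
});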

What if my OPENPIPE_BASE_URL is misconfigured or my instance goes down? Will my OpenAI calls stop working?

Your OpenAI calls will continue to function as expected no matter what. The SDK handles logging errors gracefully without affecting OpenAI inference.
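
For illustration, even with an unreachable base URL the completion below is still expected to succeed; only the logging step fails quietly. The URL is a deliberately bogus placeholder.

// import OpenAI from "openai"
import OpenAI from "openpipe/openai";

const openai = new OpenAI({
  apiKey: "my api key", // defaults to process.env["OPENAI_API_KEY"]
  openpipe: {
    apiKey: "my api key", // defaults to process.env["OPENPIPE_API_KEY"]
    baseUrl: "http://unreachable.invalid/api/v1", // deliberately bogus placeholder
  },
});

async function demo() {
  // The OpenAI request itself still completes; the failed log call is handled internally.
  const completion = await openai.chat.completions.create({
    messages: [{ role: "user", content: "Say this is a test" }],
    model: "gpt-3.5-turbo",
  });

  console.log(completion.choices);
}

demo();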

See the GitHub repo for more details.
