@elg/coerce-llm-output


coerceLlmOutput makes it possible to use LLM output in a typesafe way by coercing it into a well-typed and validated JSON object or array.

Installation

npm install @elg/coerce-llm-output zod

OR

yarn add @elg/coerce-llm-output zod

Usage

Basic usage with OpenAI

import { coerceLlmOutput } from "@elg/coerce-llm-output";
import OpenAI from "openai";
import { z } from "zod";

const openai = new OpenAI();

const User = z.object({
  id: z.string(),
  name: z.string(),
  email: z.string(),
});

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "user",
        content:
          "Generate a JSON user object with the shape " +
          "{ id: string; name: string; email: string }",
      },
    ],
  });

  const output: z.infer<typeof User> = coerceLlmOutput(
    chatCompletion.choices[0].message,
    User,
  );
}

main();
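
Because coerceLlmOutput post-processes the model's reply (see Functionality below), the example above should still work when the output is messy. The snippet below, which continues from the example above, is purely illustrative of the kind of reply the library is designed to handle; the exact behaviour is up to the library:

// A hypothetical messy assistant reply: fenced, wrongly cased keys, truncated.
const messyReply = {
  role: "assistant" as const,
  content:
    '```json\n{ "Id": "u_1", "Name": "Ada Lovelace", "Email": "ada@example.co',
};

// Given the documented key fix-up and partial-JSON parsing, this should still
// produce a validated { id, name, email } object.
const user: z.infer<typeof User> = coerceLlmOutput(messyReply, User);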

Streaming

import { coerceLlmOutput } from "@elg/coerce-llm-output";
import OpenAI from "openai";
import { z } from "zod";

const openai = new OpenAI();

const User = z.object({
  id: z.string(),
  name: z.string(),
  email: z.string(),
});

async function main() {
  const stream = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "user",
        content:
          "Generate a JSON user object with the shape " +
          "{ id: string; name: string; email: string }",
      },
    ],
    stream: true,
  });
  // Streamed chunks expose new text on `delta.content` rather than `message`,
  // so accumulate the content as it arrives.
  let content = "";
  for await (const chunk of stream) {
    content += chunk.choices[0]?.delta?.content ?? "";
    // coerceLlmOutput can parse incomplete JSON, so the partially streamed
    // output can be coerced on every chunk. (This assumes it also accepts a
    // raw string; the non-streaming example passes the whole message object.)
    const output: z.infer<typeof User> = coerceLlmOutput(content, User);
  }
}

main();

Arrays

import { coerceLlmOutput } from "@elg/coerce-llm-output";
import OpenAI from "openai";
import { z } from "zod";

const openai = new OpenAI();

const User = z.object({
  id: z.string(),
  name: z.string(),
  email: z.string(),
});

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "user",
        content:
          "Generate a JSON array of users. Each user must have the shape " +
          "{ id: string; name: string; email: string }",
      },
    ],
  });

  const output: z.infer<typeof User>[] = coerceLlmOutput(
    chatCompletion.choices[0].message,
    z.array(User),
  );
}

main();

Functionality

coerceLlmOutput (see the sketch after this list):

  1. Extracts JSON-like content from the raw LLM output.
  2. Parses incomplete JSON (using the excellent partial-json).
  3. Fixes up keys in the parsed object to match the keys in the zod schema, because LLMs sometimes unexpectedly generate camelCased, snake_cased, or otherwise wrongly cased keys.
  4. Parses the result using the provided zod schema.
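
A minimal sketch of that pipeline, written here for illustration only (it assumes partial-json's parse function and a simple case-insensitive key match against the schema's shape; the real implementation may differ):

import { parse } from "partial-json";
import { z } from "zod";

// Hypothetical re-implementation of the four steps above, limited to objects.
function coerceSketch<T extends z.ZodObject<z.ZodRawShape>>(
  raw: string,
  schema: T,
): z.infer<T> {
  // 1. Extract JSON-like content, e.g. drop surrounding prose or a ```json fence.
  const start = raw.indexOf("{");
  const end = raw.lastIndexOf("}");
  const jsonLike =
    start >= 0 ? raw.slice(start, end > start ? end + 1 : undefined) : raw;

  // 2. Parse possibly-incomplete JSON with partial-json.
  const parsed = parse(jsonLike) as Record<string, unknown>;

  // 3. Fix up key casing to match the schema's keys (ignore case and underscores).
  const normalize = (s: string) => s.toLowerCase().replace(/_/g, "");
  const schemaKeys = Object.keys(schema.shape);
  const fixed: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(parsed)) {
    const match = schemaKeys.find((k) => normalize(k) === normalize(key));
    fixed[match ?? key] = value;
  }

  // 4. Validate against the provided zod schema.
  return schema.parse(fixed);
}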
