
openai-oxide


Native Node.js bindings for openai-oxide powered by napi-rs.

Source: npm
Latest version: 0.11.2 (of 5 published)
Weekly downloads: 23 (-90.5%)
Maintainers: 1

openai-oxide for Node.js


Native Node.js bindings for openai-oxide, built with napi-rs. Also available on crates.io (Rust) and PyPI (Python).

The package exposes the Rust client to Node.js with native streaming and WebSocket support, while keeping release artifacts out of git. Prebuilt binaries are published to npm for the supported targets listed below.

Features

  • Native bindings for chat, responses, streaming, and WebSocket sessions
  • Shared Rust core with the main openai-oxide crate
  • Prebuilt npm artifacts for the main desktop and server targets
  • Fast-path APIs for hot loops that return only text or response id
  • Local development and release flow driven by pnpm

Why choose openai-oxide on Node?

| Feature | openai-oxide | official openai SDK |
| --- | --- | --- |
| WebSocket Responses | Persistent wss:// session, reuses TLS for every step | REST-only |
| Streaming parser | Zero-copy SSE parser + early function-call parse | HTTP/2 response buffering |
| Typed Rust core | Full Response struct, hedged requests, parallel fan-outs | Generic JS objects |
| Hot REST paths | createText, createStoredResponseId, createTextFollowup avoid the JSON bridge | Always serializes Record<string, any> |
| Platform binaries | Prebuilt .node for darwin/linux/windows in npm | Pure JS package |

The official SDK is great for HTTP/REST but does not expose WebSocket streaming or Rust-level hedged/parallel tooling out of the box. If your workload issues quick successive tool calls, streams tokens, or runs inside a WebSocket session, the native bindings keep latency and contention lower while still letting you call the same OpenAI APIs.
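
The "hedged requests" technique mentioned in the table can be sketched generically in plain JavaScript. This is an illustration of the idea, not openai-oxide's actual API: `hedged` and `makeRequest` are names we made up for the sketch.

```javascript
// Generic sketch of a hedged request: fire a backup attempt if the first
// is still pending after `delayMs`, and settle with whichever attempt
// finishes first. `makeRequest` is any zero-argument function returning
// a promise.
function hedged(makeRequest, delayMs) {
  return new Promise((resolve, reject) => {
    let settled = false
    const once = (fn) => (value) => {
      if (!settled) {
        settled = true
        fn(value)
      }
    }
    makeRequest().then(once(resolve), once(reject))
    const timer = setTimeout(() => {
      if (!settled) makeRequest().then(once(resolve), once(reject))
    }, delayMs)
    // Don't let the hedge timer keep the Node.js process alive on its own.
    if (timer.unref) timer.unref()
  })
}
```

Hedging trades a little extra load (the occasional duplicate request) for much better tail latency, which is why it pairs well with quick successive tool calls.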

Supported Targets

  • macOS x64
  • macOS arm64
  • Linux x64 GNU
  • Linux x64 musl
  • Linux arm64 GNU
  • Linux arm64 musl
  • Windows x64 MSVC

Install

npm install openai-oxide
# or
pnpm add openai-oxide
# or
yarn add openai-oxide

From the repository for local development:

cd openai-oxide-node
pnpm install
pnpm build
pnpm test

The client reads credentials from the same environment variables as the Rust crate, for example OPENAI_API_KEY.
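
As a sketch of that lookup (the helper name `resolveApiKey` is ours, not part of the package's API):

```javascript
// Resolve the API key the way the bindings are described to: from
// OPENAI_API_KEY in the environment. Throws early if it is missing,
// rather than failing on the first request.
function resolveApiKey(env = process.env) {
  const key = env.OPENAI_API_KEY
  if (!key) throw new Error('OPENAI_API_KEY is not set')
  return key
}
```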

Quick Start

const { Client } = require('openai-oxide')

async function main() {
  const client = new Client()

  const response = await client.createResponse({
    model: 'gpt-4o-mini',
    input: 'Say hello to Node.js from Rust via napi-rs.'
  })

  console.log(response.output?.[0]?.content?.[0]?.text)
}

main().catch((error) => {
  console.error(error)
  process.exitCode = 1
})

Examples live in examples/:

  • examples/01_basic_request.js
  • examples/02_streaming.js
  • examples/03_websocket.js
  • examples/bench_node.js
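
examples/02_streaming.js consumes server-sent events. As an illustration of the SSE framing the native parser handles, here is a plain-JavaScript sketch; the real parser is zero-copy Rust, and this helper is ours:

```javascript
// Minimal SSE event framing: split a chunk on blank lines, pull the
// payload out of `data:` fields, and stop at the [DONE] sentinel. This
// copies strings freely and is only an illustration of the framing.
function parseSseChunk(chunk) {
  const events = []
  for (const block of chunk.split('\n\n')) {
    for (const line of block.split('\n')) {
      if (!line.startsWith('data:')) continue
      const payload = line.slice(5).trim()
      if (payload === '[DONE]') return events
      events.push(payload)
    }
  }
  return events
}
```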

Benchmarks

Benchmarks were run locally against the live OpenAI API with:

BENCH_ITERATIONS=5 pnpm bench

Setup:

  • Model: gpt-5.4
  • Iterations: 5
  • Reported value: median latency
  • Comparison target: official openai npm SDK

Node.js Ecosystem (openai-oxide vs openai)

openai-oxide wins 8/8 tests: native napi-rs bindings versus the official openai npm package.

| Test | openai-oxide | openai | Winner |
| --- | --- | --- | --- |
| Plain text | 1075ms | 1311ms | OXIDE (+18%) |
| Structured output | 1370ms | 1765ms | OXIDE (+22%) |
| Function calling | 1725ms | 1832ms | OXIDE (+6%) |
| Multi-turn (2 reqs) | 2283ms | 2859ms | OXIDE (+20%) |
| Rapid-fire (5 calls) | 6246ms | 6936ms | OXIDE (+10%) |
| Streaming TTFT | 534ms | 580ms | OXIDE (+8%) |
| Parallel 3x | 1937ms | 1991ms | OXIDE (+3%) |
| WebSocket hot pair | 2181ms | N/A | OXIDE |

Median of medians, 3×5 iterations. Model: gpt-5.4.

Reproduce: cd openai-oxide-node && BENCH_ITERATIONS=5 node examples/bench_node.js
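
The percentages in the Winner column read as relative improvement over the official SDK's median; every row in the table is consistent with this formula, though the reading is ours:

```javascript
// Relative speedup over the baseline (official SDK) median latency,
// rounded to a whole percent: (baseline - oxide) / baseline * 100.
function speedupPercent(baselineMs, oxideMs) {
  return Math.round(((baselineMs - oxideMs) / baselineMs) * 100)
}
```

For example, speedupPercent(1311, 1075) returns 18, matching the plain-text row.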


For the lowest-overhead REST paths in Node, prefer the fast-path methods:

  • client.createText(model, input, maxOutputTokens?)
  • client.createStoredResponseId(model, input, maxOutputTokens?)
  • client.createTextFollowup(model, input, previousResponseId, maxOutputTokens?)
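
A call-pattern sketch for the fast-path methods. `fakeClient` is a stand-in so the snippet runs without credentials; only the method names and argument order come from the list above, and the bodies are fake (the real Client comes from require('openai-oxide')):

```javascript
// Stand-in client illustrating the fast-path call pattern. The real
// methods skip the JSON bridge and return only text or a response id.
const fakeClient = {
  async createText(model, input, maxOutputTokens) {
    return `[${model}] ${input}` // real method: plain text only
  },
  async createTextFollowup(model, input, previousResponseId, maxOutputTokens) {
    return `[${model}] ${previousResponseId} -> ${input}`
  }
}

async function demo(client) {
  const first = await client.createText('gpt-4o-mini', 'hello', 64)
  const followup = await client.createTextFollowup('gpt-4o-mini', 'more', 'resp_123', 64)
  return { first, followup }
}
```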

Development

Useful commands:

pnpm install
pnpm build
pnpm test
pnpm bench
pnpm pack:preview

pnpm build writes the local .node binary next to index.js for development only. Those generated binaries are ignored by git and are not committed. pnpm pack:preview writes a tarball preview into .preview/, which is also ignored by git.
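
The ignore rules implied above might look like the following sketch; the exact patterns in the repository may differ:

```gitignore
# Local napi-rs build output written next to index.js (development only)
*.node
# Tarball previews from `pnpm pack:preview`
.preview/
```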

Release Flow

The repository keeps the Node release separate from the Rust and Python releases.

For the Node package:

  • Keep the Node package version aligned with the Rust crate and Python package version.
  • Push a tag like node-v0.9.6.
  • GitHub Actions builds the native addon for each supported target.
  • The Node release workflow assembles platform packages with napi-rs and publishes to npm with pnpm publish.

Required secrets for npm publishing:

  • NPM_TOKEN

The workflow uses pnpm throughout, publishes with provenance enabled, and keeps platform-specific binaries out of the repository history.

Keywords

openai


Package last updated on 30 Mar 2026
