
onnxruntime-node
The onnxruntime-node package is a Node.js binding for the ONNX Runtime, which is a cross-platform, high-performance scoring engine for Open Neural Network Exchange (ONNX) models. It allows developers to run machine learning models in Node.js applications efficiently.
Load and Run ONNX Models
This feature allows you to load an ONNX model and run inference on it using the onnxruntime-node package. The code sample demonstrates how to create an inference session, prepare input data, and execute the model to get the output.
const ort = require('onnxruntime-node');

async function runModel() {
  // Load the model from disk and create an inference session.
  const session = await ort.InferenceSession.create('./model.onnx');
  // The feed key ('input') must match the model's input name; this tensor is a 2x2 float32 matrix.
  const feeds = { input: new ort.Tensor('float32', Float32Array.from([1, 2, 3, 4]), [2, 2]) };
  const output = await session.run(feeds);
  console.log(output);
}

runModel();
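The feed key must match the model's input name. When that name is not known ahead of time, the session exposes it via inputNames (and the outputs via outputNames). The sketch below, assuming the same placeholder ./model.onnx with a single float32 input of shape [2, 2], builds the feed from the first input name instead of hard-coding it.
const ort = require('onnxruntime-node');

async function inspectAndRun() {
  const session = await ort.InferenceSession.create('./model.onnx');
  // inputNames and outputNames list the graph's input/output tensor names.
  console.log('inputs:', session.inputNames, 'outputs:', session.outputNames);
  // Build the feed using the first input name; assumes a single float32 input of shape [2, 2].
  const feeds = {
    [session.inputNames[0]]: new ort.Tensor('float32', Float32Array.from([1, 2, 3, 4]), [2, 2])
  };
  const results = await session.run(feeds);
  console.log(results[session.outputNames[0]]);
}

inspectAndRun();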
Model Optimization
This feature provides options to optimize the ONNX model for better performance. The code sample shows how to set optimization options when creating an inference session, which can lead to faster inference times.
const ort = require('onnxruntime-node');

async function optimizeModel() {
  // Enable all graph optimizations (e.g. constant folding and node fusion).
  const sessionOptions = {
    graphOptimizationLevel: 'all'
  };
  const session = await ort.InferenceSession.create('./model.onnx', sessionOptions);
  console.log('Model optimized and loaded');
}

optimizeModel();
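Beyond graphOptimizationLevel, the session options object also accepts threading and execution-mode settings. The sketch below shows a few commonly available options; the specific values are illustrative rather than recommendations, and the ./model.onnx path is a placeholder.
const ort = require('onnxruntime-node');

async function createTunedSession() {
  const sessionOptions = {
    graphOptimizationLevel: 'all', // 'disabled' | 'basic' | 'extended' | 'all'
    intraOpNumThreads: 4,          // threads used within an operator
    interOpNumThreads: 1,          // threads used across operators
    executionMode: 'sequential',   // or 'parallel'
    logSeverityLevel: 2            // 0 = verbose ... 4 = fatal
  };
  return ort.InferenceSession.create('./model.onnx', sessionOptions);
}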
TensorFlow is an open-source platform for machine learning. It provides a comprehensive, flexible ecosystem of tools, libraries, and community resources that lets researchers push the state-of-the-art in ML, and developers easily build and deploy ML-powered applications. Unlike onnxruntime-node, TensorFlow is not limited to ONNX models and offers a broader range of functionalities including model training.
ml5.js is a friendly high-level interface to TensorFlow.js, designed to make machine learning accessible to a broad audience. It provides pre-trained models and simplifies the process of integrating machine learning into web applications. Compared to onnxruntime-node, ml5.js is more focused on ease of use and accessibility, particularly for beginners and educators.
ONNX Runtime Node.js binding enables Node.js applications to run ONNX model inference.
Install the latest stable version:
npm install onnxruntime-node
Install the nightly version:
npm install onnxruntime-node@dev
Refer to ONNX Runtime JavaScript examples for samples and tutorials.
ONNX Runtime works on Node.js v16.x+ (v20.x+ recommended) or Electron v15.x+ (v28.x+ recommended).
The following table lists the execution providers and platforms supported by the pre-built binaries of the ONNX Runtime Node.js binding.
EPs/Platforms | Windows x64 | Windows arm64 | Linux x64 | Linux arm64 | MacOS x64 | MacOS arm64 |
---|---|---|---|---|---|---|
CPU | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
WebGPU | ✔️ [1] | ✔️ [1] | ✔️ [1] | ✔️ [1] | ✔️ [1] | ✔️ [1] |
DirectML | ✔️ | ✔️ | ❌ | ❌ | ❌ | ❌ |
CUDA | ❌ | ❌ | ✔️[2] | ❌ | ❌ | ❌ |
CoreML | ❌ | ❌ | ❌ | ❌ | ✔️ | ✔️ |
To use the package on platforms without pre-built binaries, you can build the Node.js binding from source and consume it with npm install <onnxruntime_repo_root>/js/node/. See also the instructions for building the ONNX Runtime Node.js binding locally.
Currently, the Windows build supports the WebGPU and DirectML (DML) execution providers, while Linux x64 can use CUDA and TensorRT.
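As a rough sketch of how an execution provider might be requested, the executionProviders session option takes a list of providers in priority order. The example below assumes a placeholder ./model.onnx and a platform where the WebGPU provider is available, with the CPU provider as a fallback.
const ort = require('onnxruntime-node');

async function createSessionWithEp() {
  // Providers are tried in the order listed; 'cpu' acts as a final fallback.
  // Use 'dml' or 'webgpu' on Windows, 'cuda' on Linux x64, 'coreml' on macOS.
  return ort.InferenceSession.create('./model.onnx', {
    executionProviders: ['webgpu', 'cpu']
  });
}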
To use the CUDA EP, you need to install the CUDA EP binaries. By default, the CUDA EP binaries are installed automatically when you install the package. If you want to skip the installation, you can pass the --onnxruntime-node-install=skip flag to the installation command.
npm install onnxruntime-node --onnxruntime-node-install=skip
You can also use this flag to specify the CUDA version (v11 or v12); note that CUDA v11 is no longer supported since v1.22.
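Once the CUDA EP binaries are installed, a session can request the CUDA provider explicitly. The following is a minimal sketch, assuming a Linux x64 machine with a compatible CUDA runtime and a placeholder ./model.onnx; the deviceId field selects which GPU to use.
const ort = require('onnxruntime-node');

async function createCudaSession() {
  // Request the CUDA execution provider (with CPU as fallback) and pick GPU 0.
  return ort.InferenceSession.create('./model.onnx', {
    executionProviders: [{ name: 'cuda', deviceId: 0 }, 'cpu']
  });
}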
License information can be found here.
FAQs
ONNXRuntime Node.js binding
The npm package onnxruntime-node receives a total of 260,618 weekly downloads. As such, onnxruntime-node popularity was classified as popular.
We found that onnxruntime-node demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 2 open source maintainers collaborating on the project.