onnxruntime-web
onnxruntime-web is a JavaScript library that allows you to run ONNX (Open Neural Network Exchange) models directly in the browser or in Node.js environments. It leverages WebAssembly (WASM) and WebGL for efficient execution of machine learning models, making it possible to perform inference tasks on the client side without needing a server.
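To use it, the package is typically installed from npm and then imported either as a CommonJS or an ES module. A minimal setup sketch (exact bundler or loader configuration will vary by project):

// Install first (shell): npm install onnxruntime-web

// CommonJS style (Node.js, or bundlers that support require):
const ort = require('onnxruntime-web');

// ES module style (typical for browser builds via a bundler):
// import * as ort from 'onnxruntime-web';

The examples below use the CommonJS form.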
Load and Run ONNX Model
This feature allows you to load an ONNX model and run inference on it. The code sample demonstrates how to create an inference session, prepare input data, and run the model to get the results.
const ort = require('onnxruntime-web');

async function runModel() {
  // Create an inference session from the ONNX model file (path or URL).
  const session = await ort.InferenceSession.create('model.onnx');
  // Build an input tensor: float32 values with shape [1, 4].
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  // The feed key ('input' here) must match the model's input name.
  const feeds = { input: input };
  // Run inference; results maps output names to ort.Tensor objects.
  const results = await session.run(feeds);
  console.log(results);
}

runModel();
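The results object maps each output name to an ort.Tensor carrying the shape (dims) and raw values (data). A minimal sketch of inspecting those outputs, assuming the model file exists and has at least one input and one output (names and shapes are model-specific):

const ort = require('onnxruntime-web');

// Sketch: discover input/output names and read raw output values.
async function inspectOutputs() {
  const session = await ort.InferenceSession.create('model.onnx');
  console.log(session.inputNames, session.outputNames);
  // Feed the first declared input; shape [1, 4] is an assumption for illustration.
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const results = await session.run({ [session.inputNames[0]]: input });
  // Each result is an ort.Tensor with .dims (shape) and .data (a typed array).
  const output = results[session.outputNames[0]];
  console.log(output.dims, Array.from(output.data));
}

inspectOutputs();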
Use WebGL Backend
This feature lets you use the WebGL backend for running ONNX models, which can provide better performance for certain models. Note that the WebGL execution provider is only available in the browser. The code sample shows how to specify the WebGL execution provider when creating the inference session.
const ort = require('onnxruntime-web');

async function runModelWithWebGL() {
  // Request the WebGL execution provider (browser-only) for this session.
  const session = await ort.InferenceSession.create('model.onnx', { executionProviders: ['webgl'] });
  // Same input preparation as before: float32 data with shape [1, 4].
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const feeds = { input: input };
  // Run inference on the WebGL backend and log the outputs.
  const results = await session.run(feeds);
  console.log(results);
}

runModelWithWebGL();
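Because WebGL is not available in every environment, a common pattern is to list several execution providers and let the runtime use the first one it can initialize. A sketch of that fallback, assuming the same model and input shape as above:

const ort = require('onnxruntime-web');

// Sketch: prefer WebGL, fall back to the WebAssembly backend if WebGL is unavailable.
async function runWithFallback() {
  const session = await ort.InferenceSession.create('model.onnx', {
    executionProviders: ['webgl', 'wasm'],
  });
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const results = await session.run({ input: input });
  console.log(results);
}

runWithFallback();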
Run Model in Node.js
This feature demonstrates how to run ONNX models in a Node.js environment. The code sample mirrors the browser example but is intended to run under Node.js, where onnxruntime-web uses the WebAssembly backend.
const ort = require('onnxruntime-web');

async function runModelInNode() {
  // In Node.js, onnxruntime-web executes the model on the WebAssembly backend.
  const session = await ort.InferenceSession.create('model.onnx');
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const feeds = { input: input };
  // Run inference and log the output tensors.
  const results = await session.run(feeds);
  console.log(results);
}

runModelInNode();
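For heavier server-side workloads, the companion onnxruntime-node package exposes the same InferenceSession-style API but runs the native ONNX Runtime rather than WebAssembly. A sketch, assuming onnxruntime-node is installed and the model's input is named input:

const ort = require('onnxruntime-node');

// Sketch: native ONNX Runtime in Node.js via onnxruntime-node.
async function runNative() {
  const session = await ort.InferenceSession.create('model.onnx');
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const results = await session.run({ input: input });
  console.log(results);
}

runNative();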
Brain.js is a JavaScript library for neural networks that runs in the browser or in Node.js. It is simpler and more lightweight than onnxruntime-web and is suited to basic neural network tasks, but it does not support the ONNX model format.
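For contrast, a minimal Brain.js sketch that trains a tiny XOR network from scratch (assuming brain.js is installed; the data here is purely illustrative):

const brain = require('brain.js');

// Sketch: train a small feed-forward network on XOR, then run a prediction.
const net = new brain.NeuralNetwork();
net.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] },
]);
console.log(net.run([1, 0])); // a value close to 1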
Synaptic is a JavaScript neural network library for Node.js and the browser. It provides a flexible API for creating and training neural networks. Unlike onnxruntime-web, Synaptic does not support running pre-trained ONNX models; it is focused on building and training models from scratch.
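A comparable Synaptic sketch, building and training a small perceptron in code rather than loading a pre-trained ONNX file (assuming synaptic is installed):

const synaptic = require('synaptic');
const { Architect, Trainer } = synaptic;

// Sketch: a 2-3-1 perceptron trained on XOR, then queried.
const perceptron = new Architect.Perceptron(2, 3, 1);
const trainer = new Trainer(perceptron);
trainer.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] },
]);
console.log(perceptron.activate([1, 0])); // a value close to 1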
FAQs
What is onnxruntime-web?
A JavaScript library for running ONNX models in browsers.
How popular is onnxruntime-web?
The npm package onnxruntime-web receives a total of 152,062 weekly downloads. Based on this figure, onnxruntime-web is classified as a popular package.
Is onnxruntime-web well maintained?
We found that onnxruntime-web demonstrates a healthy version release cadence and project activity: the last version was released less than a year ago. It has 0 open source maintainers collaborating on the project.