onnxruntime-web


A JavaScript library for running ONNX models in browsers

  • Version: 1.21.0-dev.20241023-0028d3f332
  • Weekly downloads: 152K (down 0.08%)
  • Maintainers: 3

What is onnxruntime-web?

onnxruntime-web is a JavaScript library that allows you to run ONNX (Open Neural Network Exchange) models directly in the browser or in Node.js environments. It leverages WebAssembly (WASM) and WebGL for efficient execution of machine learning models, making it possible to perform inference tasks on the client side without needing a server.
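When constructing an input tensor, the flat data length must match the product of the shape dimensions. The helper below is a standalone illustration of that invariant (the function names are hypothetical, not part of the onnxruntime-web API):

```javascript
// Illustration only: onnxruntime-web's Tensor constructor requires that the
// flat data length equal the product of the dims. This standalone helper
// mirrors that check.
function elementCount(dims) {
  return dims.reduce((acc, d) => acc * d, 1);
}

function validateTensorArgs(data, dims) {
  if (data.length !== elementCount(dims)) {
    throw new Error(
      `data length ${data.length} does not match shape [${dims.join(', ')}]`);
  }
  return true;
}

console.log(validateTensorArgs(new Float32Array(4), [1, 4])); // true
```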

What are onnxruntime-web's main functionalities?

Load and Run ONNX Model

This feature allows you to load an ONNX model and run inference on it. The code sample demonstrates how to create an inference session, prepare input data, and run the model to get the results.

const ort = require('onnxruntime-web');

async function runModel() {
  // Load the model; the path is resolved relative to the page (or the working directory in Node.js).
  const session = await ort.InferenceSession.create('model.onnx');
  // Build a 1x4 float32 input tensor.
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  // Feed keys must match the model's input names ('input' here assumes the model defines it).
  const feeds = { input: input };
  const results = await session.run(feeds);
  console.log(results);
}

runModel();
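`session.run()` resolves to a map of output names to tensors, each holding a flat typed array plus its dims. As a standalone sketch (not part of the onnxruntime-web API), a helper like this reshapes such a flat buffer into nested arrays for easier inspection:

```javascript
// Illustration only: reshape a flat typed array into nested arrays
// according to a dims list, e.g. [2, 2] -> [[a, b], [c, d]].
function reshape(flat, dims) {
  if (dims.length === 0) return flat[0];
  if (dims.length === 1) return Array.from(flat.slice(0, dims[0]));
  const [head, ...rest] = dims;
  const stride = rest.reduce((a, d) => a * d, 1);
  const out = [];
  for (let i = 0; i < head; i++) {
    out.push(reshape(flat.slice(i * stride, (i + 1) * stride), rest));
  }
  return out;
}

console.log(reshape(new Float32Array([1, 2, 3, 4]), [2, 2])); // [[1, 2], [3, 4]]
```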

Use WebGL Backend

This feature allows you to leverage the WebGL backend for running ONNX models, which can provide better performance for certain types of models. The code sample shows how to specify the WebGL execution provider when creating the inference session.

const ort = require('onnxruntime-web');

async function runModelWithWebGL() {
  // Request the WebGL execution provider explicitly; providers listed here
  // are tried in order when the session is created.
  const session = await ort.InferenceSession.create('model.onnx', { executionProviders: ['webgl'] });
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const feeds = { input: input };
  const results = await session.run(feeds);
  console.log(results);
}

runModelWithWebGL();
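Execution providers listed in the session options are tried in order, so a preference list like `['webgl', 'wasm']` expresses a fallback chain. The standalone sketch below (hypothetical names, not library API) mimics that first-available selection against a set of backends an environment supports:

```javascript
// Illustration only: pick the first preferred backend that the current
// environment supports, mirroring how an ordered provider list behaves.
function pickProvider(preferred, available) {
  for (const name of preferred) {
    if (available.has(name)) return name;
  }
  throw new Error(`none of [${preferred.join(', ')}] is available`);
}

// In an environment without WebGL, 'wasm' would be chosen as the fallback.
console.log(pickProvider(['webgl', 'wasm'], new Set(['wasm']))); // wasm
```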

Run Model in Node.js

This feature demonstrates how to run ONNX models in a Node.js environment. The code sample is the same as the browser example; under Node.js, onnxruntime-web executes on its WebAssembly backend. (For server-side workloads, the native onnxruntime-node package is an alternative.)

const ort = require('onnxruntime-web');

async function runModelInNode() {
  // In Node.js, onnxruntime-web runs on its WebAssembly backend.
  const session = await ort.InferenceSession.create('model.onnx');
  const input = new ort.Tensor('float32', new Float32Array([1.0, 2.0, 3.0, 4.0]), [1, 4]);
  const feeds = { input: input };
  const results = await session.run(feeds);
  console.log(results);
}

runModelInNode();
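Since the same inference code can run in either environment, it can be useful to detect which one is active, for example to decide where to load a model file from. A standalone check (not part of the library) might look like:

```javascript
// Illustration only: distinguish Node.js from the browser at runtime.
function runtimeEnvironment() {
  if (typeof window !== 'undefined' && typeof window.document !== 'undefined') {
    return 'browser';
  }
  if (typeof process !== 'undefined' && process.versions && process.versions.node) {
    return 'node';
  }
  return 'unknown';
}

console.log(runtimeEnvironment());
```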


Package last updated on 24 Oct 2024
