
@webassembly/onnxruntime-node
ONNX Runtime Web is a JavaScript library for running ONNX models in browsers and on Node.js.
ONNX Runtime Web uses WebAssembly and WebGL to provide an optimized ONNX model inference runtime for both CPUs and GPUs.
The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. The biggest advantage of ONNX is that it allows interoperability across different open source AI frameworks, which in turn gives teams more flexibility in adopting and switching between frameworks.
With ONNX Runtime Web, web developers can score models directly in the browser, with benefits including reduced server-client communication and better protection of user privacy, as well as an install-free, cross-platform in-browser ML experience.
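As a minimal sketch of what in-browser scoring looks like with the standard onnxruntime-web API (the model file name, input name, and tensor shape below are placeholders, not part of this package's documentation):

```js
// Minimal in-browser inference sketch. './model.onnx', the input name 'input',
// and the [1, 4] shape are hypothetical placeholders.
import * as ort from 'onnxruntime-web';

async function main() {
  // Load the model once; after this, scoring needs no server round-trips.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Wrap plain JavaScript data in a tensor.
  const input = new ort.Tensor('float32', Float32Array.from([1, 2, 3, 4]), [1, 4]);

  // Run inference entirely in the browser; results are keyed by output name.
  const results = await session.run({ input });
  console.log(results);
}

main();
```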
ONNX Runtime Web can run on both CPU and GPU. On the CPU side, WebAssembly is used to execute the model at near-native speed. ONNX Runtime Web compiles the native ONNX Runtime CPU engine into the WebAssembly backend using Emscripten, so it supports most of the functionality native ONNX Runtime offers, including full ONNX operator coverage, multi-threading, ONNX Runtime Quantization, and ONNX Runtime Mobile. For performance acceleration on GPUs, ONNX Runtime Web leverages WebGL, a popular standard for accessing GPU capabilities; operator coverage and performance of the WebGL backend are still being improved.
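For illustration, the backend is chosen per session through the `executionProviders` option, and the WebAssembly backend can be tuned via `ort.env.wasm` (a sketch; the thread count is an arbitrary example, and multi-threading additionally requires a cross-origin-isolated page):

```js
import * as ort from 'onnxruntime-web';

// Optional WebAssembly tuning. Multi-threading only takes effect when the page
// is cross-origin isolated; 4 is an arbitrary example value.
ort.env.wasm.numThreads = 4;

// CPU execution via the WebAssembly backend. './model.onnx' is a placeholder.
const cpuSession = await ort.InferenceSession.create('./model.onnx', {
  executionProviders: ['wasm'],
});

// GPU-accelerated execution via the WebGL backend.
const gpuSession = await ort.InferenceSession.create('./model.onnx', {
  executionProviders: ['webgl'],
});
```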
See the Compatibility and Operators sections below for the platforms and operators ONNX Runtime Web currently supports.
Refer to the ONNX Runtime JavaScript examples for samples and tutorials.
Compatibility
| OS/Browser | Chrome | Edge | Safari | Electron | Node.js |
|---|---|---|---|---|---|
| Windows 10 | wasm, webgl | wasm, webgl | - | wasm, webgl | wasm |
| macOS | wasm, webgl | wasm, webgl | wasm, webgl | wasm, webgl | wasm |
| Ubuntu LTS 18.04 | wasm, webgl | wasm, webgl | - | wasm, webgl | wasm |
| iOS | wasm, webgl | wasm, webgl | wasm, webgl | - | - |
| Android | wasm, webgl | wasm, webgl | - | - | - |
Operators
WebAssembly backend: ONNX Runtime Web currently supports all operators in ai.onnx and ai.onnx.ml.
WebGL backend: ONNX Runtime Web currently supports a subset of operators in the ai.onnx operator set. See webgl-operators.md for a complete, detailed list of which ONNX operators are supported by the WebGL backend.
WebGPU backend: this backend is still an experimental feature. See webgpu-operators.md for a detailed list of which ONNX operators are supported by the WebGPU backend.
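Because WebGPU support is experimental, one reasonable pattern (a sketch, assuming the `executionProviders` list is tried in order) is to list backends by preference so the session can fall back when a backend is unavailable:

```js
import * as ort from 'onnxruntime-web';

// Backends listed by preference: try WebGPU first, then WebGL, then WebAssembly.
// './model.onnx' is a placeholder path.
const session = await ort.InferenceSession.create('./model.onnx', {
  executionProviders: ['webgpu', 'webgl', 'wasm'],
});
```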
License information can be found here.
FAQs
A JavaScript library for running ONNX models in browsers.
The npm package @webassembly/onnxruntime-node receives a total of 0 weekly downloads. As such, @webassembly/onnxruntime-node's popularity was classified as not popular.
We found that @webassembly/onnxruntime-node demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 2 open source maintainers collaborating on the project.