
@tensorflow/tfjs-converter

TensorFlow model converter for JavaScript


Weekly downloads: 141K (down 18.81% week over week)
Maintainers: 8

What is @tensorflow/tfjs-converter?

@tensorflow/tfjs-converter is a library for importing TensorFlow SavedModel and Keras models into TensorFlow.js. Models are first converted to a web-friendly format with the accompanying tensorflowjs_converter command-line tool, and this package then loads and executes them in the browser or in Node.js, making it easier to deploy machine learning models in web applications.
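In other words, there are two pieces: the Python command-line tool (tensorflowjs_converter), which converts a SavedModel or Keras model into web-friendly artifacts (a model.json plus binary weight shards), and this JavaScript API, which loads and runs those artifacts. A minimal sketch of the loading step, assuming a model that has already been converted and is hosted at a hypothetical URL:

// Conversion happens ahead of time with the Python tool, for example:
//   tensorflowjs_converter --input_format=tf_saved_model \
//       /path/to/saved_model /path/to/web_model
const tf = require('@tensorflow/tfjs');

async function loadConvertedModel() {
  // Hypothetical URL; point it at wherever the converted model.json is hosted.
  const model = await tf.loadGraphModel('https://example.com/web_model/model.json');
  console.log(model.inputs); // inspect the model's expected inputs
}

loadConvertedModel();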

What are @tensorflow/tfjs-converter's main functionalities?

Import TensorFlow SavedModel

This feature allows you to import a TensorFlow SavedModel into TensorFlow.js. The SavedModel must first be converted to the TensorFlow.js graph-model format with the tensorflowjs_converter tool; loadGraphModel is then pointed at the resulting model.json. The code sample loads such a model and logs it to the console.

const tf = require('@tensorflow/tfjs');
const tfConverter = require('@tensorflow/tfjs-converter');

async function loadModel() {
  // Path or URL of the model.json produced by the tensorflowjs_converter tool
  // (in Node.js, a file:// URL together with @tensorflow/tfjs-node is commonly used).
  const model = await tfConverter.loadGraphModel('path/to/model.json');
  console.log(model);
}

loadModel();

Import Keras Model

This feature allows you to import a Keras model into TensorFlow.js. The Keras model must first be converted to the TensorFlow.js Layers format (a model.json file); note that loadLayersModel is exposed by @tensorflow/tfjs (via tfjs-layers) rather than by the converter package itself. The code sample loads such a model and logs it to the console.

const tf = require('@tensorflow/tfjs');

async function loadKerasModel() {
  // loadLayersModel lives in @tensorflow/tfjs (tfjs-layers), not in tfjs-converter;
  // it expects the model.json of a Keras model converted to the Layers format.
  const model = await tf.loadLayersModel('path/to/keras_model.json');
  console.log(model);
}

loadKerasModel();

Run Inference

This feature allows you to run inference with a loaded model. The code sample loads a converted graph model, creates a tensor as input (its shape and dtype must match what the model expects), runs the model's predict function, and prints the output.

const tf = require('@tensorflow/tfjs');
const tfConverter = require('@tensorflow/tfjs-converter');

async function runInference() {
  const model = await tfConverter.loadGraphModel('path/to/model.json');
  // Illustrative input; real models expect a specific shape and dtype.
  const input = tf.tensor([1, 2, 3, 4]);
  const output = model.predict(input);
  output.print();
}

runInference();
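When predict() is called repeatedly (for example once per video frame), intermediate tensors should be released to avoid leaking memory. A minimal sketch using tf.tidy and dispose, assuming a model loaded as above and input data shaped to match it:

const tf = require('@tensorflow/tfjs');

function predictOnce(model, data) {
  // tf.tidy disposes every intermediate tensor created inside the callback
  // and returns only the result tensor to the caller.
  const output = tf.tidy(() => model.predict(tf.tensor(data)));
  output.print();
  output.dispose(); // free the output tensor once it is no longer needed
}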

Other packages similar to @tensorflow/tfjs-converter

FAQs

Package last updated on 17 May 2023
