gaze-detection

Detect the user's gaze direction using machine learning, to control interfaces

  • Version: 1.0.0 (published on npm)
  • Maintainers: 1
gaze-detection - Detect gaze direction in JavaScript using TensorFlow.js

Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences!

  • Calculate the difference when moving the eyes right/left

  • Calculate looking up/down

  • Adapt to the z position of the face

  • Calculate looking straight ahead (going back to the initial position) // normalized eye movement

  • Normalize up/down

  • Adapt to the rotation of the face

  • Try to select words/phrases/letters with eye movements

  • Hide the video

  • Make sure each event is counted only once

  • Build a demo around writing words

  • Make it a module so it can be easily imported and used

  • Record a demo video

  • Loading indicator for the model

  • Remove logs

  • Refactor

  • Publish the package

  • Allow getting the raw iris position?

  • Chrome dino game with eyes-up control

  • Write a blog post?

  • Does normalizing based on the bounding box work for different face shapes?
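The bounding-box normalization mentioned in the list above can be sketched as follows. This is a minimal illustration only; the `eyeBox` and `iris` shapes are assumptions for the example, not the actual output format of the face landmark model:

```javascript
// Hypothetical sketch: normalize an iris keypoint against the eye's bounding
// box so the resulting ratio is comparable across face sizes and distances
// from the camera.
function normalizeIris(iris, eyeBox) {
  const width = eyeBox.xMax - eyeBox.xMin;
  const height = eyeBox.yMax - eyeBox.yMin;
  return {
    x: (iris.x - eyeBox.xMin) / width,  // 0 = left edge of the eye, 1 = right edge
    y: (iris.y - eyeBox.yMin) / height, // 0 = top edge, 1 = bottom edge
  };
}

const ratio = normalizeIris(
  { x: 130, y: 55 },
  { xMin: 100, xMax: 140, yMin: 50, yMax: 70 }
);
// ratio.x === 0.75, ratio.y === 0.25
```

Because the output is a ratio rather than a pixel offset, the same thresholds can be applied whether the face is close to or far from the camera.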

Demo

Visit https://gaze-keyboard.netlify.app/ (you can try it on mobile too!)

Inspired by the Android application "Look to speak".

Uses TensorFlow.js's face landmark detection model.

Detection

This utility detects when the user looks right, left, up, or straight ahead.
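Conceptually, once the iris position is normalized within the eye, the direction decision reduces to threshold checks. The sketch below is an illustrative assumption, not the package's actual internals; the threshold values and label names are made up for the example (apart from the labels the package documents: 'RIGHT', 'LEFT', 'STRAIGHT', 'TOP'):

```javascript
// Hypothetical direction decision over a normalized iris position
// (x, y in [0, 1] within the eye's bounding box).
function toDirection(x, y) {
  if (y < 0.35) return "TOP"; // iris near the top of the eye
  if (x < 0.35) return "LEFT";
  if (x > 0.65) return "RIGHT";
  return "STRAIGHT";
}

const center = toDirection(0.5, 0.5); // "STRAIGHT"
const side = toDirection(0.2, 0.5);   // "LEFT"
```

Note that with a mirrored webcam feed, left and right may need to be swapped depending on how the frame is drawn.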

How to use

Install

As an npm module:

npm install gaze-detection --save
# or
yarn add gaze-detection

As a CDN link:

<script></script>

Code sample

If used as an npm module, start by importing it:

import gaze from "gaze-detection";

The module needs a webcam feed to run the detection:

const videoElement = document.querySelector("video");

const init = async () => {
  // set up webcam feed
  await gaze.setInputVideo(videoElement);
};

init();

Once the video stream is set up, load the model:

await gaze.loadModel();

Run the predictions:

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  console.log("Gaze direction: ", gazePrediction); // returns 'RIGHT', 'LEFT', 'STRAIGHT' or 'TOP'
  requestAnimationFrame(predict);
};
predict();
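Since the loop above produces a direction on every animation frame, an app that triggers actions from gaze (like selecting a letter) will usually want to react only when the direction changes, not on every frame. A small edge-trigger helper can do this; it is an illustrative assumption, not part of the package's API:

```javascript
// Fire the callback only when the predicted direction changes, so holding a
// gaze does not retrigger the action on every animation frame.
function makeGazeTrigger(onChange) {
  let last = null;
  return (direction) => {
    if (direction !== last) {
      last = direction;
      onChange(direction);
    }
  };
}

const events = [];
const trigger = makeGazeTrigger((d) => events.push(d));
// Simulate four consecutive frames of predictions:
["STRAIGHT", "LEFT", "LEFT", "STRAIGHT"].forEach(trigger);
// events is ["STRAIGHT", "LEFT", "STRAIGHT"] — the repeated "LEFT" is ignored
```

In the prediction loop, you would call `trigger(gazePrediction)` instead of acting on the raw value directly.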

Package last updated on 31 Jan 2021
