
@api.video/media-stream-person-segmentation


api.video media stream person segmentation - change the background and emphasize the person in your media streams



Changelog


[0.0.1] - 2023-03-09

  • First version

Readme


Media stream person segmentation

api.video is the video infrastructure for product builders. Lightning fast video APIs for integrating, scaling, and managing on-demand & low latency live streaming features in your app.


Project description

This library allows you to easily blur the background of a webcam video stream in real time using TensorFlow.js.

With this library, you can create a more professional and visually appealing video conferencing experience by keeping the focus on the person in front of the camera and reducing visual distractions from the surrounding environment.

Please note that this project is currently experimental and may not be suitable for production use. It works well on Chrome and Firefox, but may not work as well on Safari.
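Since browser support varies, it can help to verify that the media APIs the library relies on are available before constructing it. The helper below is an illustrative sketch, not part of this package:

```javascript
// Illustrative capability check (not part of the library's API):
// verifies that the getUserMedia API the library depends on exists.
function canUseSegmentation(nav) {
  return Boolean(
    nav &&
    nav.mediaDevices &&
    typeof nav.mediaDevices.getUserMedia === "function"
  );
}

// In a browser, call it with the global navigator:
// if (canUseSegmentation(navigator)) { /* safe to proceed */ }
```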

Getting started

Installation

Simple include in a JavaScript project

Include the library in your HTML file like so:

<head>
    ...
    <script src="https://unpkg.com/@api.video/media-stream-person-segmentation" defer></script>
</head>
...
<script type="text/javascript">
    const constraints = { video: true };

    navigator.mediaDevices.getUserMedia(constraints).then((webcamStream) => {
        const segmentation = new ApiVideoMediaStreamPersonSegmentation(webcamStream);

        segmentation.applyBlurEffect();

        segmentation.onReady((outputStream) => {
            // use the outputStream to display the video
            // ...
        });
    });
</script>

Documentation

Instantiation

The library is instantiated with a MediaStream object as a parameter.

const segmentation = new ApiVideoMediaStreamPersonSegmentation(webcamStream)

Methods

applyBlurEffect(options: BlurEffectOptions)

Apply a blur effect on the background of the video stream.

Options
Name                 | Default | Type   | Description
backgroundBlurAmount | 15      | number | The amount of blur to apply to the background.
edgeBlurAmount       | 8       | number | The amount of blur to apply to the edges of the person.

segmentation.applyBlurEffect({
    backgroundBlurAmount: 15,
    edgeBlurAmount: 8
})
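When the blur amounts come from user input such as sliders, they arrive as strings and may fall outside sensible ranges. A small helper can normalize them before calling applyBlurEffect; the helper name and the clamping ranges below are illustrative assumptions, not part of the library:

```javascript
// Hypothetical helper (not part of the library): coerce input to numbers
// and clamp to illustrative ranges before applying the blur effect.
function toBlurOptions(background, edge) {
  const clamp = (value, lo, hi) => Math.min(hi, Math.max(lo, Number(value)));
  return {
    backgroundBlurAmount: clamp(background, 0, 100),
    edgeBlurAmount: clamp(edge, 0, 20),
  };
}

// segmentation.applyBlurEffect(toBlurOptions("15", "8"));
```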

onReady(callback: (stream: MediaStream) => void)

Call the callback when the stream is ready.

segmentation.onReady((stream) => {
    // use the stream to display the video
    // ...
});
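Because onReady is callback-based, it can be convenient to wrap it in a Promise so the output stream can be awaited. This wrapper is a convenience layered on top of the documented API, not something the library provides:

```javascript
// Wrap the callback-based onReady in a Promise (convenience sketch,
// not part of the library's API).
function segmentationReady(segmentation) {
  return new Promise((resolve) => segmentation.onReady(resolve));
}

// const stream = await segmentationReady(segmentation);
// video.srcObject = stream;
```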

Full example

<html>

<head>
    <script src="https://unpkg.com/@api.video/media-stream-person-segmentation"></script>
    <style>
        #container {
            display: flex;
            flex-direction: column;
            align-items: center;
        }

        #video {
            width: 640px;
            height: 480px;
            border: 1px solid gray;
        }

        #container div {
            margin: 10px 0;
        }

        label {
            display: inline-block;
            width: 160px;
            text-align: right;
        }

        input {
            display: inline-block;
            width: 300px;
        }
    </style>
</head>

<body>
    <div id="container">
        <div>
            <video muted id="video"></video>
        </div>
        <div>
            <div>
                <label for="blur">Blur background</label>
                <input type="range" id="blur" min="0" max="100" value="15" />
            </div>
            <div>
                <label for="blurEdges">Blur edges</label>
                <input type="range" id="blurEdges" min="0" max="10" value="8" />
            </div>
        </div>
    </div>

    <script>
        const video = document.querySelector('#video');

        const constraints = {
            audio: true,
            video: true
        };

        navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
            const segmentation = new ApiVideoMediaStreamPersonSegmentation(stream)

            segmentation.applyBlurEffect({
                backgroundBlurAmount: 15,
                edgeBlurAmount: 8
            })

            document.getElementById("blur").addEventListener("input", (e) => segmentation.applyBlurEffect({
                backgroundBlurAmount: Number(e.target.value)
            }));
            document.getElementById("blurEdges").addEventListener("input", (e) => segmentation.applyBlurEffect({
                edgeBlurAmount: Number(e.target.value)
            }));

            segmentation.onReady((stream) => {
                video.srcObject = stream;
                video.play();
            });

        });
    </script>
</body>

</html>
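In the full example above, applyBlurEffect runs on every input event while a slider is dragged, re-applying the effect many times per second. A generic debounce helper (shown below; not part of the library) can limit how often the effect is re-applied:

```javascript
// Generic debounce: delays fn until `ms` milliseconds pass without a new call.
function debounce(fn, ms) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}

// Usage with the "blur" slider from the full example:
// const applyBlur = debounce(
//     (value) => segmentation.applyBlurEffect({ backgroundBlurAmount: Number(value) }),
//     100
// );
// document.getElementById("blur").addEventListener("input", (e) => applyBlur(e.target.value));
```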


Last updated on 09 Mar 2023
