
camera-processor
A Simple to Use Webcam Filter Framework.
$ npm install camera-processor
@camera-processor/virtual-background - Easy-to-use background masking.
More coming in the future...
Frame Analyzers take frames from your camera and asynchronously return information about them that your app and Frame Renderers can use.
Frame Renderers take the information produced by the Frame Analyzers and use Render Modes to draw things on top of the camera image.
Render Modes give you a canvas and a canvas context that Frame Renderers can use to draw. Two Render Modes come with this library: _2DRenderMode and WebGLRenderMode (unfinished).
The Camera Processor is the main component of this library. It combines the Frame Analyzers and Frame Renderers with a frame loop.
import CameraProcessor from 'camera-processor';

async function main() {
  // Get the camera stream somehow (it's best to request only video, since CameraProcessor can't handle audio)
  const camera_stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.querySelector('video');

  const camera_processor = new CameraProcessor();
  camera_processor.setCameraStream(camera_stream);
  camera_processor.start(); // You have to start it explicitly

  // Add some analyzer (SomeAnalyzer is your own class, see below; the first argument is its name and it's important)
  const some_analyzer = camera_processor.addAnalyzer('some_analyzer', new SomeAnalyzer());
  // Add some renderer that may or may not use data from the analyzers
  const some_renderer = camera_processor.addRenderer(new SomeRenderer());

  video.srcObject = camera_processor.getOutputStream(); // Get the output stream
  video.play();
}

window.addEventListener('DOMContentLoaded', main);
// Get the data for the last frame from all analyzers
const analyzer_data = camera_processor.analyzer.data;
// This object has a key for every name you used with addAnalyzer,
// and the value is the data that analyzer returned for the last frame
console.log(analyzer_data);
// > { some_analyzer: ... }
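If your own app (rather than a renderer) wants to react to analyzer results, you can read this object on whatever schedule you like. A minimal sketch, assuming a hypothetical analyzer registered as 'brightness' that returns a number between 0 and 1:
// Minimal sketch: poll analyzer data once per animation frame.
// Assumes a hypothetical analyzer registered as 'brightness' that returns a number (0..1).
function watchBrightness() {
  const brightness = camera_processor.analyzer.data.brightness;
  if (typeof brightness === 'number' && brightness < 0.2) {
    console.log('The camera image is quite dark:', brightness);
  }
  requestAnimationFrame(watchBrightness);
}
requestAnimationFrame(watchBrightness);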
console.log(camera_processor.performance);
// > {
// >   fps: ...,
// >   frameTime: {
// >     analyze: ...,
// >     render: ...,
// >     total: ...
// >   }
// > }
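As a small usage sketch (not part of the library itself), you could log the measured frame rate once per second while tuning your analyzers and renderers:
// Minimal sketch: print the measured FPS and total frame time once per second.
const perf_timer = setInterval(() => {
  const { fps, frameTime } = camera_processor.performance;
  console.log('FPS:', fps, 'frame time:', frameTime.total);
}, 1000);
// Call clearInterval(perf_timer) when you no longer need the logging.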
// Stop all analyzers and renderers and just pass the camera stream through the output stream
camera_processor.passthrough = true;
camera_processor.start(); // Start/Resume (has to be called explicitly in the beginning)
camera_processor.stop(); // Stop/Pause (also freezes the camera through the output stream)
// Use passthrough mode if you just want to stop all renderers and analyzers
// All analyzers and renderers have the same start/stop methods
// to start and stop them individually, but unlike the CameraProcessor
// they are started by default
some_analyzer.start();
some_renderer.stop();
// Check if the CameraProcessor or any analyzer/renderer is running
console.log(camera_processor.isRunning);
console.log(some_analyzer.isRunning);
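For example (a sketch, assuming a button element with id "toggle" exists in your page), these methods can be wired to a simple pause/resume control:
// Minimal sketch: pause/resume the whole pipeline from a button.
// Assumes a <button id="toggle"> element exists in the page.
document.querySelector('#toggle').addEventListener('click', () => {
  if (camera_processor.isRunning) {
    camera_processor.stop(); // also freezes the output stream
  } else {
    camera_processor.start();
  }
});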
type AnalyzerData = { some_analyzer: SomeType };
// This gives camera_processor.analyzer.data proper typing
const camera_processor = new CameraProcessor<AnalyzerData>();
import { FrameAnalyzer } from 'camera-processor';
class SomeAnalyzer extends FrameAnalyzer {
  async analyze(camera_video, camera_processor) {
    // Do something with camera_video and compute some_data from it
    return some_data; // whatever information your analyzer produces for this frame
  }
}
// camera_processor.addAnalyzer('some_analyzer', new SomeAnalyzer());
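As a concrete sketch (the BrightnessAnalyzer below is a hypothetical example, not part of the library), an analyzer could estimate the average brightness of each frame by drawing camera_video onto a small scratch canvas and averaging the pixels. Only the analyze(camera_video, camera_processor) signature shown above is assumed from the library, and camera_video is assumed to be a drawable source such as a video element:
import { FrameAnalyzer } from 'camera-processor';

// Hypothetical example: estimates the average brightness (0 = black, 1 = white) of each frame.
class BrightnessAnalyzer extends FrameAnalyzer {
  async analyze(camera_video, camera_processor) {
    if (!this.canvas) {
      // Lazily create a small scratch canvas to sample the video into
      this.canvas = document.createElement('canvas');
      this.canvas.width = this.canvas.height = 32;
      this.ctx = this.canvas.getContext('2d');
    }
    this.ctx.drawImage(camera_video, 0, 0, this.canvas.width, this.canvas.height);
    const pixels = this.ctx.getImageData(0, 0, this.canvas.width, this.canvas.height).data;
    let sum = 0;
    for (let i = 0; i < pixels.length; i += 4) {
      sum += (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
    }
    return sum / (pixels.length / 4) / 255;
  }
}

// camera_processor.addAnalyzer('brightness', new BrightnessAnalyzer());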
import { FrameRenderer, RENDER_MODE } from 'camera-processor';
// Check 'Using The Camera Renderer' below for more details on 'renderer' and 'RENDER_MODE'
class SomeRenderer extends FrameRenderer {
  render(analyzer_data, camera_video, renderer, camera_processor) {
    renderer.use(RENDER_MODE._2D); // Switch to the specified Render Mode (always do this at the start)
    renderer.ctx.fillStyle = 'green';
    renderer.ctx.fillRect(0, 0, renderer.width, renderer.height);
  }
}
// camera_processor.addRenderer(new SomeRenderer());
// In the render method of a FrameRenderer you have access to the CameraRenderer (renderer)
renderer.use(RENDER_MODE._2D); // Switch to that canvas and canvas context
// Note: A FrameRenderer can use several different RenderModes one after another; the image is
// copied from the previous mode to the next, so you can make incremental changes to the image
// by rendering transparent things on top of it.
renderer.canvas; // Access the current RenderMode's canvas
renderer.ctx; // Access the current RenderMode's canvas context
renderer.width; // Access the camera's width
renderer.height; // Access the camera's height
RENDER_MODE is an enum that allows you to specify what kind of canvas context you want to use.
RENDER_MODE._2D will use a 2d canvas.
RENDER_MODE.WebGL will use a WebGL2/WebGL canvas (unfinished).
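Putting the pieces together, here is a sketch of a renderer (again hypothetical, building on the 'brightness' analyzer sketched earlier) that draws a low-light hint over the image when the frame is dark; it only uses the renderer properties listed above:
import { FrameRenderer, RENDER_MODE } from 'camera-processor';

// Hypothetical example: darken the image and show a hint when the 'brightness'
// analyzer (sketched above) reports a very dark frame.
class LowLightRenderer extends FrameRenderer {
  render(analyzer_data, camera_video, renderer, camera_processor) {
    renderer.use(RENDER_MODE._2D);
    if (typeof analyzer_data.brightness === 'number' && analyzer_data.brightness < 0.2) {
      renderer.ctx.fillStyle = 'rgba(0, 0, 0, 0.4)';
      renderer.ctx.fillRect(0, 0, renderer.width, renderer.height);
      renderer.ctx.fillStyle = 'white';
      renderer.ctx.font = '24px sans-serif';
      renderer.ctx.fillText('Low light', 20, 40);
    }
  }
}

// camera_processor.addRenderer(new LowLightRenderer());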