Homography.js

Homography.js is a lightweight, High-Performance library for implementing homographies in JavaScript or Node.js. It is designed to be easy to use (even for developers who are not familiar with Computer Vision) and to run in real-time applications (even on low-spec devices such as budget smartphones). It allows you to perform Affine, Projective or Piecewise Affine warpings over any Image or HTMLElement in your application by setting only a small set of reference points. Additionally, image warpings can be made persistent (independent of any CSS property), so they can easily be drawn on a canvas, mixed, or downloaded. Homography.js is built in a way that frees the user from all the pain-in-the-ass details of homography operations, such as thinking about output dimensions, input coordinate ranges, dealing with unexpected shifts, pads, crops or unfilled pixels in the output image, or even knowing what a Transform Matrix is.

Features

  • Apply different warpings to any Image or HTMLElement by just setting two sets of reference points.
  • Perform Affine, Projective or Piecewise Affine transforms, or just set "auto" and let the library decide which transform to apply depending on the reference points you provide (see the sketch after this list).
  • Simplify how you deal with canvas drawings, or with subsequent Computer Vision problems, by making your image transforms persistent and independent of any CSS property.
  • Forget all the pain-in-the-ass details of homography operations, even if you only have a fuzzy idea of what a homography is.
  • Avoid warping delays in real-time applications thanks to a design focused on High-Performance.
  • Support for running in the backend with Node.js.
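
For reference, the "auto" choice follows from how many point pairs are needed to determine each transform: 3 pairs exactly determine an affine transform, 4 a projective one, and larger sets call for a piecewise affine mesh. The rule below is a plausible sketch of that idea, not the library's verified internals:

    // A plausible "auto" selection rule (illustrative sketch, not the library's verified internals)
    function chooseTransform(srcPoints) {
        if (srcPoints.length === 3) return "affine";     // 3 pairs exactly determine an affine transform
        if (srcPoints.length === 4) return "projective"; // 4 pairs exactly determine a projective transform
        return "piecewiseaffine";                        // Larger sets: one affine transform per triangle
    }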

Installation
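
A minimal sketch, assuming the homography-js package name used in the import of the Node.js section below:

    npm install homography-js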

Usage

In the Browser

Perform a basic Piecewise Affine Transform from four source points.

    // Select the image you want to warp
    const image = document.getElementById("myImage");
    
    // Define the reference points. In this case using normalized coordinates (from 0.0 to 1.0).
    const srcPoints = [[0, 0], [0, 1], [1, 0], [1, 1]];
    const dstPoints = [[1/5, 1/5], [0, 1/2], [1, 0], [6/8, 6/8]];
    
    // Create a Homography object for a "piecewiseaffine" transform (it could be reused later)
    const homography = new Homography("piecewiseaffine");
    // Set the reference points
    homography.setReferencePoints(srcPoints, dstPoints);
    // Warp your image
    const resultImage = homography.warp(image);
    ...
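
Since warp() returns a persistent ImageData object in the browser (see the Performance section below), drawing the result takes a single call; a minimal sketch, assuming a canvas with the hypothetical id "myCanvas":

    // Draw the warped ImageData on a canvas ("myCanvas" is a hypothetical id)
    const ctx = document.getElementById("myCanvas").getContext("2d");
    ctx.putImageData(resultImage, 0, 0);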

Perform a complex Piecewise Affine Transform from a large set of pointsInY * pointsInX reference points.

    ...
    // Define a set of reference points that describe a sinusoidal deformation.
    // In this case in image coordinates (x: from 0 to w, y: from 0 to h) for convenience.
    let srcPoints = [], dstPoints = [];
    for (let y = 0; y <= h; y += h/pointsInY){
        for (let x = 0; x <= w; x += w/pointsInX){
            srcPoints.push([x, y]); // Add (x, y) as a source point
            dstPoints.push([x, amplitude + y + Math.sin((x*n)/Math.PI)*amplitude]); // Apply a sine offset to y
        }
    }
    // Set the reference points (reusing the previous Homography object)
    homography.setReferencePoints(srcPoints, dstPoints);
    // Warp your image. Since no image is given, it will reuse the one from the previous example.
    const resultImage = homography.warp();
    ...
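
The ellipses above elide the surrounding definitions; one illustrative set of values (hypothetical, chosen only to make the excerpt self-contained) could be:

    // Hypothetical definitions for the variables used above
    const w = image.width, h = image.height; // Work in image coordinates
    const pointsInY = 10, pointsInX = 20;    // Density of the reference-point grid
    const amplitude = 20;                    // Height of the sine wave, in pixels
    const n = 8;                             // Frequency factor of the sine wave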
    

Perform a simple Affine Transform and apply it to an HTMLElement.

    ...
    // Set the reference points from which to estimate the transform
    const srcPoints = [[0, 0], [0, 1], [1, 0]];
    const dstPoints = [[0, 0], [1/2, 1], [1, 1/8]];
    
    // Don't specify the type of transform to apply; let the library decide it by itself.
    const homography = new Homography(); // Default transform value is "auto".
    // Apply the transform over an HTMLElement from the DOM.
    homography.transformHTMLElement(document.getElementById("inputText"), srcPoints, dstPoints);
    ...
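
Transforming an HTMLElement applies the estimated transform through CSS rather than producing pixel data, which is why the Performance section below reports CSS Transform Calculation times separately from Image Data Warping.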

Calculate 250 different Projective Transforms, apply them over the same input Image and draw them on a canvas.

    const ctx = document.getElementById("exampleCanvas").getContext("2d");
    
    // Build the initial reference points (in this case, in image coordinates just for convenience)
    const srcPoints = [[0, 0], [0, h], [w, 0], [w, h]];
    let dstPoints = [[0, 0], [0, h], [w, 0], [w, h]];
    // Create the homography object (there is no need to set the transform to "projective", as it will be automatically detected)
    const homography = new Homography();
    // Set the static parameters of the whole transform sequence (this improves the performance of subsequent warpings)
    homography.setSourcePoints(srcPoints);
    homography.setImage(inputImg);
    
    // Set the parameters for building the future dstPoints at each frame (5 movements of 50 frames each)
    const framesPerMovement = 50;
    const movements = [[[0, h/5], [0, -h/5], [0, 0], [0, 0]],
                       [[w, 0], [w, 0], [-w, 0], [-w, 0]],
                       [[0, -h/5], [0, h/5], [0, h/5], [0, -h/5]],
                       [[-w, 0], [-w, 0], [w, 0], [w, 0]],
                       [[0, 0], [0, 0], [0, -h/5], [0, h/5]]];
    
    // Note: this loop uses await, so it must run inside an async function.
    for (let movement = 0; movement < movements.length; movement++){
        for (let step = 0; step < framesPerMovement; step++){
            // Create the new dstPoints (in Computer Vision applications these points will usually come from webcam detections)
            for (let point = 0; point < srcPoints.length; point++){
                dstPoints[point][0] += movements[movement][point][0]/framesPerMovement;
                dstPoints[point][1] += movements[movement][point][1]/framesPerMovement;
            }
            
            // Update the destiny points and calculate the new warping.
            homography.setDestinyPoints(dstPoints);
            const img = homography.warp(); // warp() with no parameters reuses the previously set image
            // Clear the canvas and draw the new image (using putImageData instead of drawImage for performance reasons)
            ctx.clearRect(0, 0, w, h);
            ctx.putImageData(img, Math.min(dstPoints[0][0], dstPoints[2][0]), Math.min(dstPoints[0][1], dstPoints[2][1]));
            await new Promise(resolve => setTimeout(resolve, 0.1)); // Just a trick for forcing the canvas to refresh
        }
    }

Just pay attention to the use of setSourcePoints(srcPoints), setImage(inputImg), setDestinyPoints(dstPoints) and warp(). The rest of the code only generates a coherent sequence of destiny points and draws the results.

With Node.js

    // Import the Homography class and the loadImage function
    import { Homography, loadImage } from 'homography-js';
    // Import the file stream module, just for saving the warped image somewhere
    import fs from 'fs';
    
    // Define the source and destiny points
    const sourcePoints = [[0, 0], [0, 1], [1, 0], [1, 1]];
    const dstPoints = [[1/10, 1/2], [0, 1], [9/10, 1/2], [1, 1]];
    // Create the homography object and set the reference points
    const homography = new Homography();
    homography.setReferencePoints(sourcePoints, dstPoints);
    // In the backend we can use `await loadImage(<img_path>)` instead of an HTMLImageElement
    homography.setImage(await loadImage('./testImg.png'));
    // When warping, we get a PNG image from the 'pngjs2' package instead of an ImageData
    const pngImage = homography.warp();
    // Just for visualizing the result, we write it to a file.
    pngImage.pipe(fs.createWriteStream("transformedImage.png"));
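
Note that the import syntax and the top-level await in this example require an ES module context in Node.js (e.g. a .mjs file or "type": "module" in package.json); otherwise, wrap the code in an async function.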

Performance

Benchmark results for every kind of transformation.
  • The Image Data Warping section indicates the time for calculating the transformation matrix between a pair of Source and Destiny reference points and applying that transform over an image of size NxN. It generates a persistent ImageData object that can be drawn directly on any canvas at negligible computational cost, through context.putImageData(imgData, x, y).
  • 400x400 ↦ NxN indicates the size of the input image and the size of the expected output image. The CSS Transform Calculation section does not include this information, since these sizes do not affect its performance.
  • The First Frame column indicates the time for calculating a single image warping, while the Rest of Frames column indicates the time for calculating each of multiple different warpings on the same input image. Frame Rate (1/Rest of Frames) indicates the number of frames that can be calculated per second (see the timing sketch after this list).
  • You can test the concrete performance of your target device by executing benchmark.html. Take into account that this execution can take some minutes, since it runs 2,000 frames for each image warping experiment and 200,000 for each CSS experiment.
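
As a rough illustration of how the First Frame and Rest of Frames figures can be reproduced with the public API (a minimal sketch; srcPoints, dstPoints and inputImg are assumed to be defined, and benchmark.html remains the reference implementation):

    // Measure "First Frame" vs "Rest of Frames" for one transform (illustrative sketch)
    const homography = new Homography("projective");
    homography.setReferencePoints(srcPoints, dstPoints);
    homography.setImage(inputImg);
    
    const t0 = performance.now();
    homography.warp(); // First frame: includes the one-time setup work
    const firstFrame = performance.now() - t0;
    
    const frames = 2000; // Same per-experiment frame count the benchmark uses
    const t1 = performance.now();
    for (let i = 0; i < frames; i++) homography.warp(); // Rest of frames: reuses cached state
    const restPerFrame = (performance.now() - t1) / frames;
    console.log(`First frame: ${firstFrame.toFixed(1)} ms | Rest: ${restPerFrame.toFixed(3)} ms/frame (~${Math.round(1000 / restPerFrame)} fps)`);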

Performance tests on an Average Desktop PC.

Intel Core i5-7500 Quad-Core. Chrome 92.0.4515.107. Windows 10.

Image Data Warping

400x400 ↦ 200x200

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 5 ms | 0.7 ms | 1,439 fps |
| Projective | 6 ms | 1.9 ms | 527.4 fps |
| Piecewise Aff. (2 Triangles) | 7 ms | 1.1 ms | 892.9 fps |
| Piecewise Aff. (360 Tri.) | 26 ms | 2.1 ms | 487 fps |
| Piecewise Aff. (~23,000 Tri.) | 257 ms | 24.3 ms | 41.2 fps |

400x400 ↦ 400x400

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 14 ms | 2.7 ms | 366.7 fps |
| Projective | 21 ms | 7.2 ms | 139.7 fps |
| Piecewise Aff. (2 Triangles) | 19 ms | 4.4 ms | 227.9 fps |
| Piecewise Aff. (360 Tri.) | 21 ms | 4.6 ms | 216.1 fps |
| Piecewise Aff. (~23,000 Tri.) | 228 ms | 11.5 ms | 87.1 fps |

400x400 ↦ 800x800

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 13 ms | 10.8 ms | 92.6 fps |
| Projective | 30 ms | 27.5 ms | 36.3 fps |
| Piecewise Aff. (2 Triangles) | 40 ms | 16.5 ms | 60.6 fps |
| Piecewise Aff. (360 Tri.) | 41 ms | 22.4 ms | 44.6 fps |
| Piecewise Aff. (~23,000 Tri.) | 289 ms | 62 ms | 16.1 fps |

CSS Transform Calculation

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 4 ms | 0.00014 ms | 1,696,136.44 fps |
| Projective | 4 ms | 0.016 ms | 61,650.38 fps |

Performance tests on a budget smartphone (a somewhat battered one).

Xiaomi Redmi Note 5. Chrome 92.0.4515.115. Android 8.1.0.

Image Data Warping

400x400 ↦ 200x200

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 25 ms | 4.5 ms | 221.5 fps |
| Projective | 38 ms | 15.5 ms | 64.4 fps |
| Piecewise Aff. (2 Triangles) | 35 ms | 8.8 ms | 113.9 fps |
| Piecewise Aff. (360 Tri.) | 151 ms | 14.3 ms | 70 fps |
| Piecewise Aff. (~23,000 Tri.) | 1.16 s | 162 ms | 6.15 fps |

400x400 ↦ 400x400

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 84 ms | 16.9 ms | 59.11 fps |
| Projective | 150 ms | 56.8 ms | 17.6 fps |
| Piecewise Aff. (2 Triangles) | 316 ms | 31.7 ms | 31.6 fps |
| Piecewise Aff. (360 Tri.) | 138 ms | 30.2 ms | 33 fps |
| Piecewise Aff. (~23,000 Tri.) | 1.16 s | 75 ms | 13.3 fps |

400x400 ↦ 800x800

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 127 ms | 64.7 ms | 15.46 fps |
| Projective | 232 ms | 216 ms | 4.62 fps |
| Piecewise Aff. (2 Triangles) | 138 ms | 118 ms | 8.5 fps |
| Piecewise Aff. (360 Tri.) | 274 ms | 149 ms | 6.7 fps |
| Piecewise Aff. (~23,000 Tri.) | 1.47 s | 435 ms | 2.3 fps |

CSS Transform Calculation

| Transform | First Frame | Rest of Frames | Frame Rate |
| --- | --- | --- | --- |
| Affine | 21 ms | 0.0104 ms | 96,200.10 fps |
| Projective | 22 ms | 0.025 ms | 40,536.71 fps |
