psnr_hvsm: PSNR-HVS, PSNR-HVS-M, PSNR-HA and PSNR-HMA metrics for NumPy and PyTorch


Accelerated Python package for computing several image metrics based on human perception with backends in NumPy, PyTorch and C++.

This is an implementation of the PSNR-HVS, PSNR-HVS-M, PSNR-HA and PSNR-HMA metrics developed by Nikolay Ponomarenko.

The values produced by this library have been cross-checked against the results in the TID2013 dataset (see the folder tid2013_results). The only difference is that this library follows the common convention that PSNR for identical signals equals 100.0.

A minuscule discrepancy for PSNR-HMA (<0.01 dB on average) is under investigation.
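
As a quick sanity check of that convention, calling psnr_hvs_hvsm with the same normalised, 8-aligned single-channel array twice should report 100.0 for both metrics. A minimal sketch (the 64x64 random image is an arbitrary choice):

import numpy as np
from psnr_hvsm import psnr_hvs_hvsm

# Any normalised single-channel image whose sides are multiples of 8 will do.
x = np.random.default_rng(0).random((64, 64))

psnr_hvs, psnr_hvsm = psnr_hvs_hvsm(x, x)
print(psnr_hvs, psnr_hvsm)  # identical inputs: expected 100.0 100.0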


Installation

psnr_hvsm supports Python 3.7-3.12. Packages are distributed on PyPI. Make sure your pip is up to date so that the correct packages are installed on Linux:

python -m pip install --upgrade pip
pip install psnr_hvsm

With PyTorch support:

pip install psnr_hvsm[torch]

Usage

Command line

Command-line support is an optional extra that pulls in imageio:

pip install psnr_hvsm[command_line]
python -m psnr_hvsm original.png distorted.png

Choosing a backend

The backend is selected with the PSNR_HVSM_BACKEND environment variable. Valid backends are:

  • numpy - pure NumPy using scipy.fft for DCT
  • cpp - C++ using FFTW3
  • torch - PyTorch; install as psnr_hvsm[torch] to install PyTorch as well

export PSNR_HVSM_BACKEND=torch
python -m psnr_hvsm original.png distorted.png

The default device for PyTorch is cuda, but it can be changed by setting the PSNR_HVSM_TORCH_DEVICE environment variable.
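
Both variables can also be set from Python before psnr_hvsm is imported, mirroring the os.environ snippet further below; the cpu device here is only an illustration:

import os

# Select the backend and, for PyTorch, the device before importing psnr_hvsm.
os.environ['PSNR_HVSM_BACKEND'] = 'torch'
os.environ['PSNR_HVSM_TORCH_DEVICE'] = 'cpu'  # default is cuda

from psnr_hvsm import psnr_hvs_hvsm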

As a library

The function psnr_hvs_hvsm accepts images as single-channel floating-point NumPy arrays. The images need to be normalised, i.e. the values need to be in the range [0,1]. This can be achieved by converting the image to float and dividing by the maximum value given the bit depth. For 8 bits per component this is 255.

The images must be padded to a multiple of 8 in each dimension.
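
Since padding is left to the caller, a small helper along these lines can be used; pad_to_multiple_of_8 is a hypothetical name and edge replication is just one reasonable choice of padding mode:

import numpy as np

def pad_to_multiple_of_8(image):
    # Pad a (H, W) image on the bottom/right so both sides become multiples of 8.
    pad_h = -image.shape[0] % 8
    pad_w = -image.shape[1] % 8
    # Edge replication is an arbitrary choice; any padding mode could be used here.
    return np.pad(image, ((0, pad_h), (0, pad_w)), mode='edge')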

from imageio import imread
from psnr_hvsm import psnr_hvs_hvsm, bt601ycbcr

image1 = imread('tests/baboon.png').astype(float) / 255
image2 = imread('tests/baboon_msk.png').astype(float) / 255

image1_y, *_ = bt601ycbcr(image1)
image2_y, *_ = bt601ycbcr(image2)

psnr_hvs, psnr_hvsm = psnr_hvs_hvsm(image1_y, image2_y)

print(psnr_hvs, psnr_hvsm)
34.427054505764424 51.64722121999962

If you need to measure PSNR-HVS and PSNR-HVS-M on an RGB image, convert it to a YUV colorspace and pass in only the luma component.

PyTorch support

The PyTorch backend supports gradient descent algorithms and computation on GPUs. To use it, either install the package with the torch extra:

pip install psnr_hvsm[torch]

or, if you installed PyTorch manually, install torch-dct, which the PyTorch backend requires:

pip install "torch-dct>=0.1.6"

An important distinction is that, in the PyTorch implementation, functions that take 3-channel input expect (...,C,H,W) format. The PyTorch backend can be enabled by importing it directly from psnr_hvsm.torch:

import torch
from imageio import imread
from psnr_hvsm.torch import psnr_hvs_hvsm, bt601ycbcr

image1 = imread('tests/baboon.png').astype(float) / 255
image2 = imread('tests/baboon_msk.png').astype(float) / 255

image1 = torch.tensor(image1, device='cuda').moveaxis(-1, -3)  # convert to (...,C,H,W) format
image2 = torch.tensor(image2, device='cuda').moveaxis(-1, -3)  # convert to (...,C,H,W) format

image1_y, *_ = bt601ycbcr(image1)
image2_y, *_ = bt601ycbcr(image2)

psnr_hvs, psnr_hvsm = psnr_hvs_hvsm(image1_y, image2_y)

Alternatively, set the PSNR_HVSM_BACKEND environment variable to torch:

import os
os.environ['PSNR_HVSM_BACKEND'] = 'torch'

from psnr_hvsm import psnr_hvs_hvsm
# rest of code
# ...

Note on gradients

Some operations in the calculation of the HVS-M MSE cause problems with the gradient, so a masking_epsilon parameter (defaulting to 0.0) has been added to the PyTorch versions of hvs_hvsm_mse_tiles, hvs_hvsm_mse and psnr_hvs_hvsm. Set it to a small value (determined by your own experimentation) if you need the HVS-M result during gradient descent.
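
A minimal sketch of using the metric as a differentiable objective; the shapes, the random data and the masking_epsilon value of 1e-6 are arbitrary illustration choices, not recommendations:

import torch
from psnr_hvsm.torch import psnr_hvs_hvsm

# Padded, normalised single-channel images whose sides are multiples of 8.
reference = torch.rand(64, 64)
distorted = torch.rand(64, 64, requires_grad=True)

# A non-zero masking_epsilon keeps the HVS-M part usable with autograd.
psnr_hvs, psnr_hvsm = psnr_hvs_hvsm(reference, distorted, masking_epsilon=1e-6)

loss = -psnr_hvsm  # improve perceptual quality by minimising the negative PSNR-HVS-M
loss.backward()
print(distorted.grad.shape)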

Computing metrics for the TID2013 dataset

If you have a copy of the TID2013 dataset, you can re-verify the metrics for yourself:

python -m psnr_hvsm.tid2013_metrics D:\tid2013\ .\tid2013_results\

Other exported functions

  • hvs_hvsm_mse_tiles - compute HVS and HVS-M scores for every 8x8 tile in the images; returns an array of per-tile values
  • hvs_hvsm_mse - compute the average HVS and HVS-M scores (usage sketch below)
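
A brief sketch of calling them, assuming the same input expectations as psnr_hvs_hvsm (padded, normalised, single-channel); the random test images and the way the results are printed are illustrative only:

import numpy as np
from psnr_hvsm import hvs_hvsm_mse, hvs_hvsm_mse_tiles

# Arbitrary normalised single-channel images with sides that are multiples of 8.
rng = np.random.default_rng(0)
image1_y = rng.random((64, 64))
image2_y = np.clip(image1_y + 0.05 * rng.standard_normal((64, 64)), 0.0, 1.0)

per_tile = hvs_hvsm_mse_tiles(image1_y, image2_y)  # HVS / HVS-M scores for each 8x8 tile
average = hvs_hvsm_mse(image1_y, image2_y)         # average HVS / HVS-M scores
print(per_tile, average)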

Building

Dependencies

psnr_hvsm has several build dependencies. FFTW3 is automatically resolved by CMake; the rest can be installed by creating a conda environment using the provided YAML file:

conda env create -f psnr_hvsm-dev.yml

Development mode

To install in development mode:

pip install --upgrade -r requirements.txt

Creating a Python wheel

pip install --upgrade -r requirements-build.txt
python setup.py bdist_wheel

Running tests on different versions of Python using tox

pip install --upgrade -r requirements-tox.txt
tox --parallel auto
