torchsr

Super Resolution Networks for pytorch

  • 1.0.4
  • PyPI

Super-Resolution Networks for Pytorch

Super-resolution is a process that increases the resolution of an image, adding additional details. Methods using neural networks give the most accurate results, much better than other interpolation methods. With the right training, it is even possible to make photo-realistic images.

For example, here is a low-resolution image, magnified x4 by a neural network, and a high resolution image of the same object:

[Figure: the pixelated low-resolution image, the smooth x4 magnification produced by the network, and the original high-resolution image.]
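To reproduce this kind of comparison yourself, here is a minimal sketch. It reuses the butterfly test image referenced later in this README; the choice of NinaSR-B1 and the x4 scale are illustrative, not prescribed by the package.

import torch
from PIL import Image
from torchvision.transforms.functional import to_pil_image, to_tensor

from torchsr.models import ninasr_b1

# Low-resolution input (the butterfly test image shipped with the repository)
lr = Image.open("test/butterfly.png").convert("RGB")

# Classical interpolation baseline: bicubic upscaling by a factor of 4
bicubic = lr.resize((lr.width * 4, lr.height * 4), Image.BICUBIC)

# Neural-network super-resolution with a pretrained model
model = ninasr_b1(scale=4, pretrained=True).eval()
with torch.no_grad():
    sr_t = model(to_tensor(lr).unsqueeze(0)).clamp(0, 1)
sr = to_pil_image(sr_t.squeeze(0))

bicubic.show()
sr.show()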

In this repository, you will find:

  • the popular super-resolution networks, pretrained
  • common super-resolution datasets
  • a unified training script for all models

Models

The following pretrained models are available: CARN (and its mobile variant CARN-M), EDSR (baseline and full), NinaSR (B0, B1 and B2), RCAN and RDN. Links to the corresponding papers can be found in the project repository.

Newer and larger models perform better: the most accurate models are EDSR (huge), RCAN and NinaSR-B2. For practical applications, I recommend a smaller model, such as NinaSR-B1.
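The parameter counts reported in the tables below can be checked directly from the model definitions. A minimal sketch (the two architectures chosen here are just examples, and pretrained=False skips the weight download):

from torchsr.models import edsr, ninasr_b1

# Compare the size of a large and a small model without downloading weights
for name, arch in [("edsr", edsr), ("ninasr_b1", ninasr_b1)]:
    model = arch(scale=2, pretrained=False)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.2f}M parameters")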

Benchmark results
Set5 results
| Network | Parameters (M) | 2x (PSNR/SSIM) | 3x (PSNR/SSIM) | 4x (PSNR/SSIM) |
|---|---|---|---|---|
| carn | 1.59 | 37.88 / 0.9600 | 34.32 / 0.9265 | 32.14 / 0.8942 |
| carn_m | 0.41 | 37.68 / 0.9594 | 34.06 / 0.9247 | 31.88 / 0.8907 |
| edsr_baseline | 1.37 | 37.98 / 0.9604 | 34.37 / 0.9270 | 32.09 / 0.8936 |
| edsr | 40.7 | 38.19 / 0.9609 | 34.68 / 0.9293 | 32.48 / 0.8985 |
| ninasr_b0 | 0.10 | 37.72 / 0.9594 | 33.96 / 0.9234 | 31.77 / 0.8877 |
| ninasr_b1 | 1.02 | 38.14 / 0.9609 | 34.48 / 0.9277 | 32.28 / 0.8955 |
| ninasr_b2 | 10.0 | 38.21 / 0.9612 | 34.61 / 0.9288 | 32.45 / 0.8973 |
| rcan | 15.4 | 38.27 / 0.9614 | 34.76 / 0.9299 | 32.64 / 0.9000 |
| rdn | 22.1 | 38.12 / 0.9609 | 33.98 / 0.9234 | 32.35 / 0.8968 |
Set14 results
| Network | Parameters (M) | 2x (PSNR/SSIM) | 3x (PSNR/SSIM) | 4x (PSNR/SSIM) |
|---|---|---|---|---|
| carn | 1.59 | 33.57 / 0.9173 | 30.30 / 0.8412 | 28.61 / 0.7806 |
| carn_m | 0.41 | 33.30 / 0.9151 | 30.10 / 0.8374 | 28.42 / 0.7764 |
| edsr_baseline | 1.37 | 33.57 / 0.9174 | 30.28 / 0.8414 | 28.58 / 0.7804 |
| edsr | 40.7 | 33.95 / 0.9201 | 30.53 / 0.8464 | 28.81 / 0.7872 |
| ninasr_b0 | 0.10 | 33.24 / 0.9144 | 30.02 / 0.8355 | 28.28 / 0.7727 |
| ninasr_b1 | 1.02 | 33.71 / 0.9189 | 30.41 / 0.8437 | 28.71 / 0.7840 |
| ninasr_b2 | 10.0 | 34.00 / 0.9206 | 30.53 / 0.8461 | 28.80 / 0.7863 |
| rcan | 15.4 | 34.13 / 0.9216 | 30.63 / 0.8475 | 28.85 / 0.7878 |
| rdn | 22.1 | 33.71 / 0.9182 | 30.07 / 0.8373 | 28.72 / 0.7846 |
DIV2K results (validation set)
| Network | Parameters (M) | 2x (PSNR/SSIM) | 3x (PSNR/SSIM) | 4x (PSNR/SSIM) | 8x (PSNR/SSIM) |
|---|---|---|---|---|---|
| carn | 1.59 | 36.08 / 0.9451 | 32.37 / 0.8871 | 30.43 / 0.8366 | N/A |
| carn_m | 0.41 | 35.76 / 0.9429 | 32.09 / 0.8827 | 30.18 / 0.8313 | N/A |
| edsr_baseline | 1.37 | 36.13 / 0.9455 | 32.41 / 0.8878 | 30.43 / 0.8370 | N/A |
| edsr | 40.7 | 36.56 / 0.9485 | 32.75 / 0.8933 | 30.73 / 0.8445 | N/A |
| ninasr_b0 | 0.10 | 35.77 / 0.9428 | 32.06 / 0.8818 | 30.09 / 0.8293 | 26.60 / 0.7084 |
| ninasr_b1 | 1.02 | 36.35 / 0.9471 | 32.51 / 0.8892 | 30.56 / 0.8405 | 26.96 / 0.7207 |
| ninasr_b2 | 10.0 | 36.52 / 0.9482 | 32.73 / 0.8926 | 30.73 / 0.8437 | 27.07 / 0.7246 |
| rcan | 15.4 | 36.61 / 0.9489 | 32.78 / 0.8935 | 30.73 / 0.8447 | 27.17 / 0.7292 |
| rdn | 22.1 | 36.32 / 0.9468 | 32.04 / 0.8822 | 30.61 / 0.8414 | N/A |
B100 results
| Network | Parameters (M) | 2x (PSNR/SSIM) | 3x (PSNR/SSIM) | 4x (PSNR/SSIM) |
|---|---|---|---|---|
| carn | 1.59 | 32.12 / 0.8986 | 29.07 / 0.8042 | 27.58 / 0.7355 |
| carn_m | 0.41 | 31.97 / 0.8971 | 28.94 / 0.8010 | 27.45 / 0.7312 |
| edsr_baseline | 1.37 | 32.15 / 0.8993 | 29.08 / 0.8051 | 27.56 / 0.7354 |
| edsr | 40.7 | 32.35 / 0.9019 | 29.26 / 0.8096 | 27.72 / 0.7419 |
| ninasr_b0 | 0.10 | 31.97 / 0.8974 | 28.90 / 0.8000 | 27.36 / 0.7290 |
| ninasr_b1 | 1.02 | 32.24 / 0.9004 | 29.13 / 0.8061 | 27.62 / 0.7377 |
| ninasr_b2 | 10.0 | 32.32 / 0.9014 | 29.23 / 0.8087 | 27.71 / 0.7407 |
| rcan | 15.4 | 32.39 / 0.9024 | 29.30 / 0.8106 | 27.74 / 0.7429 |
| rdn | 22.1 | 32.25 / 0.9006 | 28.90 / 0.8004 | 27.66 / 0.7388 |
Urban100 results
| Network | Parameters (M) | 2x (PSNR/SSIM) | 3x (PSNR/SSIM) | 4x (PSNR/SSIM) |
|---|---|---|---|---|
| carn | 1.59 | 31.95 / 0.9263 | 28.07 / 0.849 | 26.07 / 0.78349 |
| carn_m | 0.41 | 31.30 / 0.9200 | 27.57 / 0.839 | 25.64 / 0.76961 |
| edsr_baseline | 1.37 | 31.98 / 0.9271 | 28.15 / 0.852 | 26.03 / 0.78424 |
| edsr | 40.7 | 32.97 / 0.9358 | 28.81 / 0.865 | 26.65 / 0.80328 |
| ninasr_b0 | 0.10 | 31.33 / 0.9204 | 27.48 / 0.8374 | 25.45 / 0.7645 |
| ninasr_b1 | 1.02 | 32.48 / 0.9319 | 28.29 / 0.8555 | 26.25 / 0.7914 |
| ninasr_b2 | 10.0 | 32.91 / 0.9354 | 28.70 / 0.8640 | 26.54 / 0.8008 |
| rcan | 15.4 | 33.19 / 0.9372 | 29.01 / 0.868 | 26.75 / 0.80624 |
| rdn | 22.1 | 32.41 / 0.9310 | 27.49 / 0.838 | 26.36 / 0.79460 |

All models are defined in torchsr.models. Other useful tools to augment your models, such as self-ensemble methods and tiling, are present in torchsr.models.utils.

Datasets

Several standard super-resolution datasets are available, including DIV2K and the benchmark sets used above (Set5, Set14, B100 and Urban100). Links to the corresponding project pages can be found in the project repository.

All datasets are defined in torchsr.datasets. They return a list of images, with the high-resolution image followed by downscaled or degraded versions. Data augmentation methods are provided in torchsr.transforms.

Datasets are downloaded automatically when the download=True flag is passed, or by running the corresponding script, e.g. ./scripts/download_div2k.sh.
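As a minimal sketch of both points (the root directory and scale are illustrative, and the items are PIL images as in the Usage section below):

from torchsr.datasets import Div2K

# download=True fetches the data on first use; later runs reuse ./data
dataset = Div2K(root="./data", scale=2, download=True)

# Each item is the high-resolution image followed by its downscaled version
hr, lr = dataset[0]
print(len(dataset), hr.size, lr.size)  # PIL .size is (width, height)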

Usage

from torchsr.datasets import Div2K
from torchsr.models import ninasr_b0
from torchvision.transforms.functional import to_pil_image, to_tensor

# Div2K dataset
dataset = Div2K(root="./data", scale=2, download=False)

# Get the first image in the dataset (High-Res and Low-Res)
hr, lr = dataset[0]

# Download a pretrained NinaSR model
model = ninasr_b0(scale=2, pretrained=True)

# Run the Super-Resolution model
lr_t = to_tensor(lr).unsqueeze(0)
sr_t = model(lr_t)
sr = to_pil_image(sr_t.squeeze(0))
sr.show()
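For inference only, it is usually worth putting the model in evaluation mode and disabling gradient tracking; an optional variant of the last step (standard PyTorch, not specific to torchsr):

import torch

model.eval()
with torch.no_grad():                # no gradients are needed for inference
    sr_t = model(lr_t).clamp(0, 1)   # keep values in the valid image range
sr = to_pil_image(sr_t.squeeze(0))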
More examples
from torchsr.datasets import Div2K
from torchsr.models import edsr, rcan
from torchsr.models.utils import ChoppedModel, SelfEnsembleModel
from torchsr.transforms import ColorJitter, Compose, RandomCrop

# Div2K dataset, cropped to 256px, with color jitter
dataset = Div2K(
    root="./data", scale=2, download=False,
    transform=Compose([
        RandomCrop(256, scales=[1, 2]),
        ColorJitter(brightness=0.2)
    ]))

# Pretrained RCAN model, with tiling for large images
model = ChoppedModel(
    rcan(scale=2, pretrained=True), scale=2,
    chop_size=400, chop_overlap=10)

# Pretrained EDSR model, with self-ensemble method for higher quality
model = SelfEnsembleModel(edsr(scale=2, pretrained=True))
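Both wrappers are used in place of the network they wrap, so the inference code from the Usage section applies unchanged; a short sketch under that assumption:

import torch
from torchvision.transforms.functional import to_pil_image, to_tensor

# The wrapped model is called exactly like the underlying network
hr, lr = dataset[0]
with torch.no_grad():
    sr_t = model(to_tensor(lr).unsqueeze(0)).clamp(0, 1)
to_pil_image(sr_t.squeeze(0)).show()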

Training

A script is available to train the models from scratch, evaluate them, and much more. It is not part of the pip package, and requires additional dependencies. More examples are available in scripts/.

pip install piq tqdm tensorboard  # Additional dependencies
python -m torchsr.train -h
python -m torchsr.train --arch edsr_baseline --scale 2 --download-pretrained --images test/butterfly.png --destination results/
python -m torchsr.train --arch edsr_baseline --scale 2 --download-pretrained --validation-only
python -m torchsr.train --arch edsr_baseline --scale 2 --epochs 300 --loss l1 --dataset-train div2k_bicubic

You can evaluate models from the command line as well. For example, for EDSR with the paper's PSNR evaluation:

python -m torchsr.train --validation-only --arch edsr_baseline --scale 2 --dataset-val set5 --chop-size 400 --download-pretrained --shave-border 2 --eval-luminance

Acknowledgements

Thanks to the people behind torchvision and EDSR, whose work inspired this repository. Some of the models available here come from EDSR-PyTorch and CARN-PyTorch.

To cite this work, please use:

@misc{torchsr,
  author = {Gabriel Gouvine},
  title = {Super-Resolution Networks for Pytorch},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Coloquinte/torchSR}},
  doi = {10.5281/zenodo.4868308}
}

@misc{ninasr,
  author = {Gabriel Gouvine},
  title = {NinaSR: Efficient Small and Large ConvNets for Super-Resolution},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Coloquinte/torchSR/blob/main/doc/NinaSR.md}},
  doi = {10.5281/zenodo.4868308}
}
