Library for Minimal Modern Image Super-Resolution in PyTorch
PyTorch Enhance provides a consolidated package of popular Image Super-Resolution models, datasets, and metrics for quick and painless benchmarking or for adding pretrained models to your application.
https://pytorch-enhance.readthedocs.io
pip install torch-enhance
git clone https://github.com/IsaacCorley/pytorch-enhance.git
cd pytorch-enhance
python setup.py install
The following models are currently implemented:
- SRCNN from Dong et al. Image Super-Resolution Using Deep Convolutional Networks
- VDSR from Kim et al. Accurate Image Super-Resolution Using Very Deep Convolutional Networks
- ESPCN from Shi et al. Real-Time Single Image and Video Super-Resolution Using an Efficient Sub-Pixel Convolutional Neural Network
- SRResNet from Ledig et al. Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network
- EDSR from Lim et al. Enhanced Deep Residual Networks for Single Image Super-Resolution
import torch
import torch_enhance
# increase resolution by factor of 2 (e.g. 128x128 -> 256x256)
model = torch_enhance.models.SRResNet(scale_factor=2, channels=3)
lr = torch.randn(1, 3, 128, 128)  # low-resolution input
sr = model(lr)  # super-resolved output: [1, 3, 256, 256]
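For inference only, the usual PyTorch pattern of switching the model to evaluation mode and disabling gradient tracking applies:

model.eval()
with torch.no_grad():
    sr = model(lr)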
Not sure which models are currently the best? Check out the PapersWithCode Image Super-Resolution Leaderboards
The following benchmark datasets are available for use:
| BSDS300 | BSDS500 | T91 |
|---|---|---|
| Set5 | Set14 | Historical |
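The sketch below shows one way a benchmark might be loaded. It assumes the datasets module exposes one class per benchmark (e.g. BSDS300) accepting a scale_factor argument and yielding (low-resolution, high-resolution) tensor pairs; consult the documentation for the exact signatures.

from torch.utils.data import DataLoader
import torch_enhance

# assumption: each benchmark is exposed as a torch Dataset taking a scale_factor argument
dataset = torch_enhance.datasets.BSDS300(scale_factor=2)

# assumption: each item is a (low-resolution, high-resolution) tensor pair
dataloader = DataLoader(dataset, batch_size=16, shuffle=True)
lr, hr = next(iter(dataloader))
print(lr.shape, hr.shape)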
The following losses and metrics are currently available:
- Perceptual Loss (VGG16)
- Mean Squared Error (MSE)
- Mean Absolute Error (MAE)
- Peak Signal-to-Noise Ratio (PSNR)
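The pixel-wise metrics reduce to a few lines of plain PyTorch if you want to compute them directly rather than through the library. The snippet below is a minimal sketch of MSE, MAE, and PSNR for images scaled to [0, 1]; the function names and the 1.0 peak value are illustrative, not part of the torch-enhance API.

import torch

def mse(sr: torch.Tensor, hr: torch.Tensor) -> torch.Tensor:
    # mean squared error between super-resolved and ground-truth images
    return torch.mean((sr - hr) ** 2)

def mae(sr: torch.Tensor, hr: torch.Tensor) -> torch.Tensor:
    # mean absolute error
    return torch.mean(torch.abs(sr - hr))

def psnr(sr: torch.Tensor, hr: torch.Tensor, max_val: float = 1.0) -> torch.Tensor:
    # PSNR = 10 * log10(MAX^2 / MSE); max_val = 1.0 assumes images scaled to [0, 1]
    return 10.0 * torch.log10(max_val ** 2 / mse(sr, hr))

sr = torch.rand(1, 3, 256, 256)  # placeholder super-resolved output
hr = torch.rand(1, 3, 256, 256)  # placeholder ground-truth image
print(psnr(sr, hr))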
$ cd examples
- Get up and benchmarking quickly with PyTorch Lightning (see the training sketch after this list)
- Coming from Keras? Try our example using the Poutyne library
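For reference, below is a minimal PyTorch Lightning sketch of a training module wrapping one of the models above. It assumes SRCNN shares the (scale_factor, channels) constructor shown earlier and that batches arrive as (low-resolution, high-resolution) pairs; treat the bundled example scripts as the authoritative version.

import torch
import torch.nn.functional as F
import pytorch_lightning as pl
import torch_enhance

class LitSuperResolution(pl.LightningModule):
    # illustrative Lightning wrapper around a torch-enhance model
    def __init__(self, scale_factor: int = 2):
        super().__init__()
        # assumption: SRCNN takes the same (scale_factor, channels) arguments as SRResNet above
        self.model = torch_enhance.models.SRCNN(scale_factor=scale_factor, channels=3)

    def forward(self, lr):
        return self.model(lr)

    def training_step(self, batch, batch_idx):
        lr, hr = batch  # assumption: the dataloader yields (low-res, high-res) pairs
        sr = self(lr)
        loss = F.mse_loss(sr, hr)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

Passing an instance of this module and a dataloader to pl.Trainer().fit() would then drive training in the usual Lightning way.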
$ pytest -ra
Please cite this repository if you use this code in your own work:
@software{isaac_corley_2020_3739368,
author = {Isaac Corley},
title = {PyTorch Enhance},
month = apr,
year = 2020,
publisher = {Zenodo},
version = {0.1.2},
doi = {10.5281/zenodo.3739368},
url = {https://doi.org/10.5281/zenodo.3739368}
}