DARTS

This repository contains the implementation of the following paper:

DARTS: Double Attention Reference-based Transformer for Super-resolution
Masoomeh Aslahishahri, Jordan Ubbens, Ian Stavness

[Paper]

Overview

(Figure: overall structure of DARTS)

Dependencies and Installation

  1. Clone Repo

    git clone https://github.com/bia006/DARTS.git
  2. Create Conda Environment

    conda create --name DARTS python=3.8
    conda activate DARTS
  3. Install Dependencies

    cd DARTS
    pip install -r requirements.txt

Dataset Preparation

Please refer to Datasets.md for pre-processing and more details.
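
For orientation only: reference-based SR codebases in this family typically train on CUFED and test on CUFED5. The layout below is an assumption (folder names included); Datasets.md is authoritative.

    datasets/
      CUFED/          # training set: paired input and reference images (assumed layout)
        input/
        ref/
      CUFED5/         # test set: one input plus five references per group (assumed layout)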

Get Started

Pretrained Models

Download the pretrained models from this link and put them under the mmsr/checkpoints folder.

Test

We provide quick test code that uses the pretrained model.

  1. Modify the dataset and pretrained-model paths in the following yaml file:

    ./options/test/test_DARTS.yml
  2. Run the test (see the note on the run command below) and check the results in ./results.
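
The run command itself is not listed above. Assuming the test entry point mirrors the training script (mmsr/train.py, shown in the Train section), the invocation would plausibly be:

    # assumed test entry point; confirm the script name in the repo
    python mmsr/test.py -opt "options/test/test_DARTS.yml"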

Train

All files logged during training, e.g., log messages, checkpoints, and snapshots, will be saved to the ./mmsr/checkpoints and ./tb_logger directories.

  1. Modify the dataset paths in the following yaml file (a sketch of the relevant fields follows this list):

    ./options/train/train_DARTS.yml
  2. Train the transformer network.

    python mmsr/train.py -opt "options/train/train_DARTS.yml"
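
As a sketch of what step 1 involves: mmsr-style configs keep dataset locations under a datasets: key. The field and dataset names below are assumptions based on similar RefSR codebases; check train_DARTS.yml for the actual keys.

    # hypothetical excerpt of options/train/train_DARTS.yml
    datasets:
      train:
        name: CUFED
        dataroot_in: ./datasets/CUFED/input    # path to input images (assumed key)
        dataroot_ref: ./datasets/CUFED/ref     # path to reference images (assumed key)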

Visual Results

For more results on the benchmarks, you can directly download our DARTS results from here.

(Figure: visual results on the benchmarks)

Citation

If you find our repo useful for your research, please cite our paper.
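
No BibTeX entry is given in the repo; an entry along these lines should work, built from the title and authors above (the year and identifier are my reading of the arXiv listing, so verify them against the paper page):

    @article{aslahishahri2023darts,
      title   = {DARTS: Double Attention Reference-based Transformer for Super-resolution},
      author  = {Aslahishahri, Masoomeh and Ubbens, Jordan and Stavness, Ian},
      journal = {arXiv preprint arXiv:2307.08837},  % verify identifier
      year    = {2023}                              % verify year
    }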

License and Acknowledgement

This project is open-sourced under the MIT license. The code framework is mainly modified from StyleSwin. Please refer to the original repo for more usage details and documentation.

Contact

If you have any questions, please feel free to contact us via [email protected].
