PyTorch code for our ICCV 2023 paper "Dual Aggregation Transformer for Image Super-Resolution"


CodeBase

⚙️ Install

  • Python 3.8
  • PyTorch 1.8.0
  • NVIDIA GPU + CUDA
# Clone the GitHub repo and go to the default directory 'codebase_SR'.
git clone https://github.com/yulunzhang/codebase_SR
cd codebase_SR
conda create -n sr_pytorch python=3.8
conda activate sr_pytorch
pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 torchaudio==0.8.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install -r requirements.txt
python setup.py develop
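
After installation, a quick sanity check (a minimal sketch, not part of the repo) confirms that the expected PyTorch build and the GPU are visible:

    # Quick environment check: verify the PyTorch build and GPU (not part of the repo).
    import torch
    import torchvision

    print('PyTorch:', torch.__version__)             # expected 1.8.0+cu111
    print('torchvision:', torchvision.__version__)   # expected 0.9.0+cu111
    print('CUDA available:', torch.cuda.is_available())
    if torch.cuda.is_available():
        print('GPU:', torch.cuda.get_device_name(0))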

💡 TensorBoard

During or after training, you can view the validation (or loss) curves in TensorBoard.

Here we take the example of training a model on a remote server and viewing TensorBoard locally.

All TensorBoard log files are in tb_logger.

Run the following commands: the ssh command on your local machine (it forwards local port 16006 to the server's port 6006), then TensorBoard on the server.

ssh -p 22 -L 16006:127.0.0.1:6006 username@remote_server_ip
tensorboard --logdir=tb_logger/xxx --port=6006

Then open http:https://localhost:16006/ on your local computer.
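
If port forwarding is inconvenient, the logged scalars can also be read directly from the event files in tb_logger. Below is a minimal sketch using TensorBoard's EventAccumulator; the run folder name and the scalar tag are assumptions, so list the available tags first:

    # Read scalars straight from a tb_logger run without launching the TensorBoard web UI.
    # The run folder name below is an assumption; point it at your own experiment.
    from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

    ea = EventAccumulator('tb_logger/train_FSRCNN_patch48_batch16_x2')
    ea.Reload()
    print(ea.Tags()['scalars'])          # list the available scalar tags first
    # Once you know a tag, print its curve, e.g.:
    # for event in ea.Scalars('l_pix'):  # tag name is an assumption
    #     print(event.step, event.value)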

🔗 Contents

  1. Datasets
  2. Models
  3. Training
  4. Testing

🔎 Datasets

The training and testing sets used in this codebase can be downloaded as follows:

Training Set: DIV2K (800 training images, 100 validation images) + Flickr2K (2650 images) [complete training dataset DF2K]
Testing Set: Set5 + Set14 + BSD100 + Urban100 + Manga109 [complete testing dataset download]

Download the training and testing datasets and put them into the corresponding folders under datasets/. See datasets for details of the directory structure.
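
As a rough sanity check that the datasets landed where the configs expect them, the sketch below searches datasets/ for the folder names above; the names are assumptions, so adjust them to the documented structure:

    # Rough sanity check that the datasets were extracted under datasets/.
    # The folder names are assumptions; see datasets/ for the documented structure.
    from pathlib import Path

    for name in ['DF2K', 'Set5', 'Set14', 'BSD100', 'Urban100', 'Manga109']:
        matches = [p for p in Path('datasets').rglob(name) if p.is_dir()]
        if matches:
            n_images = sum(1 for p in matches[0].rglob('*')
                           if p.suffix.lower() in {'.png', '.jpg', '.bmp'})
            print(f'{name}: found at {matches[0]} ({n_images} image files)')
        else:
            print(f'{name}: not found under datasets/')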

🔎 Models

Method    | Params | FLOPs | Dataset  | PSNR    | SSIM   | Model Zoo
FSRCNN-x2 | 22.04K | 5.01G | Urban100 | 27.9280 | 0.8692 | train_FSRCNN_patch48_batch16_x2
FSRCNN-x3 | 22.04K | 2.22G | Urban100 | 25.0193 | 0.7622 | train_FSRCNN_patch48_batch16_x3
FSRCNN-x4 | 22.04K | 1.25G | Urban100 | 23.6120 | 0.6826 | train_FSRCNN_patch48_batch16_x4

Performance is reported on Urban100; FLOPs are calculated for an output size of 3×1280×720.

We also provide the testing results (log files) and the tb_logger files in the model zoo folder.
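
To double-check the parameter counts in the table, the weights in a downloaded checkpoint can be summed directly. A minimal sketch, assuming a BasicSR-style checkpoint that stores the weights under a 'params' key; the path is a placeholder:

    # Count the parameters of a downloaded checkpoint (the path is a placeholder).
    import torch

    ckpt = torch.load('path/to/net_g_latest.pth', map_location='cpu')
    state = ckpt.get('params', ckpt) if isinstance(ckpt, dict) else ckpt  # 'params' key is an assumption
    n_params = sum(v.numel() for v in state.values())
    print(f'{n_params / 1e3:.2f}K parameters')   # should be close to 22.04K for FSRCNN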

🔧 Training

  • Download the training (DF2K, already processed) and testing (Set5, Set14, BSD100, Urban100, Manga109, already processed) datasets and place them in datasets/.

  • Run the following scripts. The training configurations are in options/Train/ (e.g., train_FSRCNN_x2.yml).

    # FSRCNN, x2, input=48x48, 1 GPU
    python basicsr/train.py -opt options/Train/train_FSRCNN_x2.yml
    
    # FSRCNN, x3, input=48x48, 1 GPU
    python basicsr/train.py -opt options/Train/train_FSRCNN_x3.yml
    
    # FSRCNN, x4, input=48x48, 1 GPU
    python basicsr/train.py -opt options/Train/train_FSRCNN_x4.yml
  • The training experiment is saved in experiments/. A sketch of the LR/HR patch geometry referred to by input=48x48 follows below.
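
The input=48x48 comments above refer to the LR patch size: at scale x2, each 48×48 LR patch is paired with the corresponding 96×96 HR patch. A minimal sketch of that crop geometry, as an illustration only (this is not the repo's dataloader):

    # Illustration of LR/HR patch pairing during training (not the repo's dataloader).
    import numpy as np

    scale, patch = 2, 48
    lr_img = np.random.rand(64, 64, 3)                  # stand-in LR image
    hr_img = np.random.rand(64 * scale, 64 * scale, 3)  # corresponding HR image

    top, left = 10, 5                                   # crop position in LR coordinates
    lr_patch = lr_img[top:top + patch, left:left + patch]
    hr_patch = hr_img[top * scale:(top + patch) * scale,
                      left * scale:(left + patch) * scale]
    print(lr_patch.shape, hr_patch.shape)               # (48, 48, 3) (96, 96, 3)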

⚒️ Testing

🔨 Test images with HR

  • Run the following scripts. The testing configurations are in options/Test/ (e.g., test_FSRCNN_x4.yml).

    # FSRCNN, x2
    python basicsr/test.py -opt options/Test/test_FSRCNN_x2.yml
    
    # FSRCNN, x3
    python basicsr/test.py -opt options/Test/test_FSRCNN_x3.yml
    
    # FSRCNN, x4
    python basicsr/test.py -opt options/Test/test_FSRCNN_x4.yml
  • The output is saved in results/. A sketch for checking PSNR against the HR ground truth by hand follows below.
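
Beyond the numbers logged by the test script, PSNR between an SR output and its HR ground truth can be checked by hand. A minimal sketch on RGB images using Pillow (an assumption); it applies no border cropping or Y-channel conversion, so the value may differ slightly from the reported one:

    # Hand-computed PSNR between an SR result and its HR ground truth (paths are placeholders).
    # No border crop or Y-channel conversion is applied, so the value may differ slightly
    # from the framework's reported numbers.
    import numpy as np
    from PIL import Image

    sr = np.asarray(Image.open('results/example_SR.png'), dtype=np.float64)
    hr = np.asarray(Image.open('datasets/example_HR.png'), dtype=np.float64)
    mse = np.mean((sr - hr) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    print(f'PSNR: {psnr:.2f} dB')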

⛏️ Test images without HR

  • Put your own dataset (single LR images) in datasets/single. Some example test images are already provided in this folder.

  • Run the following scripts. The testing configurations are in options/Test/ (e.g., test_single_x4.yml).

    # Test on your dataset, x2
    python basicsr/test.py -opt options/Test/test_single_x2.yml
    
    # Test on your dataset, x3
    python basicsr/test.py -opt options/Test/test_single_x3.yml
    
    # Test on your dataset, x4
    python basicsr/test.py -opt options/Test/test_single_x4.yml
  • The output is saved in results/. A sketch for generating LR inputs from your own HR images follows below.
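
If you only have HR images, LR inputs for datasets/single can be produced by bicubic downsampling. A minimal sketch using Pillow (an assumption; the repo's own degradation pipeline may differ), with a hypothetical input folder my_hr_images/:

    # Create x4 bicubic LR inputs from your own HR images and drop them into datasets/single.
    # Pillow and the my_hr_images/ folder are assumptions for this sketch.
    from pathlib import Path
    from PIL import Image

    scale = 4
    out_dir = Path('datasets/single')
    out_dir.mkdir(parents=True, exist_ok=True)
    for hr_path in Path('my_hr_images').glob('*.png'):
        hr = Image.open(hr_path)
        lr = hr.resize((hr.width // scale, hr.height // scale), Image.BICUBIC)
        lr.save(out_dir / hr_path.name)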
