PyTorch implementation of the Deformable 3D Convolution Network (D3Dnet). [PDF]
Our code is based on CUDA and can perform deformation along any dimension of a 3D convolution.
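The core operation behind deformable convolution is sampling the input at fractionally offset positions. As a minimal illustration (not the actual CUDA kernel in `code/dcn`), here is a pure-Python sketch of trilinear interpolation, which a deformable 3D convolution performs at every offset kernel position:

```python
import math

def trilinear_sample(vol, t, y, x):
    """Sample a 3D volume (nested lists indexed [t][y][x]) at a fractional
    location (t, y, x) by trilinear interpolation. Out-of-bounds corners
    contribute zero, mirroring typical zero-padding behavior."""
    T, H, W = len(vol), len(vol[0]), len(vol[0][0])
    t0, y0, x0 = math.floor(t), math.floor(y), math.floor(x)
    dt, dy, dx = t - t0, y - y0, x - x0
    acc = 0.0
    # Accumulate the 8 surrounding voxels, each weighted by its distance.
    for it, wt in ((t0, 1 - dt), (t0 + 1, dt)):
        for iy, wy in ((y0, 1 - dy), (y0 + 1, dy)):
            for ix, wx in ((x0, 1 - dx), (x0 + 1, dx)):
                if 0 <= it < T and 0 <= iy < H and 0 <= ix < W:
                    acc += wt * wy * wx * vol[it][iy][ix]
    return acc

# A 2x2x2 volume with vol[t][y][x] = 4t + 2y + x:
vol = [[[0, 1], [2, 3]], [[4, 5], [6, 7]]]
center = trilinear_sample(vol, 0.5, 0.5, 0.5)  # average of all 8 voxels
```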
- Python 3
- PyTorch 1.0.0 with torchvision 0.2.2, or PyTorch 1.2.0 with torchvision 0.4.0
- numpy, PIL
- Visual Studio 2015 (Windows only)
Compile the deformable 3D convolution:
- Cd to `code/dcn`.
- For Windows users, run `make.bat`; for Linux users, run `bash make.sh`. The scripts will build D3D automatically and create some folders.
- We offer customized settings for any dimension (e.g., temporal, height, width) you want to deform. See `code/dcn/test.py` for more details.
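Choosing which dimensions to deform changes how many offset channels the offset-prediction branch must produce. A small bookkeeping sketch (the function and its names are illustrative, not the actual API in `code/dcn`):

```python
def offset_channels(kernel_size, deform_dims, deformable_groups=1):
    """Number of offset channels the offset-prediction conv must output.

    kernel_size: (kT, kH, kW) of the 3D kernel.
    deform_dims: subset of {'T', 'H', 'W'} allowed to deform.
    Each deformable group predicts one displacement per deformed
    dimension for every kernel sampling position.
    """
    kT, kH, kW = kernel_size
    return deformable_groups * len(deform_dims) * kT * kH * kW

# Deforming all three dimensions of a 3x3x3 kernel: 3 * 27 = 81 channels.
full = offset_channels((3, 3, 3), {'T', 'H', 'W'})
# Deforming only the spatial dimensions: 2 * 27 = 54 channels.
spatial = offset_channels((3, 3, 3), {'H', 'W'})
```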
- Download the Vimeo dataset and put the images in `code/data/Vimeo`.
- Cd to `code/data/Vimeo` and run `generate_LR_Vimeo90K.m` to generate the training data as below:
```
Vimeo
├── sequences
│   ├── 00001
│   ├── 00002
│   └── ...
├── LR_x4
│   ├── 00001
│   ├── 00002
│   └── ...
├── readme.txt
├── sep_trainlist.txt
├── sep_testlist.txt
└── generate_LR_Vimeo90K.m
```
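The MATLAB script produces x4 bicubic low-resolution frames from the HR sequences. The same step can be sketched in Python with PIL (already listed in the requirements); note that PIL's bicubic filter is not bit-identical to MATLAB's `imresize`, so frames generated this way will differ slightly from the official data:

```python
from PIL import Image

def make_lr_x4(hr_img):
    """Bicubic x4 downsampling of one HR frame.
    Caveat: PIL's BICUBIC is not bit-identical to MATLAB imresize.
    """
    w, h = hr_img.size
    return hr_img.resize((w // 4, h // 4), Image.BICUBIC)

# Vimeo-90K frames are 448x256, so the LR frames come out 112x64.
hr = Image.new("RGB", (448, 256))
lr = make_lr_x4(hr)
```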
- Download the Vid4 and SPMC-11 datasets from https://pan.baidu.com/s/1PKZeTo8HVklHU5Pe26qUtw (Code: 4l5r) and put the folders in `code/data`.
- (optional) You can also download Vid4, SPMC-11, or other video datasets and prepare the test data in `code/data` as below:
```
data
├── dataset_1
│   ├── scene_1
│   │   ├── hr
│   │   │   ├── hr_01.png
│   │   │   ├── hr_02.png
│   │   │   ├── ...
│   │   │   └── hr_M.png
│   │   └── lr_x4
│   │       ├── lr_01.png
│   │       ├── lr_02.png
│   │       ├── ...
│   │       └── lr_M.png
│   ├── ...
│   └── scene_M
├── ...
└── dataset_N
```
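A layout like the one above can be consumed by a small loader. The following standard-library sketch (the function name is ours, not part of this repository) collects the HR/LR frame pairs for every scene:

```python
from pathlib import Path

def collect_scenes(data_root):
    """Yield (dataset, scene, hr_frames, lr_frames) for every scene laid
    out as data/<dataset>/<scene>/{hr,lr_x4}/*.png, sorted by filename."""
    for dataset in sorted(Path(data_root).iterdir()):
        if not dataset.is_dir():
            continue
        for scene in sorted(dataset.iterdir()):
            hr = sorted((scene / "hr").glob("*.png"))
            lr = sorted((scene / "lr_x4").glob("*.png"))
            if hr and lr:  # skip scenes missing either resolution
                yield dataset.name, scene.name, hr, lr
```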
We have also organized a MATLAB code framework for the video quality assessment metrics T-MOVIE and MOVIE. [Code]
Feel free to have a look and use our code.
A demo video is available at https://wyqdatabase.s3-us-west-1.amazonaws.com/D3Dnet.mp4
@article{D3Dnet,
author = {Ying, Xinyi and Wang, Longguang and Wang, Yingqian and Sheng, Weidong and An, Wei and Guo, Yulan},
title = {Deformable 3D Convolution for Video Super-Resolution},
journal = {IEEE Signal Processing Letters},
volume = {27},
pages = {1500--1504},
year = {2020},
}
This code is built on [DCNv2] and [SOF-VSR]. We thank the authors for sharing their codes.
Please contact us at [email protected] with any questions.