distill_diffusion

Personal PyTorch implementation of the 2023 CVPR Award Candidate paper: On Distillation of Guided Diffusion Models. [link]
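The paper's core idea is distilling a classifier-free-guided teacher (which needs a conditional and an unconditional forward pass at every sampling step) into a single student model. A minimal sketch of that objective is below; the function names and tensor shapes here are illustrative assumptions, not this repository's actual API:

```python
import torch

def guided_teacher_output(eps_cond, eps_uncond, w):
    # Classifier-free guidance: combine the teacher's conditional and
    # unconditional noise predictions with guidance weight w.
    return (1 + w) * eps_cond - w * eps_uncond

def distillation_loss(student_eps, eps_cond, eps_uncond, w):
    # The student is trained to match the guided teacher output directly,
    # so one student pass replaces two teacher passes at sampling time.
    target = guided_teacher_output(eps_cond, eps_uncond, w)
    return torch.mean((student_eps - target) ** 2)
```

In the paper the guidance weight w is sampled over a range during training so a single student covers many guidance strengths; the loss above shows only the matching term for one w.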

Dependencies

To install the required libraries, run:

pip install -e . 

Training

To train a model, run:

python main.py -c configs/<config-name.yaml>
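A config under configs/ might look like the following sketch. Every field name here is hypothetical, meant only to illustrate the kind of settings (data, model, distillation schedule) such a file could carry, not this repo's actual schema:

```yaml
# Hypothetical training config -- field names are illustrative only.
model:
  name: unet
  base_channels: 128
data:
  dataset: cifar10
  batch_size: 64
training:
  lr: 1.0e-4
  max_steps: 100000
distillation:
  guidance_weight_range: [0.0, 4.0]   # range of guidance weights w sampled during training
  teacher_checkpoint: path/to/teacher.ckpt
```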

Notice

This repository is a work in progress. Many features are still being implemented, and some may differ from the original paper as my understanding of its details evolves. The code has not been fully tested and cannot yet match the original results due to my hardware limitations. I will continue to improve this repository. If you have any questions, please feel free to open an issue so we can discuss and make this repo better.

Acknowledgement

This repository is based on the following repositories:

@misc{Subramanian2020,
  author = {Subramanian, A.K},
  title = {PyTorch-VAE},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/AntixK/PyTorch-VAE}}
}
@article{Kingma2021VariationalDM,
  title={Variational Diffusion Models},
  author={Diederik P. Kingma and Tim Salimans and Ben Poole and Jonathan Ho},
  journal={arXiv preprint arXiv:2107.00630},
  year={2021}
}
