NCSN-TF2.0

Reproduction of "Generative Modeling by Estimating Gradients of the Data Distribution" by Yang Song and Stefano Ermon (NeurIPS 2019) in TensorFlow 2.0.

GitHub | OpenReview | Paper

Created for the Reproducibility Challenge @ NeurIPS 2019.

Instructions for running the code

The main file to run is main.py. The task to perform is selected with the --experiment flag:

--experiment train to train the model.

--experiment generate to sample images via Langevin dynamics with a trained model.

--experiment inpainting to perform inpainting (different occlusion patterns are available).

--experiment toytrain to run the toy experiment.

--experiment evaluation to compute the Inception and FID scores.

--experiment k_nearest to sample images and find the k pixel-wise nearest images in the dataset.

--experiment intermediate to sample images and save them at each noise level.

--experiment celeb_a_statistics to compute the Inception and FID scores on CelebA.

--help for additional info, or check out utils.py.
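For intuition, the sampling behind the generate experiment is the annealed Langevin dynamics of Song and Ermon (Algorithm 1 in the paper): at each noise level the sample is nudged along the estimated score and perturbed with Gaussian noise. Below is a minimal NumPy sketch of that update rule. It uses a closed-form toy score in place of the trained score network, and the function and parameter names are illustrative, not the repository's API:

```python
import numpy as np

def annealed_langevin_dynamics(score_fn, x, sigmas, eps=2e-5, n_steps=100, rng=None):
    """Sketch of annealed Langevin dynamics (Song & Ermon 2019, Algorithm 1).

    score_fn(x, sigma) approximates grad_x log p_sigma(x);
    sigmas is a decreasing sequence of noise levels.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    for sigma in sigmas:
        # Step size is scaled per noise level, as in the paper.
        alpha = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(n_steps):
            z = rng.standard_normal(x.shape)
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy check with the exact score of N(0, sigma^2 I): samples started far
# from the mode should end up distributed roughly as N(0, sigmas[-1]^2).
score = lambda x, sigma: -x / sigma ** 2
sigmas = np.geomspace(1.0, 0.01, num=10)
samples = annealed_langevin_dynamics(score, np.full((2000, 2), 5.0), sigmas)
print(samples.mean(), samples.std())
```

In the repository the toy score above would be replaced by the trained NCSN, and the noise levels by the geometric sigma schedule used during training.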
