
PacGAN: The power of two samples in generative adversarial networks

[paper (arXiv)] [paper (NeurIPS)] [paper (JSAIT)] [poster] [website] [interview (YouTube)] [code]

Authors: Zinan Lin, Ashish Khetan, Giulia Fanti, Sewoong Oh

Abstract: Generative adversarial networks (GANs) are innovative techniques for learning generative models of complex data distributions from samples. Despite remarkable recent improvements in generating realistic images, one of their major shortcomings is the fact that in practice, they tend to produce samples with little diversity, even when trained on diverse datasets. This phenomenon, known as mode collapse, has been the main focus of several recent advances in GANs. Yet there is little understanding of why mode collapse happens and why existing approaches are able to mitigate mode collapse. We propose a principled approach to handling mode collapse, which we call packing. The main idea is to modify the discriminator to make decisions based on multiple samples from the same class, either real or artificially generated. We borrow analysis tools from binary hypothesis testing---in particular the seminal result of Blackwell [Bla53]---to prove a fundamental connection between packing and mode collapse. We show that packing naturally penalizes generators with mode collapse, thereby favoring generator distributions with less mode collapse during the training process. Numerical experiments on benchmark datasets suggest that packing provides significant improvements in practice as well.
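The packing idea is simple enough to sketch in a few lines. Below is a minimal, hypothetical PyTorch illustration (not the repository's actual implementation): the discriminator scores a concatenation of m samples drawn from the same source, rather than a single sample. The names PackedDiscriminator, sample_dim, and pack_size are ours, and the toy MLP stands in for whatever architecture a given experiment uses.

```python
import torch
import torch.nn as nn

class PackedDiscriminator(nn.Module):
    """Toy MLP discriminator that scores a pack of m samples jointly.

    Packing = concatenating m samples (all real or all fake) into one
    input vector, so the discriminator effectively sees the m-fold
    product distribution, which penalizes mode-collapsed generators.
    """
    def __init__(self, sample_dim, pack_size=2, hidden_dim=128):
        super().__init__()
        self.pack_size = pack_size
        self.net = nn.Sequential(
            nn.Linear(sample_dim * pack_size, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x):
        # x: (batch * pack_size, sample_dim) -> (batch, pack_size * sample_dim)
        packed = x.reshape(-1, self.pack_size * x.shape[-1])
        return self.net(packed)

# Example: PacGAN2 on 2-D samples -- every discriminator decision is
# based on two samples from the same source (all real or all fake).
disc = PackedDiscriminator(sample_dim=2, pack_size=2)
fake = torch.randn(64, 2)   # 64 generator samples form 32 packs
scores = disc(fake)         # shape: (32, 1)
```

The rest of the GAN training loop is unchanged; only the discriminator's input is packed, which is why the approach composes with most existing architectures.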

Code for reproducing the results in the paper

Prerequisites

The code is based on the GPUTaskScheduler library. Please install it first.

Code list
