by Paulina Grnarova, Kfir Y. Levy, Aurelien Lucchi, Nathanael Perraudin, Ian Goodfellow, Thomas Hofmann, and Andreas Krause
Generative Adversarial Networks (GANs) have shown remarkable results in modeling complex distributions, but their evaluation remains an unsettled issue. Evaluations are essential for: (i) relative assessment of different models and (ii) monitoring the progress of a single model throughout training. The latter cannot be determined by simply inspecting the generator and discriminator loss curves, as they behave non-intuitively. We leverage the notion of duality gap from game theory to propose a measure that addresses both (i) and (ii) at a low computational cost. Extensive experiments show the effectiveness of this measure in ranking different GAN models and capturing the typical GAN failure scenarios, including mode collapse and non-convergent behaviours. This evaluation metric also provides meaningful monitoring of the progression of the loss during training. It correlates strongly with FID on natural image datasets, and with domain-specific scores for text, sound, and cosmology data where FID is not directly suitable. Notably, our proposed metric requires no labels or a pretrained classifier, making it domain agnostic.
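The duality gap the abstract refers to can be illustrated on a toy zero-sum matrix game. The sketch below is only an illustration of the underlying game-theoretic quantity, not the paper's GAN estimator (which approximates the best responses with gradient steps on neural networks): for mixed strategies (x, y) the gap is the column player's best-response value minus the row player's best-response value, it is non-negative everywhere, and it vanishes exactly at a Nash equilibrium.

```python
import numpy as np

def duality_gap(A, x, y):
    """Duality gap of the zero-sum matrix game u(x, y) = x^T A y
    at mixed strategies (x, y); the row player minimizes and the
    column player maximizes.  DG = max_y' u(x, y') - min_x' u(x', y).
    Best responses over the simplex are attained at pure strategies,
    so a max/min over the payoff vector suffices."""
    best_col = np.max(A.T @ x)  # column player's best response to x
    best_row = np.min(A @ y)    # row player's best response to y
    return best_col - best_row

# Matching pennies: the unique equilibrium is uniform play for both.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

uniform = np.array([0.5, 0.5])
pure = np.array([1.0, 0.0])

print(duality_gap(A, uniform, uniform))  # 0.0 at the equilibrium
print(duality_gap(A, pure, pure))        # 2.0 away from it
```

The metric behaves exactly as the paper argues a monitoring signal should: it is zero at the solution of the game and grows as either player deviates, independently of the data domain.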
A Domain Agnostic Measure for Monitoring and Evaluating GANs
P. Grnarova, K. Y. Levy, A. Lucchi, N. Perraudin, I. Goodfellow, T. Hofmann, A. Krause. In Proc. Neural Information Processing Systems (NeurIPS), 2019
Bibtex Entry:
@inproceedings{grnarova2019domain,
	author = {Paulina Grnarova and Kfir Y. Levy and Aurelien Lucchi and Nathanael Perraudin and Ian Goodfellow and Thomas Hofmann and Andreas Krause},
	booktitle = {Proc. Neural Information Processing Systems (NeurIPS)},
	month = {December},
	title = {A Domain Agnostic Measure for Monitoring and Evaluating GANs},
	year = {2019}}