by Jonas Rothfuss, Fabio Ferreira, Simon Boehm, Simon Walther, Maxim Ulrich, Tamim Asfour, Andreas Krause
Abstract:
Capturing statistical relationships beyond the conditional mean is crucial in many applications. To this end, conditional density estimation (CDE) aims to learn the full conditional probability density from data. Though expressive, neural network based CDE models can suffer from severe over-fitting when trained with the maximum likelihood objective. Their particular structure renders classical regularization in the parameter space ineffective. To address this challenge, we propose a model-agnostic noise regularization method for CDE that adds carefully controlled random perturbations to the data during training. We prove that the proposed approach corresponds to a smoothness regularization and establish its asymptotic consistency. Our extensive experiments show that noise regularization consistently outperforms other regularization methods across a range of neural CDE models. Furthermore, we demonstrate the effectiveness of noise regularized neural CDE over classical non- and semi-parametric methods, even when training data is scarce.
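The core idea described above, perturbing the training data with controlled random noise before maximizing the likelihood, can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it fits a simple conditional-Gaussian model p(y|x) = N(y; wx + b, 1) by gradient descent on the negative log-likelihood, drawing a fresh Gaussian perturbation of the data at every step. The function names, the fixed-variance model, and the hyperparameters are all assumptions made for this sketch.

```python
import numpy as np

def noise_regularized_batch(x, y, noise_std_x, noise_std_y, rng):
    """Return a freshly perturbed copy of the data (the noise regularizer)."""
    x_t = x + rng.normal(0.0, noise_std_x, size=x.shape)
    y_t = y + rng.normal(0.0, noise_std_y, size=y.shape)
    return x_t, y_t

def train(x, y, noise_std=0.1, steps=500, lr=0.05, seed=0):
    """Fit p(y|x) = N(y; w*x + b, 1) by maximum likelihood with noise regularization.

    A toy stand-in for a neural CDE model; the regularization mechanism
    (re-sampling perturbations each step) is the point of the example.
    """
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    for _ in range(steps):
        # new perturbations every iteration, so the model cannot memorize points
        xt, yt = noise_regularized_batch(x, y, noise_std, noise_std, rng)
        resid = yt - (w * xt + b)
        # gradient descent on the Gaussian negative log-likelihood
        w += lr * np.mean(resid * xt)
        b += lr * np.mean(resid)
    return w, b
```

With `noise_std=0` this reduces to plain maximum likelihood; a positive `noise_std` acts as the smoothness regularizer discussed in the abstract, at the cost of a small attenuation bias in the fitted parameters.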
Reference:
Noise Regularization for Conditional Density Estimation. J. Rothfuss, F. Ferreira, S. Boehm, S. Walther, M. Ulrich, T. Asfour, A. Krause. arXiv, 2019.
Bibtex Entry:
@misc{rothfuss19noisereg,
	archiveprefix = {arXiv},
	author = {Jonas Rothfuss and Fabio Ferreira and Simon Boehm and Simon Walther and Maxim Ulrich and Tamim Asfour and Andreas Krause},
	eprint = {1907.08982},
	month = {July},
	primaryclass = {stat.ML},
	publisher = {ArXiv},
	title = {Noise Regularization for Conditional Density Estimation},
	year = {2019}}