Modelling statistical relationships beyond the conditional mean is crucial in many settings. Conditional density estimation (CDE) aims to learn the full conditional probability density from data. Though highly expressive, neural network based CDE models can suffer from severe over-fitting when trained with the maximum likelihood objective. Due to the inherent structure of such models, classical regularization approaches in the parameter space are rendered ineffective. To address this issue, we develop a model-agnostic noise regularization method for CDE that adds random perturbations to the data during training. We demonstrate that the proposed approach corresponds to a smoothness regularization and prove its asymptotic consistency. In our experiments, noise regularization significantly and consistently outperforms other regularization methods across seven data sets and three CDE models. The effectiveness of noise regularization makes neural network based CDE the preferable method over previous non- and semi-parametric approaches, even when training data is scarce.
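The core idea described in the abstract, adding random perturbations to the data during maximum-likelihood training, can be illustrated with a minimal sketch. This is a generic illustration, not the paper's implementation: the function name, the Gaussian noise, and the standard deviations are assumptions chosen for clarity.

```python
import numpy as np

def noise_regularized_batches(x, y, noise_std_x=0.1, noise_std_y=0.1,
                              batch_size=32, n_steps=100, seed=0):
    """Yield minibatches with i.i.d. Gaussian perturbations added to both
    inputs and targets, a generic form of noise regularization.

    The noise standard deviations are hypothetical defaults for
    illustration, not values taken from the paper.
    """
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    for _ in range(n_steps):
        # Sample a minibatch, then perturb it with fresh noise each step.
        idx = rng.integers(0, n, size=batch_size)
        xb = x[idx] + rng.normal(0.0, noise_std_x, size=x[idx].shape)
        yb = y[idx] + rng.normal(0.0, noise_std_y, size=y[idx].shape)
        yield xb, yb
```

Each perturbed pair `(xb, yb)` would then be fed to an ordinary maximum-likelihood update of the CDE model; because the noise is resampled every step, the model cannot fit the training points exactly, which acts as a smoothness regularizer.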
Noise Regularization for Conditional Density Estimation. J. Rothfuss, F. Ferreira, S. Boehm, S. Walther, M. Ulrich, T. Asfour, A. Krause. arXiv, 2019.
Bibtex Entry:
@misc{rothfuss2019noiseregularization,
    Archiveprefix = {arXiv},
    Author = {Rothfuss, Jonas and Ferreira, Fabio and Boehm, Simon and Walther, Simon and Ulrich, Maxim and Asfour, Tamim and Krause, Andreas},
    Eprint = {1907.08982},
    Month = {July},
    Primaryclass = {stat.ML},
    Publisher = {ArXiv},
    Title = {Noise Regularization for Conditional Density Estimation},
    Year = {2019}
}