by R. Gomes, A. Krause, P. Perona
Abstract:
Is there a principled way to learn a probabilistic discriminative classifier from an unlabeled data set? We present a framework that simultaneously clusters the data and trains a discriminative classifier. We call it Regularized Information Maximization (RIM). RIM optimizes an intuitive information-theoretic objective function that balances class separation, class balance, and classifier complexity. The approach can flexibly accommodate different likelihood functions, express prior assumptions about the relative sizes of the classes, and incorporate partial labels for semi-supervised learning. In particular, we instantiate the framework as unsupervised multi-class kernelized logistic regression. Our empirical evaluation indicates that RIM outperforms existing methods on several real data sets and demonstrates that it is an effective model selection method.
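
The following is a minimal sketch of the kind of objective the abstract describes, assuming a plain (non-kernelized) multinomial logistic regression model in NumPy; the function and parameter names (rim_objective, lam) are illustrative and not the paper's notation. The mutual-information estimate is the entropy of the average predicted label distribution (favoring class balance) minus the average entropy of the per-example predictions (favoring class separation), and an L2 penalty on the weights controls classifier complexity.

import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def rim_objective(W, b, X, lam=1.0, eps=1e-12):
    # RIM-style objective (to be maximized), illustrative only:
    #   I_hat = H(mean_i p(y|x_i)) - mean_i H(p(y|x_i))
    #   objective = I_hat - lam * ||W||^2
    P = softmax(X @ W + b)                  # N x K predicted class posteriors
    p_bar = P.mean(axis=0)                  # estimated label marginal over the data set
    marg_entropy = -np.sum(p_bar * np.log(p_bar + eps))             # class balance term
    cond_entropy = -np.mean(np.sum(P * np.log(P + eps), axis=1))    # class separation term
    mi_estimate = marg_entropy - cond_entropy
    return mi_estimate - lam * np.sum(W ** 2)

Maximizing this objective, for example by gradient ascent on W and b, pushes the model toward confident, roughly balanced cluster assignments, while the weight penalty discourages overly complex decision boundaries.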
Reference:
Discriminative Clustering by Regularized Information Maximization. R. Gomes, A. Krause, P. Perona. In Proc. Neural Information Processing Systems (NeurIPS), 2010.
Bibtex Entry:
@inproceedings{gomes10discriminative,
  author    = {Ryan Gomes and Andreas Krause and Pietro Perona},
  title     = {Discriminative Clustering by Regularized Information Maximization},
  booktitle = {Proc. Neural Information Processing Systems (NeurIPS)},
  year      = {2010}
}