by Yuxin Chen, S. Hamed Hassani, Andreas Krause
Abstract:
We consider the Bayesian active learning and experimental design problem, where the goal is to learn the value of some unknown target variable through a sequence of informative, noisy tests. In contrast to prior work, we focus on the challenging, yet practically relevant setting where test outcomes can be conditionally dependent given the hidden target variable. In this setting, common heuristics, such as greedily performing tests that maximize the reduction in uncertainty of the target, often perform poorly. We propose ECED, a novel, efficient active learning algorithm, and prove strong theoretical guarantees that hold with correlated, noisy tests. Rather than directly optimizing the prediction error, at each step, ECED picks the test that maximizes the gain in a surrogate objective, which takes into account the dependencies between tests. Our analysis relies on an information-theoretic auxiliary function to track the progress of ECED, and utilizes adaptive submodularity to attain a near-optimal bound. We demonstrate strong empirical performance of ECED on two problem instances, including a Bayesian experimental design task intended to distinguish among economic theories of how people make risky decisions, and an active preference learning task via pairwise comparisons.
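The abstract describes the high-level recipe: maintain a Bayesian posterior over hypotheses and, at each step, greedily run the test whose expected gain in a surrogate objective is largest. The Python sketch below is a minimal illustration of that greedy loop under simplifying assumptions; it uses the expected reduction in pairwise "edge weight" between hypotheses with differing target values as a stand-in surrogate, whereas ECED's actual objective discounts these edges using likelihood ratios of the noisy observations (see the paper for the exact rule). The names prior, target_of, likelihood, and run_test are hypothetical placeholders, not the paper's notation.

import numpy as np

# Toy setup: `prior` is a distribution over hypotheses, `target_of[h]` maps each
# hypothesis to its target value (equivalence class), and `likelihood[t]` is a
# (num_hypotheses x num_outcomes) array with P(test t yields outcome x | hypothesis h).

def bayes_update(posterior, likelihood_t, outcome):
    # Posterior over hypotheses after observing `outcome` of a single noisy test.
    post = posterior * likelihood_t[:, outcome]
    return post / post.sum()

def edge_weight(posterior, target_of):
    # Probability mass on pairs of hypotheses that disagree on the target value;
    # driving this quantity to zero resolves the target.
    w = 0.0
    for i in range(len(posterior)):
        for j in range(i + 1, len(posterior)):
            if target_of[i] != target_of[j]:
                w += posterior[i] * posterior[j]
    return w

def expected_gain(posterior, target_of, likelihood_t):
    # Expected reduction in edge weight from running one test. This is a plain
    # surrogate used here for illustration; ECED's objective additionally
    # discounts edges by likelihood ratios to cope with noise (see the paper).
    before = edge_weight(posterior, target_of)
    gain = 0.0
    for x in range(likelihood_t.shape[1]):
        p_x = float(posterior @ likelihood_t[:, x])
        if p_x > 0.0:
            post_x = bayes_update(posterior, likelihood_t, x)
            gain += p_x * (before - edge_weight(post_x, target_of))
    return gain

def greedy_active_learning(prior, target_of, likelihood, run_test, budget):
    # Greedy loop: pick the test with the largest surrogate gain, observe its
    # noisy outcome, update the posterior, and repeat for `budget` steps.
    posterior = np.asarray(prior, dtype=float)
    for _ in range(budget):
        gains = [expected_gain(posterior, target_of, likelihood[t])
                 for t in range(len(likelihood))]
        t_star = int(np.argmax(gains))
        outcome = run_test(t_star)  # noisy outcome supplied by the environment
        posterior = bayes_update(posterior, likelihood[t_star], outcome)
    # Predict the target value carrying the largest posterior mass.
    mass = {}
    for h, p in enumerate(posterior):
        mass[target_of[h]] = mass.get(target_of[h], 0.0) + p
    return max(mass, key=mass.get)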
Reference:
Y. Chen, S. H. Hassani, A. Krause. Near-optimal Bayesian Active Learning with Correlated and Noisy Tests. In Proc. International Conference on Artificial Intelligence and Statistics (AISTATS), 2017.
Bibtex Entry:
@inproceedings{chen17noisyal,
	author = {Yuxin Chen and S. Hamed Hassani and Andreas Krause},
	booktitle = {Proc. International Conference on Artificial Intelligence and Statistics (AISTATS)},
	month = {April},
	title = {Near-optimal Bayesian Active Learning with Correlated and Noisy Tests},
	year = {2017}}