by Iris A. M. Huijben, Wouter Kool, Max B. Paulus, Ruud J. G. van Sloun
Abstract:
The Gumbel-max trick is a method to draw a sample from a categorical distribution, given by its unnormalized (log-)probabilities. Over the past years, the machine learning community has proposed several extensions of this trick to facilitate, e.g., drawing multiple samples, sampling from structured domains, or gradient estimation for error backpropagation in neural network optimization. The goal of this survey article is to present background about the Gumbel-max trick, and to provide a structured overview of its extensions to ease algorithm selection. Moreover, it presents a comprehensive outline of (machine learning) literature in which Gumbel-based algorithms have been leveraged, reviews commonly-made design choices, and sketches a future perspective.
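For context, a minimal sketch of the basic Gumbel-max trick in NumPy (the function name and the 3-class example are illustrative choices, not taken from the paper): adding i.i.d. standard Gumbel noise to the unnormalized log-probabilities and taking the argmax yields an exact sample from the corresponding categorical distribution.

import numpy as np

def gumbel_max_sample(logits, rng=None):
    # logits: unnormalized log-probabilities of a categorical distribution
    rng = np.random.default_rng() if rng is None else rng
    # i.i.d. standard Gumbel(0, 1) noise: g = -log(-log(u)), u ~ Uniform(0, 1)
    u = rng.uniform(low=1e-12, high=1.0, size=np.shape(logits))
    gumbel_noise = -np.log(-np.log(u))
    # the argmax of the perturbed logits is distributed as Categorical(softmax(logits))
    return int(np.argmax(np.asarray(logits) + gumbel_noise))

# Example usage: repeated samples from a 3-class distribution
logits = np.log([0.2, 0.5, 0.3])
samples = [gumbel_max_sample(logits) for _ in range(5)]

The survey's extensions (multiple samples, structured domains, gradient estimation) build on this same perturb-and-argmax construction.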
Reference:
A Review of the Gumbel-max Trick and its Extensions for Discrete Stochasticity in Machine Learning. I. A. M. Huijben, W. Kool, M. B. Paulus, and R. J. G. van Sloun. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022.
Bibtex Entry:
@article{huijben2022Review,
	author = {Huijben, Iris A. M. and Kool, Wouter and Paulus, Max Benedikt and van Sloun, Ruud J. G.},
	journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
	month = {March},
	title = {A Review of the Gumbel-max Trick and its Extensions for Discrete Stochasticity in Machine Learning},
	year = {2022}}