Olivier Bachem

  • PhD Student
  • External Website
  • olivier.bachem@inf.ethz.ch
  • CAB E 62.2
  • +41 44 632 35 39
  • I am interested in scaling up machine learning using sampling-based methods. In particular, I work on an approach originating in computational geometry called coresets: the idea is that a massive dataset can be represented by a small subset of representative data points while retaining theoretical guarantees on the approximation quality. Training models on such a coreset therefore yields provably “competitive” solutions. Coreset sizes are usually sublinear in the number of original observations, if not independent of it, which allows for substantially faster approximate inference and makes algorithms with superlinear complexity practical on large datasets. My research focuses on developing novel coreset construction techniques and on extending the notion of coresets to previously unexplored machine learning problems.
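A minimal sketch of the coreset idea for k-means clustering is given below. It uses a generic importance-sampling scheme (points far from the data mean are sampled with higher probability, and inverse-probability weights correct for the bias); the function name build_coreset and all parameters are illustrative and not necessarily the specific constructions referred to above.

    import numpy as np

    def build_coreset(X, m, seed=0):
        """Sample a weighted coreset of m points from X (an n x d array).

        Illustrative importance sampling: points far from the dataset mean
        are picked with higher probability, and inverse-probability weights
        keep the weighted coreset cost close to the full k-means cost.
        """
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        mu = X.mean(axis=0)
        dist_sq = ((X - mu) ** 2).sum(axis=1)
        # Mix a uniform term with a distance-proportional term so that
        # every point has a nonzero sampling probability.
        q = 0.5 / n + 0.5 * dist_sq / dist_sq.sum()
        idx = rng.choice(n, size=m, replace=True, p=q)
        weights = 1.0 / (m * q[idx])  # inverse-probability weights
        return X[idx], weights

An algorithm whose cost is superlinear in the number of points can then be run on the m weighted points instead of all n; for instance, scikit-learn's KMeans accepts per-sample weights through the sample_weight argument of fit.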

Publications
