Introduction to Learning and Intelligent Systems

ETH Zurich, Prof. Andreas Krause, Spring Semester 2015

Course Description

The course introduces the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice and provide hands-on experience in a course project.


Announcements
  • We have updated the first part of the third problem in Exercise 6.
  • The sample questions can be found here.
  • Clarifications added to Exercise 7.
  • No class on 27.5.15. Tutorials will take place this week.
  • Project 3 and the test results for project 2 have been released.
  • Project 2 and the test results for project 1 have been released.
  • Webpage online.

General Information

VVZ Information: See here.

Time and Place

  • Lectures
    Tue 13-15, ML D 28
    Wed 13-15, ML D 28
  • Tutorials
    Tue 15-17, LFW C 11 (Last Names A-E)
    Tue 15-17, LFW E 15 (Last Names F-K)
    Fri 13-15, HG D 3.1 (Last Names L-P)
    Fri 13-15, HG D 3.3 (Last Names Q-Z)

* All tutorial sessions are identical; please attend only one session.


Lecture Topics | Lecture Slides | Tutorial Slides | Exercise Sheets & Solutions
Introduction to Learning and Intelligent Systems | [pdf] | |
Linear regression | [pdf] | [pdf] | [solution]
Cross validation; Linear classification; Kernels and kernelized perceptron; Non-linear predictions | [pdf] [html] [ipynb] | [pdf] | [solution]
Feature selection/sparsity; Class imbalance, multi-class; Neural networks; Feature learning | [pdf] | [pdf] | [solution]
Unsupervised learning: k-Means, PCA, kernel-PCA, Autoencoders; Probabilistic modeling; Bias-variance tradeoff; Logistic regression | [pdf] | [pdf] | [solution]
Bayesian Decision Theory | [pdf] | |
Discriminative vs. generative models; Naive Bayes classifiers; Latent variable models; Gaussian mixtures | [pdf] | [pdf] | [solution updated]
Time-series models; Markov Chains | [pdf] | [pdf (typos fixed)] | [solution]
Hidden Markov Models | [pdf] | [pdf (updated)] | [solution] [code]

Some of the material is password protected; send an email from your address to obtain it.


Exercises

The exercise sheets will include both theoretical and programming problems. Please note that submitting solutions is not mandatory. Exercise solutions will be published after one week.

If you choose to submit:
  • Send a soft copy of the exercise from your address. This can be LaTeX, but also a simple scan or even a picture of a handwritten solution.
  • Please do not submit hard copies of your solutions.


Project

Part of the coursework will be a project, carried out in groups of up to three students. The goal of this project is to gain hands-on experience with machine learning tasks. The project grade will constitute 30% of the total grade. More details on the project will be given in the tutorials.


Exam

The mode of examination is written, 120 minutes in length. The language of examination is English. As written aids, you may bring two A4 pages (i.e., one A4 sheet of paper), either handwritten or typeset in a font size of at least 11 points. The written exam will constitute 70% of the total grade.



Java Applets

Text Books

  • K. Murphy. Machine Learning: a Probabilistic Perspective. MIT Press 2012.
  • C. Bishop. Pattern Recognition and Machine Learning. Springer 2007.
    This is an excellent introduction to machine learning that covers most topics which will be treated in the lecture. Contains lots of exercises, some with exemplary solutions.
  • T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
    Another comprehensive text, written by three Stanford statisticians. Covers additive models and boosting in great detail.
    A free PDF version (second edition) is available online.
  • L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
    This book is a compact treatment of statistics that facilitates a deeper understanding of machine learning methods.


Matlab

  • The official Matlab documentation is available online at the Mathworks website.
  • If you have trouble accessing Matlab's built-in help function, you can use the online function reference on that page or the command-line version (type help <function> at the prompt). Several primers and tutorials are available on the web; a later edition of this one became the book Matlab Primer by T. Davis and K. Sigmon, CRC Press, 2005.


  • NIPS '13 tutorial on Deep Learning for Computer Vision by Rob Fergus available on his website.

Discussion Forum

We maintain a discussion board at the VIS inforum. Use it to ask questions of general interest and interact with other students of this class. We regularly visit the board to provide answers.


If you have any questions, please send them from your address.