Introduction to Learning and Intelligent Systems
ETH Zurich, Prof. Andreas Krause, Spring Semester 2015
Course Description
The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project.
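The fit-versus-complexity tradeoff mentioned above can be previewed with a small ridge-regression sketch (Python/NumPy; the data and the degree-9 polynomial features are made up for illustration and are not course material). Increasing the regularization weight lambda sacrifices some training fit in exchange for smaller, simpler weights:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 30)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(30)

# Degree-9 polynomial features: flexible enough to overfit 30 points.
X = np.vander(x, 10, increasing=True)

def ridge_fit(X, y, lam):
    """Minimize ||Xw - y||^2 + lam * ||w||^2 in closed form."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

for lam in (0.0, 1e-3, 1.0):
    w = ridge_fit(X, y, lam)
    train_mse = np.mean((X @ w - y) ** 2)
    print(f"lambda={lam:g}  train MSE={train_mse:.4f}  ||w||={np.linalg.norm(w):.2f}")
```

With lambda = 0 the model fits the training data as closely as possible; larger lambda values yield a larger training error but a much smaller weight norm, i.e. a less complex model that typically generalizes better.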
General Information
VVZ Information: See here.
Time and Place
Lectures:
 Tue 13–15, ML D 28
 Wed 13–15, ML D 28
Tutorials:
 Tue 15–17, LFW C 11 (last names A–E)
 Tue 15–17, LFW E 15 (last names F–K)
 Fri 13–15, HG D 3.1 (last names L–P)
 Fri 13–15, HG D 3.3 (last names Q–Z)
* All tutorial sessions are identical; please attend only one session.
Syllabus
 Introduction to Learning and Intelligent Systems. Lecture slides: [pdf]
 Linear regression. Lecture slides: [pdf]. Exercise sheet: [pdf] [solution]
 Cross validation; Regularization. Lecture slides: [pdf]
 Linear classification; Kernels and kernelized perceptron. Lecture slides: [pdf]
 Kernels; Nonlinear predictions. Lecture slides: [pdf]. Tutorial slides: [html] [ipynb]. Exercise sheet: [pdf] [solution]
 Feature selection/sparsity; Class imbalance; Multiclass. Lecture slides: [pdf]
 Neural networks; Feature learning. Lecture slides: [pdf]. Exercise sheet: [pdf] [solution]
 Unsupervised learning: k-Means, PCA, kernel PCA, Autoencoders. Lecture slides: [pdf]
 Probabilistic modeling; Bias-variance tradeoff; Logistic regression. Lecture slides: [pdf]. Exercise sheet: [pdf] [solution]
 Bayesian Decision Theory. Lecture slides: [pdf]
 Discriminative vs. generative models; Naive Bayes classifiers. Lecture slides: [pdf]
 Latent variable models; Gaussian mixtures. Lecture slides: [pdf]. Exercise sheet: [pdf] [solution updated]
 Time-series models; Markov Chains. Lecture slides: [pdf]. Exercise sheet: [pdf (typos fixed)] [solution]
 Hidden Markov Models. Lecture slides: [pdf]. Exercise sheet: [pdf (updated)] [solution] [code]
Some of the material is password protected; send an email from your ethz.ch address to lis2015@lists.inf.ethz.ch to obtain the password.
Exercises
The exercises will include both theoretical and programming problems. Please note that submitting solutions is not mandatory. We will publish exercise solutions after one week.
If you choose to submit: send a soft copy of the exercise from your ethz.ch address to lis2015@lists.inf.ethz.ch. This can be a LaTeX document, a scan, or even a photo of a handwritten solution.
 Please do not submit hard copies of your solutions.
Project
Part of the coursework will be a project, carried out in groups of up to 3 students. The goal of this project is to get hands-on experience in machine learning tasks. The project grade will constitute 30% of the total grade. More details on the project will be given in the tutorials.
Exam
The examination is written and 120 minutes long. The language of examination is English. As written aids, you can bring two A4 pages (i.e. both sides of one A4 sheet), either handwritten or typed in a font of at least 11 points. The written exam will constitute 70% of the total grade.
Resources
MATLAB Demos
 Linear regression with gradient descent
 Linear regression for polynomials with gradient descent
 Gradient descent on a multimodal function with bold driver learning rate
 Perceptron
 Linear support vector machine training
 K-nearest neighbours classification
 Cross validation
 SVM vs. perceptron
 Cost-sensitive perceptron
 Support vector machine with L1-regularizer
 Multiclass perceptron: one vs. all
 Multiclass perceptron: one vs. one
 Backpropagation in neural networks
 Nonconvex objective of neural networks
 Illustration of the universal approximation theorem
 Handwritten digit recognition using neural networks, [data]
 L2-regularized logistic regression
 ''Doubtful'' logistic regression
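The demos themselves are MATLAB scripts. As a rough Python sketch of the idea behind the first one, linear regression trained with gradient descent, here is a minimal stand-alone version (the synthetic data and all variable names are made up for illustration; this is not the course code):

```python
import numpy as np

# Synthetic data: 100 points in 2D with known true weights.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 2))
w_true = np.array([2.0, -3.0])
y = X @ w_true + 0.01 * rng.standard_normal(100)

# Gradient descent on the mean squared error.
w = np.zeros(2)
eta = 0.1  # learning rate
for _ in range(500):
    grad = 2 / len(y) * X.T @ (X @ w - y)  # gradient of the MSE
    w -= eta * grad

print(w)  # approximately [2, -3]
```

Because the MSE is convex in w, gradient descent with a small enough step size converges to the least-squares solution, which here recovers the true weights up to the noise level.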
Text Books
 K. Murphy. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.
 C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007.
An excellent introduction to machine learning that covers most topics treated in the lecture. Contains lots of exercises, some with exemplary solutions.
 T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
Another comprehensive text, written by three Stanford statisticians. Covers additive models and boosting in great detail. A free PDF version (second edition) is available online.
 L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
A compact treatment of statistics that facilitates a deeper understanding of machine learning methods.
Matlab
 The official Matlab documentation is available online at the Mathworks website.
 If you have trouble accessing Matlab's built-in help function, you can use the online function reference on that page, or use the command-line version (type help <function> at the prompt). There are several primers and tutorials on the web; a later edition of this one became the book Matlab Primer by T. Davis and K. Sigmon, CRC Press, 2005.
Presentations
 NIPS '13 tutorial on Deep Learning for Computer Vision by Rob Fergus, available on his website.
Discussion Forum
We maintain a discussion board at the VIS inforum. Use it to ask questions of general interest and interact with other students of this class. We regularly visit the board to provide answers.
Contact
If you have any questions, please send them to lis2015@lists.inf.ethz.ch from your ethz.ch address.
 Instructor: Prof. Andreas Krause
 Head Assistant: Josip Djolonga
 Assistants: Olivier Bachem, Alkis Gotovos, Baharan Mirzasoleiman, Mario Lučić, Adish Singla, Sebastian Tschiatschek