Learning and Intelligent Systems
The course will introduce the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity. We will discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project.
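As a minimal illustration of the fit-vs-complexity tradeoff mentioned above (a sketch on hypothetical synthetic data, not course material), one can select a polynomial degree by its error on a held-out validation split, in the spirit of the model-selection lecture:

```python
import numpy as np

# Hypothetical data: noisy samples of a smooth function.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 60)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(60)

# Split into a training set and a held-out validation set.
x_train, y_train = x[:40], y[:40]
x_val, y_val = x[40:], y[40:]

def val_error(degree):
    coeffs = np.polyfit(x_train, y_train, degree)  # fit on the training split
    pred = np.polyval(coeffs, x_val)               # predict on the validation split
    return np.mean((pred - y_val) ** 2)            # validation mean squared error

errors = {d: val_error(d) for d in range(1, 10)}
best = min(errors, key=errors.get)  # degree with lowest validation error
```

Low degrees underfit (high training and validation error); high degrees overfit (low training error, rising validation error); the validation error picks a degree in between.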
VVZ information is available here.
News
- The final exam reviews will take place on October 4 and October 5 at 3 pm in CAB G 66.2.
- The final exam will take place on August 13, from 9:00-11:00 (120 minutes) in HIL F15 (Hönggerberg).
- The exam review session will be held on Thursday, August 4, from 13-15 in ML D 28.
- There is a typo on p. 28 of the slides for Tue 22.3 (“Sparsity (Lasso, L1-SVM), Class Imbalance”): it should read argmin( … max(0, -y w^T x)), not argmin( … min(0, -y w^T x)).
- Last year’s exam can be found here (PDF).
- The recitations on Tuesday, May 24th, have been merged into a single session held in NO C 60.
- The lectures on Tuesday, May 10th, and Wednesday, May 11th, have been cancelled.
- The lecture on Wednesday 4th of May has been cancelled.
- The rooms for the tutorials were changed/merged. For space reasons, we ask students who are scheduled on Fridays but need to go on Tuesdays (e.g. due to conflicts with other classes) to attend the Tuesday tutorial in NO C 60.
- Dummy project is available. For details check your email.
- The video recordings of the first week’s lectures are now available at ETH Videoportal.
- The files are password protected. To obtain the password you need to be inside the ETH network and click here. To establish a VPN connection click here.
- First class on 23.2. First tutorial/recitation on 1.3.
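The corrected objective in the typo note above uses the perceptron loss max(0, -y w^T x), which is zero for correctly classified points. A minimal sketch of that loss on hypothetical toy data:

```python
import numpy as np

# Perceptron loss, as in the corrected slide formula:
# mean of max(0, -y_i * w^T x_i) over the data.
def perceptron_loss(w, X, y):
    margins = y * (X @ w)                    # y_i * w^T x_i for each example
    return np.mean(np.maximum(0.0, -margins))

# Hypothetical toy data: two linearly separable points with labels +1 / -1.
X = np.array([[1.0, 2.0], [-1.0, -2.0]])
y = np.array([1.0, -1.0])
w = np.array([0.5, 0.5])

loss = perceptron_loss(w, X, y)  # both margins are positive, so the loss is 0
```

With min instead of max the loss would be nonpositive everywhere and minimizing it would be meaningless, which is why the sign of the typo matters.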
Syllabus
| Date | Topics | Tutorial | HW | Solutions |
|---|---|---|---|---|
| Tue 23.2. | Introduction | - | [pdf] | [pdf] |
| Wed 24.2. | Linear regression, gradient descent | - | - | - |
| Tue 1.3. | Model selection, cross-validation | [pdf] | - | - |
| Wed 2.3. | Linear classification | - | - | - |
| Tue 8.3. | Perceptron, SVM, SGD | [ipynb] | - | - |
| Wed 9.3. | Kernels | - | - | - |
| Tue 15.3. | Kernels, k-NN, Kernelized Linear Regression | - | [pdf] | [pdf] |
| Wed 16.3. | Kernel parameters, Feature Selection | - | - | - |
| Tue 22.3. | Sparsity (Lasso, L1-SVM), Class Imbalance | - | - | - |
| Wed 23.3. | Metrics for Class Imbalance, Multiclass Reductions | - | - | - |
| Tue 5.4. | Neural networks | - | [pdf] | [pdf] |
| Wed 6.4. | Neural network training: SGD, Backpropagation | - | - | - |
| Tue 12.4. | Neural networks: Practical aspects | - | - | - |
| Wed 13.4. | Unsupervised learning: k-means | - | - | - |
| Tue 19.4. | Unsupervised learning: PCA | - | [pdf] | [pdf] |
| Wed 20.4. | Unsupervised learning: Kernel PCA, Autoencoders | - | - | - |
| Tue 26.4. | Probabilistic modeling, Bias-variance tradeoff | - | - | - |
| Wed 27.4. | MAP estimation, Logistic regression | - | - | - |
| Tue 3.5. | Bayesian decision theory | - | [pdf] | [pdf] |
| Tue/Wed 17./18.5. | Discriminative vs. Generative Modeling | - | [pdf] | [pdf] |
| Tue 24.5. | Latent variable modeling: Gaussian mixtures, EM | - | - | - |
| Wed 25.5. | Use cases of GMMs: Classification, anomaly detection, semi-supervised learning | - | - | - |
| Tue 31.5. | EM convergence; Markov models | - | - | - |
| Wed 1.6. | Time-series modeling, prediction and validation | - | - | - |
Matlab Demos
Java Applets
Video Lectures
Lectures
| Tue 13-15 | ML D 28 |
| Wed 13-15 | ML D 28 |
Tutorials
| Tue 15-17 | NO C 60 | Surnames A-E |
| Tue 15-17 | LFW E 15 | Surnames F-K |
| Fri 13-15 | LFW C 1 | Surnames L-Z |
Office Hours
Project
Part of the coursework will be a project, carried out in groups of up to 3 students. The goal of this project is to get hands-on experience in machine learning tasks. The project grade will constitute 30% of the total grade. More details on the project will be given in the tutorials.
Exam
The final exam will take place on August 13, from 9:00 to 11:00 (written, 120 minutes) in HIL F15 (Hönggerberg). The language of examination is English. As written aids, you may bring two A4 pages (i.e. one A4 sheet of paper, both sides), either handwritten or printed in at least 11-point font. The written exam will constitute 70% of the total grade.
Text Books
- K. Murphy. Machine Learning: a Probabilistic Perspective. MIT Press 2012
- C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007 (optional)
- T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. Available online
- L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
Matlab
- The official Matlab documentation is available online at the Mathworks website.
- If you have trouble accessing Matlab’s built-in help function, you can use the online function reference on that page or use the command-line version (type help at the prompt).
- There are several primers and tutorials on the web; a later edition of one of them became the book Matlab Primer by T. Davis and K. Sigmon, CRC Press, 2005.