Introduction to Machine Learning
The course introduces the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity, discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project. VVZ information is available here.

News
- Slides from the QA session: NN, Generative Models, Kernels, EM
- Solutions to Homework 4 updated.
- You can find previous exams below.
- For the written exam, you can bring two A4 pages (i.e. one A4 sheet of paper), either handwritten or printed in at least 11-point font. Calculators are not allowed.
- Solutions to homework 7 published.
- No lectures on 23.5., 29.5., and 30.5.; tutorials still take place. The last homework is a bit longer than usual and contains some additional material. The homework will be discussed in next week's tutorial session. The final project task is due on 1 June (submission) / 3 June (hand-in).
- Homework 7 and solutions to homework 6 published.
- Homework 6 and solutions to homework 5 published.
- Homework 5 and solutions to homework 4 published.
- Homework 4 and solutions to homework 3 published.
- Homework 3 and solutions to homework 2 published.
- Homework 2 published.
- The project server is online now. Please find all information here.
- The room for the Friday tutorial was changed to ML D 28.
- The first homework assignment is online.
- The video recordings are available at the ETH Videoportal.
- In the first week’s tutorial sessions (Tue, Wed, Fri), we will offer a review session of required background material for the course. This will include a short recap of linear algebra, multivariate analysis and probability theory.
- Official tutorials on course material will start in the second week.
- Please attend the tutorials according to your last name: A-F: Mon 15-17, HG D 1.2; G-K: Tue 15-17, HG D 1.2; L-R: Wed 15-17, CAB G 11; S-Z: Fri 13-15, ML D 28. (Students in the first group (A-F) who want to attend the introduction tutorial in the first week should go to either the Tue or Wed tutorial.)
- The files are password protected. To obtain the password you need to be inside the ETH network and click here. To establish a VPN connection click here.
- We have set up a forum for IntroML on Piazza. Please use it for questions regarding the course material, projects, and organisation.
Syllabus
Contact
Instructor | Prof. Andreas Krause |
Head TA | Johannes Kirschner |
Assistants | Natalie Davidson, Gideon Dresdner, Harun Mustafa, Esfandiar Mohammadi, Stephanie Hyland, Aytunç Şahin, Matteo Turchetta, Zalan Borsos, Sebastian Curi, Hoda Heidari, Kfir Levy, Jens Witkowski, Mojmir Mutny, Anastasia Makarova, Mohammad Reza Karimi, Vincent Fortuin, Kjong Lehmann |
Piazza | If you have any questions, please use the Piazza Course Forum. |
Mailing List | Please use the Piazza Forum for questions regarding course material, organisation, and projects. If this does not work for your request, you can send an email to introml18@lists.inf.ethz.ch from your ethz.ch address. |
Lectures
Time | Room | Video transmission |
---|---|---|
Tue 13-15 | ML D 28 | ML E 12 (via video) |
Wed 13-15 | ML D 28 | ML E 12 (via video) |
Tutorials
Time | Room | Group |
---|---|---|
Mon 15-17 | HG D 1.2 | A-F |
Tue 15-17 | HG D 1.2 | G-K |
Wed 15-17 | CAB G 11 | L-R |
Fri 13-15 | ML D 28 | S-Z |
Project
Please find all information about the project here.

Demos
The demos are based on Jupyter notebooks (Python 3). Please see this intro for installation and running instructions. Helper files (please download them and save them in the same directory as the demos): zipped helper files (updated 22.3.2018).

Demos:
- Linear Regression (updated 06.04.2018)
- Classification (updated 06.04.2018)
- Kernelized Classification/k-NN (updated 06.04.2018)
- Kernelized Regression (updated 06.04.2018)
- Neural Networks (updated 18.05.2018)
- Unsupervised Learning (updated 18.05.2018)
- Bias, Variance, and Noise tradeoff (updated 18.05.2018)
- Probabilistic Modelling (updated 18.05.2018)
- Semi-supervised Learning (updated 18.05.2018)
Exam
The final grade will be determined by the written final exam and the projects, according to the formula max(0.3*project_grade + 0.7*exam_grade, exam_grade). Per this formula, the semester project counts only as a bonus, i.e. it is taken into account only if it improves your exam grade. For the written exam, you can bring two A4 pages (i.e. one A4 sheet of paper), either handwritten or printed in at least 11-point font. No calculators or other aids are allowed.

Previous Exams:
Exam 2015
Exam 2016
Exam 2017
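To make the grading formula above concrete, here is a minimal sketch in Python (the function name `final_grade` is illustrative, not official course code):

```python
def final_grade(project_grade: float, exam_grade: float) -> float:
    """Course grading formula:
    max(0.3 * project_grade + 0.7 * exam_grade, exam_grade).
    The project can only improve the final grade, never lower it.
    """
    return max(0.3 * project_grade + 0.7 * exam_grade, exam_grade)

# A strong project lifts a weaker exam grade:
# 0.3 * 5.5 + 0.7 * 4.0 = 4.45, which exceeds the exam grade of 4.0.
strong_project = final_grade(5.5, 4.0)

# A weak project is simply ignored:
# 0.3 * 3.0 + 0.7 * 5.0 = 4.4, so the exam grade of 5.0 stands.
weak_project = final_grade(3.0, 5.0)
```

In the first case the combined grade applies; in the second, the exam grade alone counts, which is what "bonus only" means in practice.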
Text Books
- K. Murphy. Machine Learning: a Probabilistic Perspective. MIT Press 2012
- C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007 (optional)
- T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001. Available online
- L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.