Introduction to Machine Learning

This course introduces the foundations of learning and making predictions from data. We will study basic concepts such as trading off goodness of fit against model complexity, discuss important machine learning algorithms used in practice, and provide hands-on experience in a course project. VVZ information is available here.
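The tradeoff between goodness of fit and model complexity mentioned above can be seen in a few lines. The sketch below (our own toy example in NumPy, not course code) fits polynomials of increasing degree to noisy data and compares the error on the training points with the error on held-out validation points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples of a smooth function on [-1, 1]
x = np.linspace(-1, 1, 30)
y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(x.size)

# Hold out every second point to estimate generalization error
x_train, y_train = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

train_errs, val_errs = [], []
for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_errs.append(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    val_errs.append(np.mean((np.polyval(coeffs, x_val) - y_val) ** 2))
    print(f"degree {degree}: train MSE {train_errs[-1]:.3f}, val MSE {val_errs[-1]:.3f}")
```

The training error can only shrink as the degree grows (each model class contains the simpler ones), while the validation error typically stops improving once the model starts fitting noise — the pattern the course formalizes.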
- Information on the setup of the exam as well as room and time slot assignment can be found here.
- The exam will be an auto-multiple-choice exam on paper (as last year). Please inform us of any special requests regarding a disability.
- In the first week’s tutorial sessions on 26 February, we will offer a review of background material required for the course, including a short recap of linear algebra, multivariate analysis, and probability theory.
- For programming background, we recommend knowing Python. If you have no experience with it, check out this Python tutorial.
- Access to the files, Zoom calls, and recordings of the tutorials and Q&A sessions is password protected. To obtain the password, you need to be inside the ETH network and click here. To establish a VPN connection, click here.
- All lectures are being recorded. The recordings will be made available within a day after the lecture on the ETH video portal here.
- We have set up a forum on Piazza. Please use it for questions regarding the course material, projects, and organization.
- If this course is compulsory for your study program (Kernfach), you are able to register irrespective of the waiting list. Please allow some time for the transfer from the waiting list.
- The project is a mandatory part of the examination. Without achieving a passing grade (4), you are not allowed to sit the final examination.
- Distance examination is allowed, but you need to file an official request via the study administration. We do not handle these requests.
- Attendance at tutorials and lectures is not mandatory.
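As a rough yardstick for the Python background recommended above (an illustrative snippet of ours, not course material), you should be comfortable reading code like this — functions over lists, list comprehensions, and NumPy-style vectorized operations:

```python
import numpy as np

# Plain Python: mean squared error via a list comprehension
preds = [2.5, 0.0, 2.1]
targets = [3.0, -0.5, 2.0]
sq_errors = [(p - t) ** 2 for p, t in zip(preds, targets)]
mse = sum(sq_errors) / len(sq_errors)

# The same computation, vectorized with NumPy
mse_np = np.mean((np.array(preds) - np.array(targets)) ** 2)

print(mse, mse_np)
```

If both versions read naturally to you, the programming side of the course should pose no problem; otherwise the linked Python tutorial is a good starting point.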
Lecture Recordings

The video recordings of the lectures and tutorials are available on the ETH Videoportal. The recordings are password protected; use your ETH credentials to access them.
| Date | Lecture | Tutorial | Exercises |
|---|---|---|---|
| Tue 23.2. | Introduction: Slides, Annotations | Slides, Recording I, Recording II | – |
| Wed 24.2. | Regression I: Slides, Annotations | – | – |
| Tue 02.3. | Regression II: Slides, Annotations, Notes | Slides, Recording, Code | Exercises |
| Wed 03.3. | Model Selection: Slides, Annotations | – | – |
| Tue 09.3. | Optimization: Slides, Annotations | Slides, Recording | Solution |
| Wed 10.3. | Classification: Slides, Annotations | – | – |
| Tue 16.3. | Other Metrics: Slides, Annotations | Slides, Recording | Exercises |
| Wed 17.3. | Kernels I: Slides, Annotations | – | – |
| Tue 23.3. | Kernels II: Slides, Annotations | Slides, Recording | Solution |
| Wed 24.3. | Neural Networks I: Slides, Annotations | – | – |
| Tue 30.3. | Neural Networks II: Slides, Annotations | Slides, Recording | Exercises |
| Wed 31.3. | Neural Networks III: Slides, Annotations | – | – |
| Tue 13.4. | Neural Networks IV: Slides, Annotations | Recording, Code 1 2 3 | Solution |
| Wed 14.4. | Clustering: Slides, Annotations | – | – |
| Tue 20.4. | Dimensionality Reduction I: Slides, Annotations | Slides, Recording | Exercises |
| Wed 21.4. | Dimensionality Reduction II: Slides, Annotations | – | – |
| Tue 27.4. | Decision Theory: Slides, Annotations | Slides, Recording | Solution |
| Wed 28.4. | Maximum Likelihood Estimation: Slides, Annotations | – | – |
| Tue 04.5. | Bootstrapping & Uncertainty Quantification: Slides, Annotations | Slides, Recording | Exercises |
| Wed 05.5. | Bayesian Viewpoint: Slides, Annotations | – | – |
| Tue 11.5. | Gaussian Mixture Models I: Slides, Annotations | Slides, Recording | Solution |
| Wed 12.5. | Gaussian Mixture Models II: Slides, Annotations | – | – |
| Tue 18.5. | Gaussian Mixture Models III: Slides, Annotations | Slides, Recording | Exercises |
| Wed 19.5. | Generative Adversarial Networks: Slides, Annotations | – | – |
| Tue 25.5. | Question and Answer: Slides | Recording, Slides, Code 1 2 3 | Solution |
| Wed 26.5. | Question and Answer: Slides | – | – |
| Tue 01.6. | Question and Answer: Slides | Slides, Recording | – |
| Wed 02.6. | no class | – | – |
| Role | Details |
|---|---|
| Instructors | Prof. Andreas Krause and Prof. Fanny Yang |
| Head TA | Charlotte Bunne |
| Assistants | Alex Tifrea, Andisheh Amrollahi, Andrii Zadaianchuk, Carl-Johann Simon-Gabriel, Chris Wendler, Cristina Pinneri, David Lindner, Fangjinhua Wang, Gideon Dresdner, Hugo Yeche, Joanna Ficek, Jonas Gehring, Jonas Rothfuss, Kjong Lehmann, Laurie Prelot, Lenart Treven, Max Paulus, Mohammad Reza Karimi, Mojmír Mutný, Nicolo Ruggeri, Niels Gleinig, Olga Mineeva, Scott Sussex, Sebastian Curi, Seyedmorteza Sadat, Stefan Stark, Vignesh Ram Somnath, Vincent Fortuin, Ya-Ping Hsieh |
| Mailing List | Please use the Piazza forum for questions regarding course material, organization, and projects. To send private questions to the instructors, use Piazza's private thread function. If this does not work for your request, please ask your question in the tutorial webinar. |
Questions & Answers
Problem Sets

Homeworks will be distributed electronically on the Moodle platform. They are intended for you to practice concepts; your performance on the homeworks will in no way affect your final grade. Homeworks are published bi-weekly, with solutions following one week later or visible directly after you enter your answers in Moodle.
Project

The code projects require solving machine learning problems with methods taught in the course. You may work in groups of 1–3 students, but it is your responsibility to find a group; you can search for teammates by posting on Piazza. Each assignment requires handing in the solution code as well as a short report. In total, there will be 6 code assignments. The first assignment is ungraded and lets you become familiar with our code submission workflow. The remaining projects are graded (pass/fail) and mandatory for passing the Introduction to Machine Learning course. You can find the tentative project schedule and further details in the project information sheet [pdf]. The projects can be accessed and submitted on our project server https://project.las.ethz.ch/. You will need to be in the ETH network or use the VPN to access the server.
Q & A Project Sessions: [Task 2], [Task 3], [Task 4]
Demos

The demos are based on Jupyter notebooks (with Python 3). Please see this intro for installation and running instructions. We recommend creating a conda environment to maintain the code base. The demos are hosted on GitLab; if you do not have access, please request it.
Performance Assessment

The final grade is the weighted average of the session examination (70%) and the code project (30%). As a compulsory continuous performance assessment task, the project must be passed on its own. The coding projects are an integral part of the course (60 hours of work, 2 credits), and participation is mandatory. To be eligible for the examination of Introduction to Machine Learning (252-0220-00L), you need to pass the code projects, i.e., attain an overall project grade of 4 or higher. Students who do not pass the project are required to de-register from the exam and will otherwise be treated as a no-show.
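The weighted average above can be made concrete with a small hypothetical helper (ours, not an official calculator; any ETH rounding rules for final grades are not modeled):

```python
def final_grade(exam: float, project: float) -> float:
    """Weighted average per the course rules: 70% exam, 30% project.

    Assumes both grades are on the Swiss 1-6 scale and that the project
    was passed (grade >= 4), which is required to sit the exam at all.
    """
    if not (1.0 <= exam <= 6.0 and 1.0 <= project <= 6.0):
        raise ValueError("grades must be on the 1-6 scale")
    if project < 4.0:
        raise ValueError("project must be passed (grade >= 4) to sit the exam")
    return 0.7 * exam + 0.3 * project

print(final_grade(5.0, 4.5))  # 0.7 * 5.0 + 0.3 * 4.5 = 4.85 (up to float rounding)
```

For example, an exam grade of 5.0 with a project grade of 4.5 averages to 4.85 before any official rounding.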
For the final exam, you can bring two A4 pages (i.e., one A4 sheet of paper), either handwritten or typed in at least 11-point font. A simple non-programmable calculator is allowed during the exam. The exam will be multiple choice. Here you can find an example of the question types as well as how to fill out the answer sheet to guarantee successful automatic grading.
Exam Review Sessions: [Recording I, Recording II][Slides I, Slides II, Slides III]
Exam 2015 Exam 2016 Exam 2017 Exam 2018 Exam 2019 Exam 2020, Sol 2020 Exam 2021, Sol 2021
- M. P. Deisenroth, A. A. Faisal, and C. S. Ong. Mathematics for Machine Learning. Cambridge University Press, 2020.
- K. Murphy. Machine Learning: a Probabilistic Perspective. MIT Press, 2012.
- C. Bishop. Pattern Recognition and Machine Learning. Springer, 2007. (optional)
- T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
- L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.