Learning, Classification, and Compression
Offered in:
- Data Science Master: Elective Core Courses
- Doctorate, Department of Information Technology and Electrical Engineering: Course Offerings for Doctoral and Postdoctoral Studies
- Electrical Engineering and Information Technology Master: Specialization Courses
- Electrical Engineering and Information Technology Master: Recommended Courses
- Mathematics Bachelor: Selection: Further Areas
- Mathematics Master: Selection: Further Areas
- Physics Master: General Electives
- Statistics Master: Statistical and Mathematical Courses
- Statistics Master: Subject-Specific Electives
Basic Information:
- Lecture: Wednesday, 09:15-11:00, HG D 3.2, live broadcast on the ETH video portal. The first lecture takes place on Wednesday, 23 Feb. 2022, 09:15-11:00.
- Discussion session: Wednesday, 11:15-12:00, HG D 3.2, live broadcast on the ETH video portal. The first discussion session takes place on Wednesday, 02 Mar. 2022, 11:15-12:00.
- Office hours: Wednesday, 15:15-16:15 via Zoom. The first office hour takes place on Wednesday, 02 Mar. 2022, 15:15-16:15.
- Zoom links: The Zoom link for the office hours can be found at this page (access credentials are the same as for the lecture/exercise notes).
- Lecture recordings: The recordings of the lecture and discussion session can be found at this page (access credentials are the same as for the lecture/exercise notes).
- Instructor: Dr. Erwin Riegler
- Teaching assistants: Hongruyu Chen, Stefan Stojanovic
- Lecture notes: Detailed lecture notes will be made available as we go along.
- Prerequisites: This course is aimed at students with a solid background in measure theory and linear algebra and basic knowledge of functional analysis.
- Credits: 4 ECTS credits.
- Course structure: The class will be taught in English. There will be a 180-minute written exam in English.
Course Information:
The course focuses on a theoretical treatment of learning theory and classification, together with an introduction to lossy and lossless compression for general sets and measures. We mainly take a probabilistic approach, in which an underlying distribution must be learned or compressed. The concepts acquired in the course are of broad and general interest in data science.
After attending this lecture and participating in the exercise sessions, students will have acquired a working knowledge of learning theory, classification, and compression.
News
We will post important announcements, links, and other information here in the course of the semester, so please check back often!
- Added Lemma 1 and Theorem 13 to the lecture notes (version 07.04.2022); both address a question raised by a student.
- The updated version of the lecture notes (version 24.05.2022) contains more details regarding the computation of the offset for SVMs in the nonseparable case.
- There will be no discussion session on June 1st. Instead, we offer a question hour for students. Location and time are the same as for the discussion session.
Content of the Course:
- Learning Theory:
- Framework of Learning
- Hypothesis Spaces and Target Functions
- Reproducing Kernel Hilbert Spaces
- Bias-Variance Tradeoff
- Estimation of Sample and Approximation Error
- Classification:
- Binary Classifier
- Support Vector Machines (separable case)
- Support Vector Machines (nonseparable case)
- Kernel Trick (see the sketch after this list)
- Lossy and Lossless Compression:
- Basics of Compression
- Compressed Sensing for General Sets and Measures
- Quantization and Rate Distortion Theory for General Sets and Measures
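As a rough pointer to the classification part (in particular the kernel-trick item above), the following is the standard textbook formulation of the hard-margin support vector machine and its kernelized dual; the precise formulation and notation used in the lecture notes may differ.

```latex
% Hard-margin SVM (separable case): given training data
% (x_1, y_1), ..., (x_m, y_m) with x_i in R^d and y_i in {-1, +1},
% the maximum-margin hyperplane solves
\begin{align*}
  \min_{w \in \mathbb{R}^d,\; b \in \mathbb{R}}
    \ & \tfrac{1}{2}\|w\|^2 \\
  \text{subject to}
    \ & y_i \bigl(\langle w, x_i \rangle + b\bigr) \ge 1,
    \quad i = 1, \dots, m.
\end{align*}
% The Lagrange dual depends on the data only through the inner
% products <x_i, x_j>; replacing them by a kernel k(x_i, x_j)
% (the "kernel trick") turns the linear classifier into a
% nonlinear one:
\begin{align*}
  \max_{\alpha \in \mathbb{R}^m}
    \ & \sum_{i=1}^m \alpha_i
      - \tfrac{1}{2} \sum_{i,j=1}^m
        \alpha_i \alpha_j \, y_i y_j \, k(x_i, x_j) \\
  \text{subject to}
    \ & \alpha_i \ge 0 \ \text{for all } i,
    \quad \sum_{i=1}^m \alpha_i y_i = 0.
\end{align*}
```

Here $k$ is a positive-definite kernel, e.g. the Gaussian kernel $k(x, x') = \exp(-\|x - x'\|^2 / (2\sigma^2))$; reproducing kernel Hilbert spaces, covered in the learning-theory part, provide the rigorous framework for this substitution.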
Lecture Notes:
Where we stopped in each lecture:
- Lecture 23.02.2022: We stopped at the end of page 9 in the lecture notes.
- Lecture 02.03.2022: We stopped at the end of Section 1.2 in the lecture notes.
- Lecture 09.03.2022: We stopped at the end of Section 1.3 in the lecture notes.
- Lecture 16.03.2022: We stopped at the end of Example 1 in the lecture notes.
- Lecture 23.03.2022: We stopped at the end of Example 2 in the lecture notes.
- Lecture 30.03.2022: We stopped at the end of Lemma 2 in the lecture notes.
- Lecture 06.04.2022: We stopped at the end of Theorem 4 in the lecture notes.
- Lecture 13.04.2022: We stopped at the end of Section 1.4 in the lecture notes.
- Lecture 27.04.2022: We stopped at the beginning of the proof of Theorem 7 in the lecture notes.
- Lecture 04.05.2022: We stopped at the beginning of Section 2.1.1 in the lecture notes.
- Lecture 11.05.2022: We stopped at the end of Lemma 12 in the lecture notes.
- Lecture 18.05.2022: We stopped at the middle of page 83 in the lecture notes.
Corrections:
- A list of corrections for the lecture notes and the problem sets can be downloaded here.
Problem sets and solutions
There will be several problem sets for this course, which will help you better understand the lectures and prepare for the exam. All problem sets will be discussed in the discussion sessions, and the solutions will be uploaded afterwards.
Material relevant for the exam:
- Lecture notes: The appendices and the addendum are excluded. Required results from the addendum will be provided via handouts at the exam.
- All problem sets.
- The slides on lossy compression are excluded.
Note that you are not required to learn proofs by heart or to recite key steps in proofs. You are, however, expected to understand the main ideas/techniques/concepts used in the proofs in the relevant material listed above.
Previous years' exams and solutions
- Summer Exam 2021: Problems | Solutions | Handout
- Summer Exam 2022: Problems | Solutions | Handout