- Data Science Master: Elective Core Courses
- Doctorate, Department of Information Technology and Electrical Engineering: Course Offerings for Doctoral and Postdoctoral Students
- Electrical Engineering and Information Technology Master: Specialization Courses
- Electrical Engineering and Information Technology Master: Recommended Courses
- Mathematics Bachelor: Selection: Further Fields
- Mathematics Master: Selection: Further Fields
- Physics Master: General Electives
- Statistics Master: Statistical and Mathematical Courses
- Statistics Master: Subject-Specific Electives

- Lecture: Wednesday, 10:15-12:00, HG D 3.2. The first lecture takes place on Wednesday, 22 Feb. 2023, 09:15-12:00.
- Discussion session: Wednesday, 09:15-10:00, HG D 3.2. The first discussion session takes place on Wednesday, 01 Mar. 2023, 09:15-10:00.
- Office hours: Friday, 10:15-11:00 via Zoom. The first office hour takes place on Friday, 03 Mar. 2023, 10:15-11:00.
- Zoom links: The Zoom link for the office hours can be found at this page (access credentials are the same as for the lecture/exercise notes).
- Instructor: Dr. Erwin Riegler
- Teaching assistant: Alex Bühler
- Lecture notes: Detailed lecture notes will be made available as we go along.
- Prerequisites: This course is aimed at students with a solid background in measure theory and linear algebra and basic knowledge of functional analysis.
- Credits: 4 ECTS credits.
- Course structure: The class will be taught in English. There will be a 180-minute written exam in English.

The course focuses on a theoretical treatment of learning theory and classification, together with an introduction to lossy and lossless compression for general sets and measures. We mainly take a probabilistic approach, in which an underlying distribution must be learned or compressed. The concepts acquired in this course are of broad and general interest in the data sciences.
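As a small illustrative sketch of the lossless side of this probabilistic picture (not part of the course material; the distribution `p` below is a toy example): when the underlying distribution is known, the Shannon entropy lower-bounds the average code length of any prefix code, and a Huffman code attains it exactly for dyadic distributions.

```python
import heapq
import math

def entropy(p):
    # Shannon entropy in bits: the fundamental limit for lossless compression.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    # Codeword lengths of a Huffman code for the distribution p:
    # repeatedly merge the two least probable groups; each merge
    # adds one bit to every codeword in the merged groups.
    lengths = [0] * len(p)
    heap = [(pi, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

p = [0.5, 0.25, 0.125, 0.125]          # dyadic toy distribution
L = huffman_lengths(p)
avg = sum(pi * li for pi, li in zip(p, L))
print(entropy(p), avg)  # 1.75 1.75 (for dyadic p, Huffman meets the entropy)
```

For non-dyadic distributions the average Huffman code length exceeds the entropy, but by less than one bit per symbol.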

After attending this lecture and participating in the exercise sessions, students will have acquired a working knowledge of learning theory, classification, and compression.

We will post important announcements, links, and other information here in the course of the semester, so please check back often!

- The lectures will take place from 10:15-12:00 and the discussion sessions from 09:15-10:00.
- On March 1st, there is only the discussion session from 09:15-10:00; there will be no lecture.
- On May 31st, there is no discussion session and no lecture. Instead, we offer a question hour from 10:30-12:00 in HG D 3.2.

- Learning Theory:
  - Framework of Learning
  - Hypothesis Spaces and Target Functions
  - Reproducing Kernel Hilbert Spaces
  - Bias-Variance Tradeoff
  - Estimation of Sample and Approximation Error
- Classification:
  - Binary Classifiers
  - Support Vector Machines (separable case)
  - Support Vector Machines (nonseparable case)
  - Kernel Trick
- Lossy and Lossless Compression:
  - Basics of Compression
  - Compressed Sensing for General Sets and Measures
  - Quantization and Rate-Distortion Theory for General Sets and Measures
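As a minimal sketch of the kernel trick listed above (an illustration, not taken from the lecture notes): for the degree-2 polynomial kernel k(x, y) = (⟨x, y⟩ + 1)², evaluating the kernel directly agrees with an inner product under an explicit six-dimensional feature map. This identity is what lets kernel methods such as support vector machines operate in a feature space without ever constructing it.

```python
import numpy as np

def poly_kernel(x, y):
    # Degree-2 polynomial kernel: k(x, y) = (<x, y> + 1)^2.
    return (x @ y + 1.0) ** 2

def phi(x):
    # Explicit feature map for 2-D inputs whose inner products
    # <phi(x), phi(y)> reproduce poly_kernel(x, y).
    x1, x2 = x
    return np.array([x1 ** 2, x2 ** 2,
                     np.sqrt(2) * x1 * x2,
                     np.sqrt(2) * x1,
                     np.sqrt(2) * x2,
                     1.0])

rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)
print(poly_kernel(x, y), phi(x) @ phi(y))  # the two values agree
```

For richer kernels (e.g. the Gaussian/RBF kernel) the corresponding feature space is infinite-dimensional, so the kernel evaluation is the only practical route; this is the setting in which reproducing kernel Hilbert spaces appear in the course outline.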


- Lecture Notes Part I - Learning and Classification (Version 24.08.2022)
- Problems + Solutions (Version 24.08.2022)
- Slides on Lossy Compression

- Lecture 22.02.2023: We stopped at the end of Section 1.2 in the lecture notes.
- Lecture 08.03.2023: We stopped at the middle of page 23 in the lecture notes.
- Lecture 15.03.2023: We stopped after the proof of Proposition 3 in the lecture notes.
- Lecture 22.03.2023: We stopped at the end of Section 1.4.1 in the lecture notes.
- Lecture 29.03.2023: We stopped after the proof of Theorem 5 in the lecture notes.
- Lecture 05.04.2023: We stopped after the proof of Theorem 6 in the lecture notes.
- Lecture 19.04.2023: We stopped after the proof of Lemma 10 in the lecture notes.
- Lecture 26.04.2023: We stopped in the middle of page 80 in the lecture notes.
- Lecture 03.05.2023: We stopped at the end of page 83 in the lecture notes.
- Lecture 10.05.2023: We stopped after the proof of Lemma 14 in the lecture notes.
- Lecture 17.05.2023: We stopped on page 12 in the slides on lossy compression.

There will be several problem sets for this course, which will help you better understand the lectures and prepare for the exam (see the download link Problems + Solutions above). The following problems will be discussed in the discussion sessions:

| Discussion session | Problem |
| --- | --- |
| 01.03.2023 | Problem 1 |
| 08.03.2023 | Problem 2 |
| 15.03.2023 | Problem 3 |
| 22.03.2023 | Problem 4 |
| 29.03.2023 | Problem 5 |
| 05.04.2023 | Problem 6 |
| 19.04.2023 | Problem 7 |
| 26.04.2023 | Problem 8 |
| 03.05.2023 | Problem 9 |
| 10.05.2023 | Problem 10 |
| 17.05.2023 | Problem 11 |
| 24.05.2023 | Problem 12 |

- Lecture notes: The appendices and the addendum are excluded. Required results from the addendum will be provided via handouts at the exam.
- All problem sets.
- The slides on lossy compression are excluded.

Note that you are not required to learn proofs by heart or to recite key steps in proofs. You are, however, expected to understand the main ideas/techniques/concepts used in the proofs in the relevant material listed above.

- Summer Exam 2021: Problems | Solutions | Handout
- Summer Exam 2022: Problems | Solutions | Handout