Mathematics of Information
- Data Science Master: Information and Learning
- Doctoral and Post-Doctoral Studies: Department of Information Technology and Electrical Engineering
- Electrical Engineering and Information Technology Master: Core subjects (Kernfächer), Specialization courses (Vertiefungsfächer), Advanced core courses
- Mathematics Master: Selection: Further Areas (Auswahl: Weitere Gebiete)
- Physics Master: General Electives (Allgemeine Wahlfächer)
- Quantum Engineering Master: Electives (Wahlfächer)
- Computational Science and Engineering Master: Electives (Wahlfächer)
- Statistics Master: Statistical and Mathematical Courses (Statistische und mathematische Fächer)
- Lecture: Thursday, 9:15-12:00, live broadcast on the ETH video portal. The first lecture takes place on Thursday, 25 Feb 2021, 9:15-12:00.
- Discussion session: Monday, 14:15-16:00, live broadcast on the ETH video portal. The first discussion session takes place on Monday, 1 Mar 2021, 14:15-16:00.
- Instructor: Prof. Dr. Helmut Bölcskei
- Teaching assistants: Thomas Allard, Recep Gül
- Office hours: Monday, 16:15-17:15, via Zoom. Please contact the TAs for their Zoom IDs.
- Lecture notes: Detailed lecture and exercise notes and problem sets with documented solutions will be made available as we go along.
- Credits: 8 ECTS credits
Lecture recordings: Recordings of each lecture will be available on the ETH video portal the following morning.
- The class will be taught in English. The written exam will be in English and last 180 minutes.
- Admission to the exam is contingent on the successful completion of a literature review project or a computational project. The project can be carried out either individually or in groups of up to six students and consists of one of the following: (i) software development for the solution of a practical signal processing or machine learning problem, (ii) the analysis of a research paper, or (iii) a theoretical research problem of suitable complexity. The outcomes of all projects must be presented to the entire class at the end of the semester.
- Students are welcome to propose their own project. The project proposal should take the form of a .pdf file containing the names of the students who would work on the project, a description of the project, and a short paragraph on how it relates to the content of the course. The deadline for submitting project proposals to the teaching assistants is Sunday, 28 March 2021. After this deadline, we will send out a form with further topics, on which students who have not proposed their own topic can sign up for a project.
We will post important announcements, links, and other information here in the course of the semester, so please check back often!
The class focuses on mathematical aspects of information science and learning theory.
- Mathematics of Information:
- Signal representations: Frame theory, wavelets, Gabor expansions, sampling theorems, density theorems
- Sparsity and compressed sensing: Sparse linear models, uncertainty relations in sparse signal recovery, super-resolution, spectrum-blind sampling, subspace algorithms (ESPRIT), estimation in the high-dimensional noisy case, Lasso
- Dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma
- Mathematics of Learning:
- Approximation theory: Nonlinear approximation theory, best M-term approximation, greedy algorithms, fundamental limits on compressibility of signal classes, Kolmogorov-Tikhomirov epsilon-entropy of signal classes, optimal compression of signal classes
- Uniform laws of large numbers: Rademacher complexity, Vapnik-Chervonenkis dimension, classes with polynomial discrimination
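The dimensionality-reduction topic above can be illustrated numerically. The following sketch (with parameters chosen arbitrarily for illustration, not taken from the course material) draws a scaled random Gaussian projection and checks that pairwise Euclidean distances are approximately preserved, in the spirit of the Johnson-Lindenstrauss Lemma:

```python
import numpy as np

# Illustrative parameters (not from the course material):
# ambient dimension d, target dimension k, number of points n.
rng = np.random.default_rng(0)
d, k, n = 1000, 200, 20

X = rng.standard_normal((n, d))               # n points in R^d
A = rng.standard_normal((k, d)) / np.sqrt(k)  # scaled Gaussian projection
Y = X @ A.T                                   # projected points in R^k

# Relative distortion of each pairwise distance under the projection
distortions = [
    abs(np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j]) - 1)
    for i in range(n) for j in range(i + 1, n)
]
max_distortion = max(distortions)
print(f"max relative distortion: {max_distortion:.3f}")
```

With these parameters the maximum relative distortion is small, even though the projection is oblivious to the data; the lemma makes this quantitative, with the target dimension depending only logarithmically on the number of points.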
H. Bölcskei and A. Bandeira
This course is aimed at students with a background in linear algebra, probability, and basic functional analysis. In particular, familiarity with Hilbert spaces is expected at the level of the "Hilbert spaces" chapter posted below (excluding the appendices).
- Lecture notes
- Hilbert spaces
- Fourier transform
First chapter of the notes for the discussion sessions.
- Orthonormal wavelets
Second chapter of the notes for the discussion sessions.
- Compressive sensing
Third chapter of the notes for the discussion sessions.
- Metric entropy of Lipschitz function classes
Fourth chapter of the notes for the discussion sessions.
- iPad notes of lecture of 25.02.2021
- iPad notes of lecture of 04.03.2021
- iPad notes of lecture of 11.03.2021
- iPad notes of lecture of 18.03.2021
- iPad notes of lecture of 25.03.2021
- iPad notes of lecture of 01.04.2021
- iPad notes of lecture of 15.04.2021
- iPad notes of lecture of 22.04.2021
- iPad notes of lecture of 29.04.2021
- iPad notes of lecture of 06.05.2021
- iPad notes of lecture of 20.05.2021
- iPad notes of lecture of 27.05.2021
- iPad notes of lecture of 03.06.2021
There will be 5 homework assignments. You may hand in your solutions to receive feedback from us, but turning in solutions is not mandatory. Complete solutions to the homework assignments will be posted on the course web page.
Homework Problem Sets
| Problem set | Solutions |
|---|---|
| Homework 1 | Solutions to Homework 1 |
| Homework 2 | Solutions to Homework 2 |
| Homework 3 | Solutions to Homework 3 |
| Homework 4 | Solutions to Homework 4 |
| Homework 5 | Solutions to Homework 5 |
- 2018 exam
- Solutions to the 2018 exam
- 2019 exam
- Solutions to the 2019 exam
- 2020 exam
- Solutions to the 2020 exam
- Winter 2020/2021 exam
- Solutions to the Winter 2020/2021 exam
- 2021 exam
- Solutions to the 2021 exam
Here is the material that you have to prepare for the exam:
- frame theory (Sections 1.1, 1.2, 1.3, 1.4.1 and 1.4.2, except the proofs of Theorems 1.15, 1.17, 1.18 points 2 and 3, 1.21 and 1.35 in the lecture notes plus the entire 'Fourier transform' discussion session);
- uncertainty relations (Sections 2.1 and 2.2, except Section 2.2.1, and 2.6 in the lecture notes);
- compressive sensing (Sections 3.1, 3.2, 3.3, and 3.4 in the lecture notes and Sections 1, 2, 3, and 4 in the 'Compressive sensing' discussion session);
- finite rate of innovation (entire Chapter 4 in the lecture notes);
- sampling of multi-band signals (entire Chapter 5 in the lecture notes);
- the ESPRIT algorithm (entire Chapter 6 in the lecture notes);
- the Johnson-Lindenstrauss lemma and the RIP (entire Chapter 8 except the proof of Lemma 8.2 and entire Chapter 9 in the lecture notes);
- approximation theory (entire Chapter 10 in the lecture notes and the entire 'metric entropy of Lipschitz function classes' discussion session).
If you want to go into more depth or if you need additional background material, please check out these books:
- S. Mallat, "A wavelet tour of signal processing: The sparse way", 3rd ed., Elsevier, 2009
- M. Vetterli and J. Kovačević, "Wavelets and subband coding", Prentice Hall, 1995
- I. Daubechies, "Ten lectures on wavelets", SIAM, 1992
- O. Christensen, "An introduction to frames and Riesz bases", Birkhäuser, 2003
- K. Gröchenig, "Foundations of time-frequency analysis", Springer, 2001
- M. Elad, "Sparse and redundant representations — From theory to applications in signal and image processing", Springer, 2010
- M. Vetterli, J. Kovačević, and V. K. Goyal, "Foundations of signal processing", 3rd ed., Cambridge University Press, 2014
- S. Foucart and H. Rauhut, "A mathematical introduction to compressive sensing", Springer, 2013
- M. J. Wainwright, "High-dimensional statistics: A non-asymptotic viewpoint", Vol. 48, Cambridge University Press, 2019
- R. Vershynin, "High-dimensional probability: An introduction with applications in data science", Vol. 47, Cambridge University Press, 2018