Neural Network Theory
- Data Science Master: Information and Learning
- Electrical Engineering and Information Technology Master: Core Courses (Kernfächer)
- Electrical Engineering and Information Technology Master: Advanced Core Courses
- Electrical Engineering and Information Technology Master: Specialization Courses (Vertiefungsfächer)
- Electrical Engineering and Information Technology Master: Recommended Subjects (Empfohlene Fächer)
- Computer Science Master: Computer Science Elective Courses
- Mathematics Master: Selection: Further Realms (Auswahl: Weitere Gebiete)
- Physics Master: General Electives (Allgemeine Wahlfächer)
- Computational Science and Engineering Master: Electives (Wahlfächer)
- Statistics Master: Statistical and Mathematical Courses (Statistische und mathematische Fächer)
|Lecture:||Tuesday, 10:15-12:00, HG F 5, with live broadcast on the ETH video portal (live). The projected screen and audio will be broadcast and recorded; the recordings will be available on the following morning on the ETH video portal (lectures).|
|Discussion session:||Tuesday, 12:15-13:00, via Zoom. The Zoom meeting link and the recordings can be found on this page (access credentials are the same as for the lecture/exercise notes).|
|Instructors:||Prof. Dr. Helmut Bölcskei|
|Teaching assistants:||Weigutian Ou, Dennis Elbrächter|
|Office hours:||Tuesday, 16:00-17:00, via Zoom. Please contact the TAs for their Zoom ID.|
|Lecture notes:||The download link is provided below.|
|Credits:||4 ECTS credits|
|Course structure:||The class will be taught in English. There will be a 180-minute written exam in English.|
We will post important announcements, links, and other information here in the course of the semester.
- The document "Lecture notes on VC dimension" is available for download.
- The first lecture will take place on Tuesday, Sept. 21st, 10:15-13:00 (please note 3 hours of lecture!). The first discussion session will take place on Tuesday, Sept. 28th, 12:15-13:00.
The class focuses on fundamental mathematical aspects of neural networks with an emphasis on deep networks.
- Universal approximation with single- and multi-layer networks
- Introduction to approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, non-linear approximation theory
- Fundamental limits of deep neural network learning
- Geometry of decision surfaces
- Separating capacity of nonlinear decision surfaces
- Vapnik-Chervonenkis (VC) dimension
- VC dimension of neural networks
- Generalization error in neural network learning
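To give a flavor of one of the topics above, here is a minimal sketch of the shattering idea behind the VC dimension (the `shatters` helper and the threshold grid are illustrative only, not course code): the class of one-dimensional threshold classifiers h_t(x) = 1[x ≥ t] shatters any single point but no pair of points, so its VC dimension is 1.

```python
# Illustrative sketch (not part of the official course material):
# threshold classifiers on the real line have VC dimension 1.

def shatters(points, hypotheses):
    """True iff `hypotheses` realizes every binary labeling of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

# A small grid of thresholds suffices to realize every labeling that
# any threshold can realize on the sample points used below.
thresholds = [lambda x, t=t: int(x >= t) for t in (-2, -1, 0.5, 2)]

print(shatters([0.0], thresholds))       # True: a single point is shattered
print(shatters([0.0, 1.0], thresholds))  # False: the labeling (1, 0) is unreachable
```

Since no two points can be shattered (a threshold that labels the left point 1 must also label the right point 1), the VC dimension of this class is exactly 1.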
The course is aimed at students with a strong mathematical background in general, and in linear algebra, analysis, and probability theory in particular.
Handwritten notes
Here we will post the notes written on the iPad during the lectures.
Problem sets and solutions
There will be several problem sets for this course, which will help you better understand the lectures and prepare you for the exam. All the problem sets will be discussed in the exercise session, and the solutions will be uploaded afterwards.
Parts of the notes relevant for the exam:
- Sections 2.2, 2.3, 2.4, 2.5, 2.6, 2.7.
- All of the handwritten notes on the second part of the class.
- All problem sets.
|Winter Exam 2020:||Problems||Solutions|
|Summer Exam 2020:||Problems||Solutions|
|Winter Exam 2021:||Problems||Solutions|
|Summer Exam 2021:||Problems||Solutions|
|Winter Exam 2022:||Problems||Solutions|