We work on the mathematical foundations of information science. Current research topics include
- Machine learning theory: Fundamental limits of deep neural network learning, performance analysis of stochastic gradient descent algorithms, mathematical theory of generative adversarial networks, learning of nonlinear dynamical systems, recurrent neural networks, reinforcement learning, time-varying scattering networks
- Mathematical signal processing: Compressed sensing, super-resolution, uncertainty relations, frame theory, sampling theory, harmonic analysis over high-dimensional point sets, inverse problems, system identification, subspace algorithms
- Data science: Lossless analog compression, fundamental limits of matrix completion, rate-distortion theory and quantization for general sets and measures, Kolmogorov rate-distortion theory, approximation theory
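As a small illustration of one of the topics above, compressed sensing asks when a sparse signal can be recovered from far fewer linear measurements than its ambient dimension. The sketch below is purely illustrative and not tied to any specific project of the group: it recovers a synthetic k-sparse vector from random Gaussian measurements using orthogonal matching pursuit; all sizes and names (`n`, `m`, `k`, `omp`) are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes: n-dimensional signal, m < n measurements, k nonzeros
n, m, k = 128, 60, 5

# k-sparse ground-truth signal with a random support
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Random Gaussian measurement matrix (columns scaled by 1/sqrt(m)) and measurements
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most correlated
    with the residual, then re-fit by least squares on the selected support."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        sol, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ sol
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = sol
    return x_hat

x_hat = omp(A, y, k)
print("relative recovery error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With m on the order of k log n Gaussian measurements, greedy methods like this one typically recover the sparse signal exactly; the fundamental limits of such recovery are one of the questions studied in mathematical signal processing.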
We also operate a signal processing laboratory; an overview of its projects is available on our lab page.
Downloads and reproducible research
Algorithms, software, and measurement data related to our research can be found on our downloads page.