Constructive universal high-dimensional distribution generation through deep ReLU networks

Authors

Dmytro Perekrestenko, Stephan Müller, and Helmut Bölcskei

Reference

Proc. of the 37th International Conference on Machine Learning (ICML), Vienna, Austria, July 2020.


Abstract

We present an explicit deep neural network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional Lipschitz-continuous target distribution. The key ingredient of our design is a generalization of the "space-filling" property of sawtooth functions discovered by Bailey & Telgarsky (2018). We elicit the importance of depth in our neural network construction for driving the Wasserstein distance between the target distribution and the approximation realized by the network to zero. An extension to output distributions of arbitrary dimension is outlined. Finally, we show that the proposed construction does not incur a cost, in terms of Wasserstein-distance error, relative to generating a d-dimensional target distribution from d independent random variables.
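
To make the abstract's central mechanism concrete, the following is a minimal numpy sketch (illustrative, not code from the paper) of the "space-filling" property of sawtooth functions: the tent map on [0, 1] is exactly realizable by a three-neuron ReLU layer, and pushing one-dimensional uniform noise z through the map z -> (z, g_s(z)), where g_s is the s-fold composition of the tent map, yields two-dimensional samples that approach the uniform distribution on the unit square as the depth s grows. All function names here are illustrative choices, and the 4x4 grid-deviation check is a crude stand-in for the Wasserstein distance used in the paper.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent(x):
    # Tent map on [0, 1], written as a single 3-neuron ReLU layer:
    # equals 2x on [0, 1/2] and 2 - 2x on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth(x, s):
    # s-fold composition g_s = g o ... o g, i.e. a depth-s ReLU network;
    # its graph has 2**(s-1) "teeth" on [0, 1].
    for _ in range(s):
        x = tent(x)
    return x

rng = np.random.default_rng(0)
z = rng.uniform(0.0, 1.0, size=200_000)  # one-dimensional uniform noise
for s in (1, 4, 10):
    # Dimensionality-increasing map: z -> (z, g_s(z)).
    xy = np.stack([z, sawtooth(z, s)], axis=1)
    # Coarse uniformity check: the empirical mass of each cell of a 4x4
    # grid over the unit square should approach 1/16 as s grows.
    hist, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=4,
                                range=[[0, 1], [0, 1]])
    dev = np.abs(hist / len(z) - 1.0 / 16).max()
    print(f"s = {s:2d}: max cell deviation from uniform = {dev:.4f}")

The sketch only covers the space-filling step; approximating a general Lipschitz-continuous target, rather than the uniform distribution on the square, additionally requires the transport layers constructed in the paper.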

Keywords

Neural networks, deep learning, generative networks, approximation theory, dimensionality increase


Copyright Notice: © 2020 D. Perekrestenko, S. Müller, and H. Bölcskei.

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.