Deep generative models for distribution-preserving lossy compression

Authors

Michael Tschannen, Eirikur Agustsson, and Mario Lucic

Reference

Neural Information Processing Systems (NeurIPS), 2018, to appear.


Abstract

We propose and study the problem of distribution-preserving lossy compression. Motivated by recent advances in extreme image compression, which make it possible to maintain artifact-free reconstructions even at very low bitrates, we propose to optimize the rate-distortion tradeoff under the constraint that the reconstructed samples follow the distribution of the training data. Such a compression system recovers both ends of the spectrum: at zero bitrate it learns a generative model of the data, and at sufficiently high bitrates it achieves perfect reconstruction. For intermediate bitrates, it smoothly interpolates between matching the distribution of the training data and perfectly reconstructing the training samples. We study several methods to approximately solve the proposed optimization problem, including a novel combination of the Wasserstein GAN and the Wasserstein Autoencoder, and present strong theoretical and empirical results for the proposed compression system.
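
To make the objective concrete, here is a minimal sketch in the abstract's terms (the notation is ours, not necessarily the paper's): given data X ~ P_X, an encoder F producing a code of at most R bits, and a decoder G, the system solves

\[ \min_{F,\,G} \ \mathbb{E}\big[ d\big(X,\, G(F(X))\big) \big] \quad \text{subject to} \quad G(F(X)) \sim P_X , \]

where d is a distortion measure. A natural way to handle the distribution constraint is to relax it into a penalty,

\[ \min_{F,\,G} \ \mathbb{E}\big[ d\big(X,\, G(F(X))\big) \big] + \lambda\, W\big(P_X,\, P_{G(F(X))}\big) , \]

with W a Wasserstein distance estimated adversarially, in the spirit of the Wasserstein GAN / Wasserstein Autoencoder combination mentioned above; the penalty weight \lambda and the specific choice of W are illustrative here, not taken from the paper.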

Keywords

Generative adversarial network, lossy compression, Wasserstein distance



Copyright Notice: © 2018 M. Tschannen, E. Agustsson, and M. Lucic.

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.