Compressive nonparametric graphical model selection for time series

Authors

Alexander Jung, Reinhard Heckel, Helmut Bölcskei, and Franz Hlawatsch

Reference

Proc. of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pp. 769-773, May 2014.

DOI: 10.1109/ICASSP.2014.6853700


Abstract

We propose a method for inferring the conditional independence graph (CIG) of a high-dimensional discrete-time Gaussian vector random process from finite-length observations. Our approach does not rely on a parametric model (such as an autoregressive model) for the vector random process; rather, it only assumes certain spectral smoothness properties. The proposed inference scheme is compressive in that it works for sample sizes that are (much) smaller than the number of scalar process components. We provide analytical conditions under which our method correctly identifies the CIG with high probability.
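
As a rough illustration of the nonparametric viewpoint (and not of the paper's compressive scheme), the sketch below estimates a CIG by thresholding the partial coherence obtained from a smoothed-periodogram estimate of the spectral density matrix; the local averaging over neighboring frequencies is where an assumed spectral smoothness enters. All function names, the ridge term, and the threshold are illustrative assumptions only; the method analyzed in the paper instead uses a LASSO-type multitask estimator to cope with sample sizes smaller than the number of process components.

    import numpy as np

    def smoothed_periodogram(x, bandwidth=5):
        """Smoothed-periodogram estimate of the spectral density matrix.

        x : array of shape (p, n) -- p scalar component processes, n time samples.
        Returns an array of shape (F, p, p) of Hermitian SDM estimates, one per
        rfft frequency.
        """
        p, n = x.shape
        X = np.fft.rfft(x - x.mean(axis=1, keepdims=True), axis=1)   # (p, F)
        per = np.einsum('pf,qf->fpq', X, X.conj()) / n               # rank-one periodograms
        # Local averaging over neighboring frequencies; this is where the assumed
        # spectral smoothness is exploited.
        F = per.shape[0]
        sm = np.empty_like(per)
        for f in range(F):
            lo, hi = max(0, f - bandwidth), min(F, f + bandwidth + 1)
            sm[f] = per[lo:hi].mean(axis=0)
        return sm

    def partial_coherence_graph(x, threshold=0.25, bandwidth=5):
        """Estimate the CIG adjacency matrix by thresholding partial coherence."""
        S = smoothed_periodogram(x, bandwidth)                        # (F, p, p)
        p = S.shape[-1]
        # Ridge term guards against rank deficiency when 2*bandwidth + 1 < p.
        K = np.linalg.inv(S + 1e-6 * np.eye(p))                       # inverse SDMs
        d = np.sqrt(np.einsum('fii->fi', K).real)
        pc = np.abs(K) / (d[:, :, None] * d[:, None, :])              # partial coherence
        A = pc.max(axis=0) > threshold                                # edge if large at some frequency
        np.fill_diagonal(A, False)
        return A

In the compressive regime targeted by the paper (sample size smaller than the number of components), plain inversion of the estimated spectral density matrix breaks down; this is precisely why the proposed method combines the nonparametric spectral estimate with a sparsity-promoting multitask-LASSO formulation. The sketch above is only a dense baseline for intuition.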

Keywords

Sparsity, graphical model selection, multitask learning, nonparametric time series, LASSO



Copyright Notice: © 2014 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.