Advances in Sparse Recovery Algorithms
Nonsmooth optimization plays a central role in sparse recovery, and the recent theory of compressed sensing has revitalized interest in such optimization problems. The extensive literature on sparse recovery algorithms can, in essence, be categorized into three main classes: convex (geometric), greedy (combinatorial), and probabilistic (variational/belief-propagation) algorithms. The focus of this minisymposium is to give an overview of recent algorithmic advances in sparse recovery. We compare and contrast the advantages and limitations of these seemingly different approaches, and identify potential new research avenues for their interaction. The session will gather talks by leading experts in the field.
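As an illustration of the convex (geometric) class mentioned above, the following is a minimal sketch of ℓ1-regularized least squares solved by iterative soft thresholding (ISTA), one standard convex sparse-recovery method; the problem sizes, regularization parameter, and iteration count are illustrative choices, not drawn from any of the talks.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=2000):
    # Iterative soft thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, with L the Lipschitz constant ||A||^2 of the gradient.
    L = np.linalg.norm(A, ord=2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny synthetic instance: a 3-sparse vector observed through a
# random Gaussian measurement matrix (illustrative dimensions).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x0 = np.zeros(100)
x0[[5, 37, 80]] = [1.0, -2.0, 1.5]
b = A @ x0
x_hat = ista(A, b, lam=0.01)
```

Greedy methods (e.g., hard thresholding, as in the fourth talk of Part I) instead enforce sparsity directly by keeping only the largest entries at each iteration, trading the convex guarantee for lower per-iteration cost.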
Organizers:
Jalal Fadili
CNRS-ENSICAEN-Université de Caen, France
Volkan Cevher
Ecole Polytechnique Fédérale de Lausanne, Switzerland
Part I of II
9:30-9:55 Graphical Models and Message Passing Algorithms to Solve Large Scale Regularized Regression Problems
Andrea Montanari, Stanford University, USA

10:00-10:25 Recovering Cosparse Vectors Using Convex vs. Greedy Algorithms
Remi Gribonval, IRISA-INRIA, France

10:30-10:55 Simple and Practical Algorithm for Sparse Fourier Transform
Piotr Indyk, Massachusetts Institute of Technology, USA

11:00-11:25 Hard Thresholding with Norm Constraints
Volkan Cevher, Ecole Polytechnique Fédérale de Lausanne, Switzerland
Part II of II
2:00-2:25 Generalized Forward-Backward Splitting for Sparse Recovery
Jalal Fadili, CNRS-ENSICAEN-Université de Caen, France

2:30-2:55 Advances in First-Order Methods: Constraints, Non-Smoothness and Faster Convergence
Stephen Becker, CNRS-UPMC, France

3:00-3:25 Constructing Test Instances for Sparse Recovery Algorithms
Christian Kruschel, TU Braunschweig, Germany