Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements
eScholarship
Open Access Publications from the University of California

Department of Mathematics

UC Davis


  • Author(s): Rudelson, Mark; Vershynin, Roman

Published Web Location

https://arxiv.org/pdf/math/0602559.pdf
No data is associated with this publication.
Abstract

We want to exactly reconstruct a sparse signal f (a vector in R^n of small support) from few linear measurements of f (inner products with some fixed vectors). A nice and intuitive reconstruction by Linear Programming has been advocated since the 1980s by David Donoho and his collaborators. Namely, one can relax the reconstruction problem, which is highly nonconvex, to a convex problem -- and, moreover, to a linear program. However, when exactly the reconstruction problem is equivalent to its convex relaxation is an open question. Recent work of many authors shows that the number of measurements k(r,n) needed to exactly reconstruct any r-sparse signal f of length n (a vector in R^n of support r) from its linear measurements by the convex relaxation method is usually O(r polylog(n)). However, known estimates of the number of measurements k(r,n) involve huge constants, in spite of the very good performance of the algorithms in practice. In this paper, we consider random Gaussian measurements and random Fourier measurements (a frequency sample of f). For Gaussian measurements, we prove the first guarantees with reasonable constants: k(r,n) < 12 r (2 + log(n/r)), which is optimal up to constants. For Fourier measurements, we prove the best known bound k(r,n) = O(r log(n) · log^2(r) · log(r log n)), which is optimal within the log log n and log^3 r factors. Our arguments are based on techniques of Geometric Functional Analysis and Probability in Banach spaces.
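The convex relaxation described in the abstract can be sketched as a small numerical experiment: minimize ||x||_1 subject to Ax = b, recast as a linear program via the standard splitting x = u - v with u, v >= 0. This is a minimal illustration using SciPy's LP solver, not the paper's construction; the problem sizes here are illustrative and deliberately smaller than the paper's guaranteed bound k(r,n) < 12 r (2 + log(n/r)).

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, r, k = 100, 4, 40  # signal length, sparsity, number of measurements (illustrative)

# Build an r-sparse signal f in R^n.
f = np.zeros(n)
support = rng.choice(n, size=r, replace=False)
f[support] = rng.normal(size=r)

# Random Gaussian measurements: b = A f, with A a k x n Gaussian matrix.
A = rng.normal(size=(k, n))
b = A @ f

# Convex relaxation: min ||x||_1 subject to A x = b.
# As a linear program with x = u - v, u >= 0, v >= 0:
#   min 1^T u + 1^T v   s.t.   A u - A v = b.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

# For k well above the sparsity threshold, L1 minimization typically
# recovers f exactly (up to solver tolerance).
print(np.allclose(x_hat, f, atol=1e-5))
```

In practice far fewer measurements than the worst-case bound suffice, which is exactly the gap between theory and practice the abstract points to.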
