eScholarship
Open Access Publications from the University of California

Department of Mathematics, UC Davis

Regularized Gradient Descent: A Nonconvex Recipe for Fast Joint Blind Deconvolution and Demixing

Published Web Location

https://arxiv.org/pdf/1703.08642.pdf
No data is associated with this publication.
Abstract

We study the question of extracting a sequence of functions $\{f_i, g_i\}_{i=1}^{s}$ from observing only the sum of their convolutions, i.e., from $y = \sum_{i=1}^{s} f_i \ast g_i$. While convex optimization techniques are able to solve this joint blind deconvolution-demixing problem provably and robustly under certain conditions, for medium-size or large-size problems we need computationally faster methods without sacrificing the benefits of mathematical rigor that come with convex methods. In this paper we present a nonconvex algorithm which guarantees exact recovery under conditions that are competitive with convex optimization methods, with the additional advantage of being computationally much more efficient. Our two-step algorithm converges to the global minimum linearly and is also robust in the presence of additive noise. While the derived performance bounds are suboptimal in terms of the information-theoretic limit, numerical simulations show remarkable performance even if the number of measurements is close to the number of degrees of freedom. We discuss an application of the proposed framework in wireless communications in connection with the Internet-of-Things.
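As a minimal illustrative sketch (not taken from the paper), the forward model $y = \sum_{i=1}^{s} f_i \ast g_i$ can be simulated with synthetic signals, assuming circular convolutions of length-$L$ vectors computed via the FFT; the signal length, the number of pairs, and all variable names here are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 256  # signal length (hypothetical choice)
s = 3    # number of unknown pairs {f_i, g_i} (hypothetical choice)

# Draw random signal pairs f_i, g_i, i = 1, ..., s.
f = rng.standard_normal((s, L))
g = rng.standard_normal((s, L))

# Observation: the sum of the s circular convolutions,
# y = sum_i f_i * g_i, computed via the convolution theorem
# (pointwise product in the Fourier domain).
y = np.sum(
    np.fft.ifft(np.fft.fft(f, axis=1) * np.fft.fft(g, axis=1), axis=1).real,
    axis=0,
)

print(y.shape)  # (L,): a single observation mixing all s convolved pairs
```

Recovering all $\{f_i, g_i\}$ from this single length-$L$ observation is the joint blind deconvolution-demixing problem the paper addresses; the nonconvex algorithm itself (spectral initialization followed by regularized gradient descent) is described in the linked manuscript.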
