Sensitivity of ℓ₁ minimization to parameter choice


The use of generalized Lasso is a common technique for recovery of structured high-dimensional signals. Each generalized Lasso program has a governing parameter whose optimal value depends on properties of the data. At this optimal value, compressed sensing theory explains why Lasso programs recover structured high-dimensional signals with minimax order-optimal error. Unfortunately, in practice the optimal choice is generally unknown and must be estimated. Thus, we investigate the stability of each Lasso program with respect to its governing parameter. Our goal is to aid the practitioner in answering the following question: given real data, which Lasso program should be used? We take a step towards answering this by analyzing the case where the measurement matrix is the identity (the so-called proximal denoising setup) and we use $\ell_{1}$ regularization. For each Lasso program, we specify settings in which that program is provably unstable with respect to its governing parameter. We support our analysis with detailed numerical simulations. For example, there are settings where a 0.1% underestimate of a Lasso parameter can increase the error significantly, and a 50% underestimate can cause the error to increase by a factor of $10^{9}$.
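In the proximal denoising setup with $\ell_{1}$ regularization, the unconstrained Lasso has a closed-form solution given by entrywise soft thresholding. The following minimal sketch (not the paper's experiments; the signal, noise level, and threshold choice are illustrative assumptions) shows how the recovery error of this program depends on the governing parameter $\lambda$:

```python
import numpy as np

def soft_threshold(y, lam):
    """Proximal operator of lam * ||x||_1: entrywise soft thresholding."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Proximal denoising: observe y = x0 + noise (measurement matrix = identity).
rng = np.random.default_rng(0)
n, s, sigma = 1000, 10, 0.1           # ambient dimension, sparsity, noise level
x0 = np.zeros(n)
x0[:s] = 1.0                          # s-sparse ground-truth signal
y = x0 + sigma * rng.standard_normal(n)

# A standard near-optimal choice of the parameter, and a 50% underestimate of it.
lam_opt = sigma * np.sqrt(2 * np.log(n))
for lam in (lam_opt, 0.5 * lam_opt):
    err = np.linalg.norm(soft_threshold(y, lam) - x0)
    print(f"lambda = {lam:.3f}: recovery error = {err:.4f}")
```

Running this compares the $\ell_2$ recovery error at the reference parameter value against a misestimated one; the severity of the degradation in the paper's settings depends on which Lasso program is used.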

arXiv pre-print

(Submitted to Information and Inference: A Journal of the IMA on 31 March 2019.)