Sparse Regularization via Convex Analysis
Sparse approximate solutions to linear equations, which have numerous applications, can be obtained via L1 norm regularization. However, the L1 norm tends to underestimate the true values of the nonzero solution components. We introduce a non-convex alternative to the L1 norm. Unlike other non-convex regularizers, the proposed regularizer maintains the convexity of the objective function to be minimized, which allows one to retain beneficial properties of both convex and non-convex regularization. Although the new regularizer is non-convex, it is defined using tools of convex analysis: specifically, a generalization of the Moreau envelope and a generalized multivariate Huber function. The resulting optimization problem can be solved by proximal algorithms.
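The underestimation bias of the L1 norm can be seen in its proximal operator, the soft-threshold, which shrinks every value by a fixed amount. The following sketch contrasts it with the classical firm threshold (Gao and Bruce), a standard prox of a scalar minimax-concave penalty; this is an illustrative stand-in for the paper's regularizer, not its actual multivariate construction, and the parameter names `lam` and `mu` are assumptions of this sketch.

```python
import math

def soft_threshold(y, lam):
    # Prox of lam*|.|: every surviving value is shrunk toward zero
    # by exactly lam, so large values are underestimated by lam.
    return math.copysign(max(abs(y) - lam, 0.0), y)

def firm_threshold(y, lam, mu):
    # Firm threshold (Gao-Bruce), with mu > lam: zero below lam,
    # linear transition on (lam, mu], and the identity above mu,
    # so large values pass through unbiased.
    ay = abs(y)
    if ay <= lam:
        return 0.0
    if ay <= mu:
        return math.copysign(mu * (ay - lam) / (mu - lam), y)
    return y
```

For a large observation y = 10 with lam = 1 and mu = 2, the soft threshold returns 9 (biased by lam), while the firm threshold returns 10 exactly; both map small values such as y = 0.5 to zero, preserving sparsity.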
Ivan Selesnick works in signal and image processing, wavelet-based signal processing, sparsity techniques, and biomedical signal processing. He is with the Department of Electrical and Computer Engineering at the New York University Tandon School of Engineering, where he is Department Chair. He received the BS, MEE, and PhD degrees in Electrical Engineering from Rice University in 1990, 1991, and 1996, respectively. He received the Jacobs Excellence in Education Award from Polytechnic University in 2003 and became an IEEE Fellow in 2016. He has been an associate editor for several IEEE Transactions and for IEEE Signal Processing Letters.