Parameter tuning using asynchronous parallel pattern search in sparse signal reconstruction

Parameter tuning is an important but often overlooked step in signal recovery problems. For instance, the regularization parameter in compressed sensing dictates the sparsity of the approximate signal reconstruction. More recently, there has been evidence that non-convex ℓp quasi-norm minimization, where 0 < p < 1, improves reconstruction over existing models that use convex regularization. However, these methods rely on good estimates not only of p (the choice of quasi-norm) but also of the regularization parameter that weights the penalty term. This paper describes a method for choosing suitable values of both parameters. The method scores each candidate pair of parameters by partially reconstructing the signal and measuring the quality of the result. We then search efficiently through combinations of parameters using a pattern search that exploits parallelism and asynchronicity to find the pair with the best score. We demonstrate the efficiency and accuracy of the proposed method through numerical experiments.
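To make the pipeline concrete, the following is a minimal sketch in NumPy and is not the authors' implementation. It assumes a proximal-gradient solver for the ℓp-regularized least-squares objective 0.5‖Ax − y‖² + λ‖x‖ₚᵖ, a score computed on a held-out subset of measurements as a stand-in for the paper's partial-reconstruction score, and a simple synchronous compass search over (p, λ) in place of the asynchronous parallel pattern search. All function names (prox_lp, reconstruct, score, pattern_search) are hypothetical.

```python
import numpy as np

def prox_lp(v, lam, p, iters=10, eps=1e-8):
    # Heuristic prox for lam*||x||_p^p via iteratively reweighted
    # soft-thresholding (an assumption; the paper may use another solver).
    x = v.copy()
    for _ in range(iters):
        w = lam * p * (np.abs(x) + eps) ** (p - 1.0)
        x = np.sign(v) * np.maximum(np.abs(v) - w, 0.0)
    return x

def reconstruct(A, y, p, lam, iters=100):
    # Proximal-gradient descent on 0.5*||Ax - y||^2 + lam*||x||_p^p.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)
        x = prox_lp(x - grad / L, lam / L, p)
    return x

def score(p, lam, A, y, holdout_frac=0.25, seed=0):
    # Partial-reconstruction score (our assumption): fit on a subset of the
    # measurements, evaluate the residual on the held-out rows. A fixed seed
    # keeps the objective deterministic across search evaluations.
    rng = np.random.default_rng(seed)
    mask = rng.random(A.shape[0]) < holdout_frac
    x = reconstruct(A[~mask], y[~mask], p, lam)
    return np.linalg.norm(A[mask] @ x - y[mask])

def pattern_search(f, x0, step=0.25, tol=1e-3, max_iter=50):
    # Synchronous compass search over (p, lam), constrained to
    # 0 < p < 1 and lam > 0; halve the step when no poll point improves.
    x, fx = np.asarray(x0, float), f(*x0)
    for _ in range(max_iter):
        improved = False
        for d in np.vstack([np.eye(2), -np.eye(2)]):
            trial = x + step * d
            if 0.0 < trial[0] < 1.0 and trial[1] > 0.0:
                ft = f(*trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, fx

if __name__ == "__main__":
    # Toy compressed-sensing instance: sparse x_true, Gaussian A, noisy y.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((80, 200))
    x_true = np.zeros(200)
    x_true[rng.choice(200, 8, replace=False)] = 1.0
    y = A @ x_true + 0.01 * rng.standard_normal(80)
    (p_best, lam_best), s = pattern_search(
        lambda p, lam: score(p, lam, A, y), x0=(0.5, 0.1))
    print(f"best p={p_best:.3f}, lambda={lam_best:.3f}, score={s:.4f}")
```

In the paper's setting, the poll points of each compass iteration would be dispatched asynchronously to parallel workers, so a new poll can begin as soon as any worker returns an improving point; the synchronous loop above only illustrates the search pattern and the scoring interface.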
