An affine scaling methodology for best basis selection

A methodology is developed to derive algorithms for optimal basis selection by minimizing diversity measures proposed by Wickerhauser (1994) and Donoho (1994). These measures include the p-norm-like (ℓ_(p≤1)) diversity measures and the Gaussian and Shannon entropies. The algorithm development methodology uses a factored representation for the gradient and involves successive relaxation of the Lagrangian necessary condition. This yields algorithms that are intimately related to the affine scaling transformation (AST) based methods commonly employed by the interior-point approach to nonlinear optimization. The algorithms minimizing the ℓ_(p≤1) diversity measures are equivalent to a previously developed class of algorithms called the focal underdetermined system solver (FOCUSS). The general nature of the methodology provides a systematic approach for deriving this class of algorithms and a natural mechanism for extending them. It also facilitates a better understanding of the convergence behavior and a strengthening of the convergence results. The Gaussian entropy minimization algorithm is shown to be equivalent to a well-behaved p = 0 norm-like optimization algorithm. Computer experiments demonstrate that the p-norm-like and Gaussian entropy algorithms perform well, converging to sparse solutions. The Shannon entropy algorithm produces concentrated solutions but is shown not to converge to a fully sparse solution.
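The ℓ_(p≤1) minimization discussed in the abstract is carried out by the FOCUSS re-weighted minimum-norm iteration of reference [17]: each step solves a weighted minimum 2-norm problem whose affine-scaling weights W_k = diag(|x_k|^(1−p/2)) progressively concentrate energy onto a few entries. A minimal NumPy sketch of that iteration, assuming an underdetermined dictionary A and observation b; the problem sizes, iteration count, and tolerance below are illustrative, not taken from the paper:

```python
import numpy as np

def focuss(A, b, p=0.5, n_iter=50, tol=1e-10):
    """Sketch of the FOCUSS re-weighted minimum-norm iteration.

    Each step solves min ||q||_2 subject to (A W_k) q = b and sets
    x_{k+1} = W_k q, with W_k = diag(|x_k|^(1 - p/2)). The shrinking
    weights drive the feasible iterate toward a sparse solution.
    """
    x = np.linalg.pinv(A) @ b                 # minimum 2-norm starting point
    for _ in range(n_iter):
        w = np.abs(x) ** (1.0 - p / 2.0)      # affine scaling weights
        AW = A * w                            # equals A @ diag(w)
        q = np.linalg.pinv(AW) @ b            # weighted minimum-norm solve
        x_new = w * q
        if np.linalg.norm(x_new - x) < tol * (1.0 + np.linalg.norm(x)):
            x = x_new
            break
        x = x_new
    return x

# Underdetermined example: 5 equations, 10 unknowns, 2-sparse generator.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 10))
x_true = np.zeros(10)
x_true[1], x_true[5] = 2.0, -1.0
b = A @ x_true
x = focuss(A, b)
```

Every iterate stays feasible (A x = b), so the weights select *which* solution of the underdetermined system is reached; p = 1 recovers the usual re-weighted ℓ1-style behavior, while p → 0 corresponds to the Gaussian-entropy/numerosity case analyzed in the paper.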

[1]  Ahmet M. Kondoz, et al. Digital Speech: Coding for Low Bit Rate Communication Systems, 1995.

[2]  Yoram Bresler, et al. A new algorithm for computing sparse solutions to linear inverse problems, Proceedings of the 1996 IEEE International Conference on Acoustics, Speech, and Signal Processing.

[3]  B. D. Rao, et al. Comparison of basis selection methods, Conference Record of the Thirtieth Asilomar Conference on Signals, Systems and Computers, 1996.

[4]  B. D. Rao. Analysis and extensions of the FOCUSS algorithm, Conference Record of the Thirtieth Asilomar Conference on Signals, Systems and Computers, 1996.

[5]  M. Victor Wickerhauser, et al. Adapted Wavelet Analysis from Theory to Software, 1994.

[6]  Philip E. Gill, et al. Practical Optimization, 1981.

[7]  Thomas W. Parks, et al. Extrapolation and spectral estimation with iterative weighted norm modification, IEEE Trans. Signal Process., 1991.

[8]  R. Haddad, et al. Multiresolution Signal Decomposition: Transforms, Subbands, and Wavelets, 1992.

[9]  Bhaskar D. Rao, et al. Convergence analysis of a class of adaptive weighted norm extrapolation algorithms, Proceedings of the 27th Asilomar Conference on Signals, Systems and Computers, 1993.

[10]  D. Donoho, et al. Basis pursuit, Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers, 1994.

[11]  M. Wickerhauser, et al. Wavelets and time-frequency analysis, Proc. IEEE, 1996.

[12]  Paul S. Bradley, et al. Feature selection via mathematical programming, INFORMS J. Comput., 1997.

[13]  Thomas M. Cover, et al. Elements of Information Theory, 2005.

[14]  Antonio Artés-Rodríguez, et al. Sparse deconvolution using adaptive mixed-Gaussian models, Signal Process., 1996.

[15]  Stéphane Mallat, et al. Matching pursuits with time-frequency dictionaries, IEEE Trans. Signal Process., 1993.

[16]  Carl Taswell, et al. Satisficing search algorithms for selecting near-best bases in adaptive tree-structured wavelet transforms, IEEE Trans. Signal Process., 1996.

[17]  Bhaskar D. Rao, et al. Sparse signal reconstruction from limited data using FOCUSS: a re-weighted minimum norm algorithm, IEEE Trans. Signal Process., 1997.

[18]  D. Donoho. On Minimum Entropy Segmentation, 1994.

[19]  Brian D. Jeffs, et al. Restoration of blurred star field images by maximally sparse optimization, IEEE Trans. Image Process., 1993.

[20]  M. Vetterli, et al. Wavelets, subband coding, and best bases, Proc. IEEE, 1996.

[21]  Y. Meyer, et al. Wavelets and Filter Banks, 1991.

[22]  R. E. Carlson, et al. Sparse approximate multiquadric interpolation, 1994.

[23]  I. F. Gorodnitsky, et al. Neuromagnetic source imaging with FOCUSS: a recursive weighted minimum norm algorithm, Electroencephalography and Clinical Neurophysiology, 1995.

[24]  Y. C. Pati, et al. Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition, Proceedings of the 27th Asilomar Conference on Signals, Systems and Computers, 1993.

[25]  Balas K. Natarajan. Sparse approximate solutions to linear systems, SIAM J. Comput., 1995.

[26]  Thomas S. Huang, et al. Improvement of discrete band-limited signal extrapolation by iterative subspace modification, Proceedings of the 1987 IEEE International Conference on Acoustics, Speech, and Signal Processing.

[27]  Dick den Hertog. Interior Point Approach to Linear, Quadratic and Convex Programming: Algorithms and Complexity, 1994.

[28]  Bhaskar D. Rao, et al. A recursive weighted minimum norm algorithm: Analysis and applications, Proceedings of the 1993 IEEE International Conference on Acoustics, Speech, and Signal Processing.

[29]  Ronald R. Coifman, et al. Entropy-based algorithms for best basis selection, IEEE Trans. Inf. Theory, 1992.

[30]  Pierre Duhamel, et al. Automatic test generation techniques for analog circuits and systems: A review, 1979.

[31]  M. Padberg. Linear Optimization and Extensions, 1995.

[32]  K. Kreutz-Delgado, et al. A General Approach to Sparse Basis Selection: Majorization, Concavity, and Affine Scaling, 1997.

[33]  Michael A. Saunders, et al. Atomic decomposition by basis pursuit, SIAM J. Sci. Comput., 1998.

[34]  Ingrid Daubechies. Ten Lectures on Wavelets, 1992.

[35]  S. Chen, et al. Fast orthogonal least squares algorithm for efficient subset model selection, IEEE Trans. Signal Process., 1995.

[36]  Jelena Kovacevic, et al. Wavelets and Subband Coding, Prentice Hall Signal Processing Series, 2013.

[37]  Yuying Li, et al. A globally convergent method for lp problems, SIAM J. Optim., 1991.