Regressing Kernel Dictionary Learning

In this paper, we present a kernelized dictionary learning framework for regression on signals with a complex, nonlinear nature. The regression weights are learned jointly with the dictionary and the sparse coefficients in a single optimization. We provide the relevant formulation and the dictionary-building steps. To demonstrate the effectiveness of the proposed technique, we present extensive experimental results on several real-life datasets. The results show that the nonlinear dictionary models the data more accurately and yields a significant improvement in estimation accuracy over popular traditional techniques, especially when the data are highly nonlinear.
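
The joint optimization sketched in the abstract can be illustrated with a minimal alternating-minimization example in Python. This is only a hedged sketch under stated assumptions, not the authors' exact algorithm: the RBF kernel, the ISTA-style sparse-coding step, the pseudo-inverse dictionary update, and all names and parameters (kernel_dict_regression, n_atoms, lam, beta, gamma) are illustrative placeholders. It only shows the general structure of learning a feature-space dictionary D = Phi(X) A, sparse codes Z, and regression weights w together.

```python
# Hypothetical sketch of regression-driven kernel dictionary learning.
# Assumptions (not from the paper): RBF kernel, ISTA sparse coding,
# pseudo-inverse dictionary update, least-squares regression weights.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def kernel_dict_regression(X, y, n_atoms=20, lam=0.1, beta=1.0,
                           gamma=1.0, n_outer=30, n_ista=20, seed=0):
    """Alternately minimize
       ||Phi(X) - Phi(X) A Z||_F^2 + beta * ||y - Z^T w||^2 + lam * ||Z||_1
    over dictionary coefficients A, sparse codes Z, and regression weights w."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)                   # training Gram matrix
    idx = rng.choice(n, size=n_atoms, replace=False)
    A = np.eye(n)[:, idx]                         # D = Phi(X) A; atoms start as random samples
    Z = np.zeros((n_atoms, n))                    # sparse codes, one column per sample
    w = np.zeros(n_atoms)                         # regression weights on the codes

    for _ in range(n_outer):
        # 1) Sparse coding (ISTA): the objective touches Phi only through K,
        #    since D^T D = A^T K A and D^T Phi(X) = A^T K.
        G = A.T @ K @ A
        H = G + beta * np.outer(w, w)             # quadratic term incl. regression loss
        L = np.linalg.norm(H, 2) + 1e-8           # Lipschitz constant of the smooth part
        for _ in range(n_ista):
            grad = G @ Z - A.T @ K + beta * np.outer(w, w @ Z - y)
            Z = Z - grad / L
            Z = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)

        # 2) Dictionary update: Phi(X) A Z ~ Phi(X)  =>  A = pinv(Z) in closed form,
        #    then rescale so each atom has unit norm in feature space.
        A = np.linalg.pinv(Z)
        norms = np.sqrt(np.maximum(np.diag(A.T @ K @ A), 1e-12))
        A = A / norms
        Z = Z * norms[:, None]                    # keep the product A Z unchanged

        # 3) Regression weights: least-squares fit of y on the codes.
        w, *_ = np.linalg.lstsq(Z.T, y, rcond=None)

    return A, Z, w
```

For a new sample x*, one would compute k* = rbf_kernel(X_train, x*[None]), run the same coding step with A.T @ k* in place of A.T @ K (dropping the regression term, since the target is unknown), and predict w.T @ z*. The point of the sketch is that coupling the regression loss into the coding step is what makes the weights, dictionary, and coefficients jointly learned rather than fitted in separate stages.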
