Assessment of thyroid datasets through the kernel PCA algorithm and Vapnik-Chervonenkis theory for categorization and classification

The proposed work assesses a combination of machine learning algorithms (MLA), namely principal component analysis (PCA) and kernel principal component analysis (KPCA), together with Vapnik-Chervonenkis theory (VCT), on thyroid datasets. The main aim is to determine the optimal values obtained for classification and categorization of the datasets. In the proposed methodology, PCA and KPCA are used for feature extraction, while the proposed VCT-KPCA method is used for feature categorization as an innovative hybrid approach (HA). The outcomes produced on the same datasets by PCA differ from those of KPCA, and the hybrid VCT-KPCA method yields the highest optimization values. The work is carried out on thyroid datasets drawn from two different repositories, UCI and e-IETD, referred to as dataset-1 (DS1) and dataset-2 (DS2), respectively. To study the differences between DS1 and DS2, canonical correlation analysis (CCA) weights act as an add-on procedure.
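To make the described pipeline concrete, the sketch below is a minimal illustration assuming scikit-learn. The thyroid data files, the number of retained components, and the use of an RBF-kernel SVM (a classifier whose generalization analysis is grounded in Vapnik-Chervonenkis theory) as a stand-in for the paper's VCT-KPCA categorization step are all hypothetical assumptions, not the authors' implementation.

```python
# Minimal sketch: PCA/KPCA feature extraction, SVM classification, and CCA
# comparison of two datasets. Data loading and parameter choices are illustrative.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def evaluate(features, labels, extractor):
    """Extract features with PCA or KPCA, then classify with an RBF-kernel SVM."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        features, labels, test_size=0.3, random_state=0, stratify=labels)
    model = make_pipeline(StandardScaler(), extractor, SVC(kernel="rbf"))
    model.fit(X_tr, y_tr)
    return accuracy_score(y_te, model.predict(X_te))

# Hypothetical thyroid datasets DS1 (UCI) and DS2 (e-IETD): rows are patient
# records, columns are clinical attributes, labels are thyroid condition classes.
X1, y1 = np.load("ds1_features.npy"), np.load("ds1_labels.npy")
X2, y2 = np.load("ds2_features.npy"), np.load("ds2_labels.npy")

for name, X, y in [("DS1", X1, y1), ("DS2", X2, y2)]:
    acc_pca = evaluate(X, y, PCA(n_components=5))
    acc_kpca = evaluate(X, y, KernelPCA(n_components=5, kernel="rbf"))
    print(f"{name}: PCA accuracy={acc_pca:.3f}  KPCA accuracy={acc_kpca:.3f}")

# CCA weights as an add-on procedure to study the relation between DS1 and DS2
# (the two datasets are truncated to a common number of rows for illustration).
n = min(len(X1), len(X2))
cca = CCA(n_components=2).fit(X1[:n], X2[:n])
print("CCA x-weights shape:", cca.x_weights_.shape)
```

In this reading, the feature extractors (PCA vs. KPCA) are swapped in and out while the downstream classifier is held fixed, so differences in accuracy can be attributed to the extraction step, mirroring the comparison the abstract describes.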
