Fractional Gradient Descent Optimizer for Linear Classifier Support Vector Machine

Supervised learning is a data-mining task that aims to classify or predict data. The Support Vector Machine (SVM), a linear classifier, is one of the most powerful supervised learning algorithms. In prediction tasks, accuracy can be improved by optimizing the parameters of the classification algorithm. This study proposes Fractional Gradient Descent as an unconstrained optimization algorithm for the objective function of the SVM classifier, serving as the optimizer of the classification model during training to improve the accuracy of the prediction model. Fractional Gradient Descent optimizes the SVM classification model using fractional-order values, so it takes small steps with a small learning rate toward the global minimum and reaches convergence in fewer iterations. The SVM classifier with fractional gradient descent achieved an error rate of 0.273083 at a learning rate of 0.0001, 0.273070 at a learning rate of 0.001, and 0.273134 at a learning rate of 0.01. The SVM classifier with stochastic gradient descent optimization reached the convergence point at iteration 350; with fractional gradient descent optimization, it reached the convergence point at 50 iterations, fewer than the SVM classifier with stochastic gradient descent.
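To make the idea concrete, below is a minimal Python sketch of a Caputo-type fractional gradient descent optimizer applied to a linear SVM with hinge loss. It uses the common one-term Caputo approximation D^alpha J(w) ~ grad J(w) * |w - w_prev|^(1-alpha) / Gamma(2 - alpha); the function and parameter names (alpha, lam, lr, eps) are illustrative assumptions, not the paper's exact implementation.

    # Sketch: Caputo-type fractional gradient descent for a linear SVM.
    # Assumes objective J(w) = (lam/2)||w||^2 + mean(max(0, 1 - y * Xw)).
    import numpy as np
    from scipy.special import gamma

    def svm_subgradient(w, X, y, lam):
        """Subgradient of the regularized hinge loss at w."""
        margins = y * (X @ w)
        active = margins < 1.0                      # samples violating the margin
        grad_hinge = -(X[active].T @ y[active]) / len(y)
        return lam * w + grad_hinge

    def fractional_gd(X, y, alpha=0.9, lr=1e-3, lam=1e-2, iters=500, eps=1e-8):
        """Train a linear SVM with Caputo-type fractional gradient descent."""
        w = np.zeros(X.shape[1])
        w_prev = w.copy()
        for _ in range(iters):
            g = svm_subgradient(w, X, y, lam)
            # Fractional scaling: elementwise |w - w_prev|^(1-alpha) / Gamma(2-alpha);
            # eps keeps the very first step nonzero when w == w_prev.
            frac = (np.abs(w - w_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
            w_prev = w.copy()
            w = w - lr * frac * g
        return w

    # Toy usage: labels in {-1, +1}, features assumed standardized.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X @ rng.normal(size=5) + 0.1 * rng.normal(size=200))
    w = fractional_gd(X, y)
    print("training error:", np.mean(np.sign(X @ w) != y))

Setting alpha = 1 recovers ordinary (sub)gradient descent up to the Gamma factor, so alpha acts as a knob that modulates the effective step size through the recent trajectory of the weights.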
