Cosine K-Nearest Neighbor in Milkfish Eye Classification

This paper proposes a refined version of the K-Nearest Neighbors (K-NN) classification method. The refinement addresses two known weaknesses of K-NN: sensitivity to noise when K is small, and irrelevant class assignments when K is large. In previous refinements, the weights were calculated for each nearest neighbor individually, so the result was suboptimal. We propose a new weighting scheme, called Cosine K-NN (CosKNN), in which the weights are no longer derived from each nearest neighbor individually but from all pairs of nearest neighbors. We also introduce a trigonometric map to describe the cosine weight. The CosKNN weight is a soft value representing the degree of ownership each class has over the testing data. Empirically, CosKNN is tested and compared with other K-NN refinements on milkfish eye, UCI, and KEEL datasets. The results show that CosKNN maintains superior performance over the other methods even at higher values of K, achieving an accuracy of 96.79%.
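The abstract does not give the exact CosKNN weighting formula, so the following is only a rough illustration of the general idea of cosine-weighted K-NN voting with soft per-class scores, not the authors' method: here each neighbor is weighted by its own cosine similarity to the query, whereas CosKNN derives weights from all pairs of nearest neighbors.

```python
import numpy as np

def cosine_weighted_knn(X_train, y_train, x_query, k=3):
    """Classify x_query by K-NN voting weighted by cosine similarity.

    Illustrative sketch only: CosKNN as described in the abstract uses
    all pairs of nearest neighbors; this simpler stand-in weights each
    neighbor by its individual cosine similarity to the query.
    """
    # Euclidean distances select the k nearest neighbors.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nn_idx = np.argsort(dists)[:k]

    # Cosine similarity of each selected neighbor to the query.
    sims = (X_train[nn_idx] @ x_query) / (
        np.linalg.norm(X_train[nn_idx], axis=1) * np.linalg.norm(x_query)
    )

    # Accumulate soft per-class scores ("ownership" of the query),
    # then normalize them to sum to 1.
    classes = np.unique(y_train)
    scores = {c: sims[y_train[nn_idx] == c].sum() for c in classes}
    total = sum(scores.values())
    soft = {c: s / total for c, s in scores.items()}
    return max(soft, key=soft.get), soft

# Toy usage: two class-0 points near the query, one class-1 point far away.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
y = np.array([0, 0, 1])
pred, soft = cosine_weighted_knn(X, y, np.array([1.0, 0.05]), k=3)
```

The soft scores give a graded class membership rather than a hard majority vote, which is the spirit of the "soft value" the abstract attributes to CosKNN.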
