Identifying Non-pulsar Radiation and Predicting Chess Endgame Result Using ARSkNN

We are living in a data age. The expansion of Internet of Things platforms has driven an upsurge in the number of devices connected to the Internet. Every device, from smart sensors and smartphones to systems installed in manufacturing units, hospitals, and vehicles, generates data. These developments have not only escalated the generation of data but also created a need to analyze raw data and identify patterns. Data mining techniques are therefore being deployed extensively to extract information, and their accuracy and effectiveness in providing better outcomes and cost-effective methods across various domains have already been established. In supervised learning, instance-based classifiers such as kNN typically rely on distance estimation. In this analysis, the regular kNN classifier is compared with ARSkNN, which replaces the conventional distance estimation procedure with a mass estimation approach. ARSkNN proved commensurate (or superior) to kNN in accuracy and reduced computation time drastically on the datasets chosen for this analysis.
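To make the baseline concrete, the conventional distance-based kNN classifier that ARSkNN is compared against can be sketched as follows. This is an illustrative minimal implementation of standard kNN only, not of ARSkNN's mass-based similarity measure; the function and variable names are ours.

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points under Euclidean distance (standard kNN)."""
    # Pair each training point with its distance to the query,
    # then sort ascending so the nearest points come first.
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    # Majority vote over the labels of the k nearest neighbors.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy usage: two well-separated clusters of points.
train_X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
train_y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(train_X, train_y, (0.2, 0.2), k=3))  # → a
```

ARSkNN keeps the neighbor-voting structure but replaces the Euclidean distance computation above with a mass-based similarity measure, which is what yields the reduction in computation time reported here.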
