Strengthening learning algorithms by feature discovery

This paper presents a new feature discovery approach, called FEADIS, that strengthens learning algorithms with discovered features. The discovered features are formed by applying various mathematical functions, such as ceil, mod, and sin, to the original input features. These features are constructed iteratively so that learning performance improves gradually. We demonstrate the capabilities of FEADIS on different types of datasets, including periodic ones. From the results, we conclude that FEADIS increases the performance of learning algorithms on a wide range of datasets with either a nominal or a numeric target feature. Furthermore, most well-known classifiers, without FEADIS strengthening, have severe difficulty handling datasets in which the relation between the input features and the target feature is periodic; this difficulty is circumvented by using FEADIS.
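To make the general idea concrete, the sketch below illustrates (in Python with scikit-learn) the kind of feature construction the abstract describes: a classifier is evaluated on raw features of a dataset with a periodic input-target relation, then re-evaluated after augmenting the data with sin, mod, and ceil transformations. This is a minimal, hypothetical illustration under assumed function choices and a one-shot (non-iterative) construction step; it is not the paper's actual FEADIS procedure.

```python
# Minimal sketch (not the actual FEADIS algorithm): compare a standard
# classifier's cross-validated accuracy with and without constructed
# features on a dataset whose target depends periodically on one input.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic data: the class label depends on sin(x1), a periodic relation
# that a linear classifier cannot capture from the raw features alone.
X = rng.uniform(0.0, 20.0, size=(1000, 2))
y = (np.sin(X[:, 0]) > 0).astype(int)

clf = LogisticRegression(max_iter=1000)

# Baseline: original features only.
base_acc = cross_val_score(clf, X, y, cv=5).mean()

# Candidate constructed features built from simple mathematical functions
# (the specific choices here are illustrative assumptions).
candidates = np.column_stack([
    np.sin(X[:, 0]),          # periodic transform
    np.mod(X[:, 0], np.pi),   # modulo transform
    np.ceil(X[:, 1]),         # discretising transform
])
X_aug = np.hstack([X, candidates])

# Same classifier on the augmented feature set.
aug_acc = cross_val_score(clf, X_aug, y, cv=5).mean()

print(f"baseline accuracy: {base_acc:.3f}")
print(f"with constructed features: {aug_acc:.3f}")
```

In this toy setting the constructed sin feature makes the periodic relation linearly separable, which mirrors the abstract's claim that feature construction can circumvent classifiers' difficulty with periodic input-target relations; FEADIS itself would search over and retain such candidate features iteratively rather than adding a fixed set in one step.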
