Boolean factors based Artificial Neural Network

Owing to their ability to solve nonlinear problems, Artificial Neural Networks (ANNs) are applied in many areas. However, defining an ANN architecture for a given problem is not formalized and remains an open research problem. Moreover, the "black box" nature of the technique makes its interpretation tedious. Since optimal factors completely cover the data and therefore provide an explanation of these data, we propose in this paper to build feedforward ANNs from the optimal factors obtained from the Boolean context representing the data. Through experiments and comparisons on the datasets used, we show that this approach provides relatively better results than existing approaches in the literature.
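
The idea can be illustrated with a minimal sketch. The greedy factor-extraction heuristic, the one-hidden-layer layout, and the logistic output layer below are assumptions made for this example, not the authors' exact construction: Boolean factors (formal concepts) are extracted from the binary object-attribute context by a greedy cover of its 1s, each factor then defines one hidden unit whose input connections are the factor's attribute set, and an output layer is trained on top.

```python
# Illustrative sketch (assumed heuristic, not the paper's exact algorithm):
# derive hidden units of a feedforward network from Boolean factors of a
# binary object-attribute context.
import numpy as np

def greedy_boolean_factors(X):
    """Cover the 1s of a Boolean matrix X (objects x attributes) with
    rectangles (extent x intent) generated by single attributes."""
    n, m = X.shape
    uncovered = X.astype(bool).copy()
    factors = []
    while uncovered.any():
        best, best_gain = None, 0
        for j in range(m):
            extent = X[:, j] == 1                 # objects having attribute j
            if not extent.any():
                continue
            intent = X[extent].all(axis=0)        # attributes shared by all of them
            gain = uncovered[np.ix_(extent, intent)].sum()
            if gain > best_gain:
                best_gain, best = gain, (extent, intent)
        if best is None:
            break
        extent, intent = best
        uncovered[np.ix_(extent, intent)] = False
        factors.append(intent.astype(float))
    return np.array(factors)                      # one row of attribute weights per factor

def hidden_layer(X, factors):
    """A hidden unit fires when an object has every attribute of its factor."""
    needed = factors.sum(axis=1)                  # size of each factor's intent
    return (X @ factors.T >= needed).astype(float)

# Tiny synthetic Boolean context and labels, just to show the pipeline.
X = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 1, 1, 1]], dtype=float)
y = np.array([0, 0, 1, 1])

F = greedy_boolean_factors(X)
H = hidden_layer(X, F)

# Train a logistic output layer on top of the factor-defined hidden layer.
rng = np.random.default_rng(0)
w, b = rng.normal(size=H.shape[1]) * 0.1, 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(H @ w + b)))        # sigmoid output
    grad = p - y                                  # gradient of the cross-entropy loss
    w -= 0.1 * H.T @ grad / len(y)
    b -= 0.1 * grad.mean()

print("factor intents:\n", F)
print("predictions:", (p > 0.5).astype(int))
```

Because every factor corresponds to a set of co-occurring attributes that covers part of the context, each hidden unit in this sketch has a direct, readable interpretation, which is the motivation stated above for using optimal factors to define the architecture.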
