Learning Gradual Argumentation Frameworks using Genetic Algorithms

Gradual argumentation frameworks represent arguments and their relationships in a weighted graph. Their graphical structure and intuitive semantics make them a potentially interesting tool for interpretable machine learning. It has recently been noted that their mechanics are closely related to neural networks, which allows their weights to be learned from data with standard deep learning frameworks. As a first proof of concept, we propose a genetic algorithm that simultaneously learns the structure of argumentative classification models. To obtain a well-interpretable model, the fitness function balances the sparseness and the accuracy of the classifier. We discuss our algorithm and present first experimental results on standard benchmarks from the UCI machine learning repository. Our prototype learns argumentative classification models that are comparable to decision trees in terms of learning performance and interpretability.
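
To make the idea of a sparseness-aware fitness function concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation: individuals encode the base score of an output argument plus one weight per candidate edge from three Boolean input arguments, a weight of exactly zero means the edge is pruned, and the fitness is a convex combination of training accuracy and the fraction of pruned edges. The toy data set, the trade-off constant ALPHA, the logistic aggregation, and the truncation selection are all assumptions made only to keep the example self-contained.

```python
import math
import random

random.seed(0)

# Toy data set (an assumption for illustration): three Boolean input arguments,
# and the class label simply equals the first one, so a sparse model suffices.
DATA = []
for _ in range(40):
    x = [random.randint(0, 1) for _ in range(3)]
    DATA.append((x, x[0]))

N_FEATURES = 3
ALPHA = 0.8          # assumed trade-off between accuracy and sparseness
POP_SIZE = 20
GENERATIONS = 50
MUTATION_RATE = 0.2

def predict(genome, x):
    # MLP-like gradual semantics: the output argument's base score plus the
    # weighted support/attack of its parents, squashed to [0, 1].
    base, edge_weights = genome[0], genome[1:]
    s = base + sum(w * xi for w, xi in zip(edge_weights, x))
    return 1.0 / (1.0 + math.exp(-s))

def fitness(genome):
    # Convex combination of classification accuracy and sparseness
    # (fraction of edge weights that are exactly zero, i.e. pruned edges).
    correct = sum((predict(genome, x) >= 0.5) == bool(y) for x, y in DATA)
    accuracy = correct / len(DATA)
    edge_weights = genome[1:]
    sparseness = sum(w == 0.0 for w in edge_weights) / len(edge_weights)
    return ALPHA * accuracy + (1.0 - ALPHA) * sparseness

def mutate(genome):
    # Either toggle a gene (prune it or re-insert it with a random value)
    # or perturb it with Gaussian noise.
    genome = list(genome)
    i = random.randrange(len(genome))
    if random.random() < 0.5:
        genome[i] = 0.0 if genome[i] != 0.0 else random.uniform(-2.0, 2.0)
    else:
        genome[i] += random.gauss(0.0, 0.5)
    return genome

def crossover(a, b):
    # Uniform crossover: each gene is inherited from either parent.
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

# Evolve a population of genomes (base score followed by one weight per edge).
population = [[random.uniform(-2.0, 2.0) for _ in range(N_FEATURES + 1)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]          # truncation selection
    children = []
    while len(parents) + len(children) < POP_SIZE:
        child = crossover(random.choice(parents), random.choice(parents))
        if random.random() < MUTATION_RATE:
            child = mutate(child)
        children.append(child)
    population = parents + children

best = max(population, key=fitness)
print("best genome:", [round(w, 2) for w in best],
      "fitness:", round(fitness(best), 3))
```

In this sketch the constant ALPHA plays the role of the balance described in the abstract: values close to 1 favour accuracy, while smaller values push the search towards sparser, and hence more interpretable, graphs.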
