Structure-based fitness prediction for the variable-structure DANNA neuromorphic architecture

In recent years, research on neuromorphic computing platforms has focused on variable-structure, spiking network models. An important methodology for programming these networks is evolutionary optimization (EO), in which thousands of networks are generated and then evaluated by computing fitness scores on specific tasks. Fitness scores guide the generation of new networks until a target fitness is achieved. One source of performance overhead during EO is simulating the task on each network to determine its fitness. To mitigate this overhead, we formulate the Static Fitness Prediction Task (SFPT): predicting a network's fitness without direct simulation. Our hypothesis is that SFPT predictions are accurate enough to reject a significant portion of networks during EO without simulating them, thereby making EO more efficient. We propose a data-driven approach to the SFPT on the neuromorphic model DANNA [1]. Our approach transforms networks into directed graphs and extracts structural features to train an ancillary model that predicts the fitness of new networks. We analyze the extracted features and evaluate several predictive models on five tasks. Our results demonstrate the predictive capacity of these features and models. Our primary contribution is demonstrating the utility of graph-level features extracted from variable-structure networks to predict network fitness and circumvent expensive simulations.
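The pipeline described above (network → directed graph → structural features → ancillary fitness predictor) can be sketched as follows. This is a minimal illustration, not the paper's exact setup: the feature set, the random-forest regressor, and the synthetic fitness labels are all assumptions made for the sake of a runnable example.

```python
# Sketch of structure-based fitness prediction: extract graph-level
# features from each candidate network and fit an ancillary regressor
# that estimates fitness without task simulation. Feature choices and
# the synthetic "fitness" below are illustrative assumptions only.
import random

import networkx as nx
from sklearn.ensemble import RandomForestRegressor


def graph_features(g):
    """Simple structural features of a directed network graph."""
    n = g.number_of_nodes()
    return [
        n,
        g.number_of_edges(),
        nx.density(g),
        sum(d for _, d in g.out_degree()) / max(n, 1),  # mean out-degree
        nx.number_weakly_connected_components(g),
    ]


# Stand-in population: random directed graphs with a synthetic fitness
# that loosely depends on structure (a real EO run would obtain fitness
# from task simulation instead).
random.seed(0)
population = [
    nx.gnp_random_graph(20, random.uniform(0.05, 0.3), seed=i, directed=True)
    for i in range(200)
]
fitness = [nx.density(g) * 10 + random.gauss(0, 0.1) for g in population]

X = [graph_features(g) for g in population]
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, fitness)

# Predict the fitness of a new candidate without simulating it; a
# low-scoring candidate could then be rejected before costly simulation.
candidate = nx.gnp_random_graph(20, 0.2, seed=999, directed=True)
predicted_fitness = model.predict([graph_features(candidate)])[0]
```

In an EO loop, such a predictor would act as a cheap pre-filter: only candidates whose predicted fitness clears a threshold are passed on to full simulation.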

[1] Vasant Dhar et al. Prediction in Economic Networks, 2014, Inf. Syst. Res.

[2] Wolfgang Maass et al. A Statistical Analysis of Information-Processing Properties of Lamina-specific Cortical Microcircuit Models, 2022.

[3] O. Sporns et al. Identification and Classification of Hubs in Brain Networks, 2007, PLoS ONE.

[4] Maumita Bhattacharya et al. Evolutionary Approaches to Expensive Optimisation, 2013, ArXiv.

[5] Gaël Varoquaux et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.

[6] Omid Bozorg Haddad et al. Algorithm for Increasing the Speed of Evolutionary Optimization and its Accuracy in Multi-objective Problems, 2013, Water Resources Management.

[7] Catherine D. Schuman et al. Dynamic Adaptive Neural Network Array, 2014, UCNC.

[8] Uri Alon et al. Varying environments can speed up evolution, 2007, Proceedings of the National Academy of Sciences.

[9] Karsten Beckmann et al. A Hafnium-Oxide Memristive Dynamic Adaptive Neural Network Array, 2016.

[10] R. D'Agostino. An omnibus test of normality for moderate and large size samples, 1971.

[11] E. S. Pearson et al. Tests for departure from normality. Empirical results for the distributions of b2 and √b1, 1973.

[12] Benjamin Schrauwen et al. An experimental unification of reservoir computing methods, 2007, Neural Networks.

[13] Olaf Sporns et al. Complex network measures of brain connectivity: Uses and interpretations, 2010, NeuroImage.

[14] Catherine D. Schuman. Neuroscience-Inspired Dynamic Architectures, 2015.

[15] Catherine D. Schuman et al. Parallel Evolutionary Optimization for Neuromorphic Network Training, 2016, 2016 2nd Workshop on Machine Learning in HPC Environments (MLHPC).

[16] Gábor Csárdi et al. The igraph software package for complex network research, 2006.

[17] Catherine D. Schuman et al. An evolutionary optimization framework for neural networks and neuromorphic architectures, 2016, 2016 International Joint Conference on Neural Networks (IJCNN).