Discovering Unexpected Local Nonlinear Interactions in Scientific Black-box Models

Scientific computational models are crucial for analyzing and understanding complex real-world systems that are otherwise difficult to study experimentally. However, the complex behavior and vast input-output space of these models often make them opaque, slowing the discovery of novel phenomena. In this work, we present HINT (Hessian INTerestingness) -- a new algorithm that automatically and systematically explores black-box models and highlights local nonlinear interactions in their input-output space. This tool aims to facilitate the discovery of interesting model behaviors that are unknown to the researchers. Using this simple yet powerful tool, we correctly ranked all pairwise interactions in known benchmark models, and did so faster and more accurately than state-of-the-art methods. We further applied HINT to existing computational neuroscience models and reproduced important scientific discoveries that were published years after those models were created. Finally, we ran HINT on two real-world models (in neuroscience and earth science) and found new model behaviors that were of value to domain experts.
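The core idea of ranking pairwise interactions by the off-diagonal entries of the local Hessian can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`cross_partial`, `rank_pairwise_interactions`), the central finite-difference estimator, the step size `h`, and the averaging over sampled points are all assumptions made for the example.

```python
import numpy as np

def cross_partial(f, x, i, j, h=1e-3):
    """Estimate the mixed partial derivative d^2 f / (dx_i dx_j)
    at point x with a central finite-difference stencil."""
    def shift(di, dj):
        y = x.copy()
        y[i] += di * h
        y[j] += dj * h
        return f(y)
    return (shift(1, 1) - shift(1, -1) - shift(-1, 1) + shift(-1, -1)) / (4.0 * h * h)

def rank_pairwise_interactions(f, points, dim):
    """Score each input pair by the mean |Hessian off-diagonal| over
    sampled points, then rank pairs from strongest to weakest."""
    scores = {}
    for i in range(dim):
        for j in range(i + 1, dim):
            vals = [abs(cross_partial(f, x, i, j)) for x in points]
            scores[(i, j)] = float(np.mean(vals))
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Toy black-box: strong x0*x1 interaction, no interaction involving x2.
f = lambda x: 5.0 * x[0] * x[1] + x[2] ** 2
pts = [np.random.uniform(-1.0, 1.0, 3) for _ in range(20)]
ranking = rank_pairwise_interactions(f, pts, 3)
```

For the toy function above, the pair (0, 1) is ranked first with an interaction score of about 5, while the pairs involving x2 score near zero, matching the analytic mixed partials.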
