Density-based clustering: A ‘landscape view’ of multi-channel neural data for inference and dynamic complexity analysis

Two partially interwoven topics in the analysis and statistical modeling of neural data are the development of efficient, informative representations of the time series derived from multiple neural recordings, and the extraction of information about the connectivity structure of the underlying neural network from the recorded activities. In the present paper we show that state-space clustering provides an easy and effective option for reducing the dimensionality of multiple neural time series, that it can improve the inference of synaptic couplings from neural activities, and that it allows the construction of a compact representation of the multi-dimensional dynamics which readily lends itself to complexity measures. We apply a variant of the ‘mean-shift’ algorithm to perform state-space clustering, and validate it on a Hopfield network in the glassy phase, in which metastable states are largely uncorrelated with the memories embedded in the synaptic matrix. In this context, we show that the neural states identified as cluster centroids offer a parsimonious parametrization of the synaptic matrix, which significantly improves the inference of synaptic couplings from the neural activities. Moving to the more realistic case of a multi-modular spiking network, with spike-frequency adaptation inducing history-dependent effects, we propose a procedure inspired by Boltzmann learning, but extending its domain of application, to learn inter-module synaptic couplings such that the spiking network reproduces a prescribed pattern of spatial correlations; we then illustrate how clustering effectively extracts the relevant features of the spiking network’s state-space landscape. Finally, we show that knowledge of the cluster structure allows casting the multi-dimensional neural dynamics as a symbolic dynamics of transitions between clusters; as an illustration of the potential of this reduction, we define and analyze a measure of complexity of the neural time series.
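
The abstract leaves the details of the mean-shift variant unspecified; as a rough illustration of density-peak clustering in neural state space, the sketch below runs scikit-learn's off-the-shelf MeanShift on synthetic binary population activity. The data, the 5% noise level, and the bandwidth are all assumptions for the example, not the paper's choices.

```python
# Minimal sketch of state-space clustering via mean-shift, using
# scikit-learn's MeanShift as a stand-in for the paper's variant.
# All data and parameters below are illustrative.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)

# Synthetic 'population activity': T time bins of N binary units that
# dwell in a few metastable states, corrupted by 5% bit-flip noise.
N, T, n_states = 50, 2000, 3
states = rng.integers(0, 2, size=(n_states, N)).astype(float)
visits = rng.integers(0, n_states, size=T)   # which state each bin visits
X = states[visits]                           # fancy indexing copies the rows
flips = rng.random((T, N)) < 0.05
X[flips] = 1.0 - X[flips]

# Mean-shift seeks the density peaks of the empirical state distribution;
# the kernel bandwidth is the one knob and would need tuning on real data.
ms = MeanShift(bandwidth=3.0).fit(X)
centroids = ms.cluster_centers_              # candidate metastable states
labels = ms.labels_                          # cluster index per time bin
print("clusters found:", centroids.shape[0])
```

In the Hopfield setting of the paper, the recovered centroids would then serve as the parsimonious parametrization of the synaptic matrix that the abstract describes, in place of the unknown stored memories.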
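
The coupling-learning procedure is described only through its Boltzmann-learning ancestry. As a hedged sketch of that ancestry, the code below implements the classic update dJ_ij = eta * (<s_i s_j>_target - <s_i s_j>_model) on a Glauber-dynamics Ising surrogate, which is far simpler than the paper's multi-modular spiking network with adaptation; sizes, rates, and sweep counts are illustrative.

```python
# Boltzmann-learning-style update: nudge each coupling by the mismatch
# between target and model pairwise correlations. A Glauber-dynamics
# Ising model stands in here for the paper's spiking network.
import numpy as np

rng = np.random.default_rng(1)
N, eta, n_epochs = 20, 0.1, 50

def glauber_correlations(J, n_sweeps=200, burn_in=50):
    """Estimate <s_i s_j> under Glauber dynamics at unit temperature."""
    s = rng.choice([-1.0, 1.0], size=N)
    C, kept = np.zeros((N, N)), 0
    for sweep in range(n_sweeps):
        for i in range(N):
            h = J[i] @ s                     # local field on spin i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h))
            s[i] = 1.0 if rng.random() < p_up else -1.0
        if sweep >= burn_in:
            C += np.outer(s, s)
            kept += 1
    return C / kept

# Prescribed correlation pattern (here generated by a random 'teacher' J).
J_teacher = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
J_teacher = (J_teacher + J_teacher.T) / 2.0
np.fill_diagonal(J_teacher, 0.0)
C_target = glauber_correlations(J_teacher)

J = np.zeros((N, N))                         # couplings to be learned
for epoch in range(n_epochs):
    dJ = eta * (C_target - glauber_correlations(J))
    np.fill_diagonal(dJ, 0.0)
    J += (dJ + dJ.T) / 2.0                   # keep couplings symmetric
```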
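
Once the cluster structure is known, the dynamics reduces to a sequence of cluster labels, and any symbolic complexity measure applies. The paper defines its own measure, not reproduced here; as a reference point, the sketch below computes the classic Lempel-Ziv (LZ76) phrase count in the easily calculable form of Kaspar and Schuster.

```python
# Lempel-Ziv (LZ76) complexity of a symbol sequence: the number of
# distinct phrases in its exhaustive parsing (Kaspar-Schuster algorithm).
def lz_complexity(s):
    """LZ76 phrase count of any indexable sequence of symbols."""
    n = len(s)
    if n < 2:
        return n
    i, k, l = 0, 1, 1      # i: candidate match start, k: match length,
    c, k_max = 1, 1        # l: parsed prefix length, c: phrase count
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:  # new part reproducible to the end of the sequence
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:     # no earlier start reproduces the new part further:
                c += 1     # close the phrase and advance the prefix boundary
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

# A periodic label sequence scores far lower than a random one.
import numpy as np
rng = np.random.default_rng(2)
periodic = [0, 1, 0, 2] * 250
random_labels = list(rng.integers(0, 3, size=1000))
print(lz_complexity(periodic), lz_complexity(random_labels))
```

Applied to the cluster-label sequence from the mean-shift sketch above (its `labels` array), this yields one scalar summary of the richness of the transitions between metastable states.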
