Representational Content of Oscillatory Brain Activity during Object Recognition: Contrasting Cortical and Deep Neural Network Hierarchies

Numerous theories propose a key role for brain oscillations in visual perception. Most of these theories postulate that sensory information is encoded in specific oscillatory components (e.g., power or phase) of specific frequency bands. These theories are often tested with whole-brain recording methods of low spatial resolution (EEG or MEG) or with depth recordings that provide a local, incomplete view of the brain. Opportunities to bridge the gap between local neural populations and whole-brain signals are rare. Here, using representational similarity analysis (RSA), we ask which MEG oscillatory components (power and phase, across various frequency bands) correspond to low- or high-level visual object representations, using brain representations from fMRI or layer-wise representations in Deep Neural Networks (DNNs) as templates for low- and high-level object representations. The results showed that around stimulus onset and offset, most transient oscillatory signals correlated with low-level brain patterns (V1). During stimulus presentation, sustained beta (∼20 Hz) and gamma (>60 Hz) power best correlated with V1, while oscillatory phase components correlated with IT representations. Surprisingly, this pattern of results did not always correspond to low- or high-level DNN layer activity. In particular, sustained beta-band oscillatory power reflected high-level DNN layers, suggestive of a feedback component. These results begin to bridge the gap between whole-brain oscillatory signals and object representations supported by local neuronal activations.
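
The core analysis step described above, comparing the representational geometry of an MEG oscillatory component (e.g., beta-band power at one time point) against a V1/IT fMRI template or a DNN layer, can be illustrated with a short sketch. The code below is a minimal illustration using synthetic data; the array sizes, variable names, and the choice of correlation-distance RDMs with a Spearman comparison are assumptions for the example, not the authors' exact pipeline.

```python
# Minimal RSA sketch (assumptions noted in comments): compute a representational
# dissimilarity matrix (RDM) from MEG condition patterns and compare it with a
# template RDM (e.g., fMRI V1/IT or a DNN layer) via rank correlation.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_conditions = 92          # number of object images (assumed)
n_meg_features = 306       # MEG sensors carrying power or phase values (assumed)
n_template_features = 500  # fMRI voxels in an ROI, or DNN layer units (assumed)

# One MEG feature pattern per condition, for a single time-frequency point
# (e.g., beta-band power at t = 200 ms); real data would replace this.
meg_patterns = rng.standard_normal((n_conditions, n_meg_features))

# Template patterns, e.g., V1/IT fMRI activations or a DNN layer's responses
# to the same images.
template_patterns = rng.standard_normal((n_conditions, n_template_features))

def rdm(patterns):
    """RDM as 1 - Pearson correlation between condition patterns,
    returned in condensed (upper-triangle) form."""
    return pdist(patterns, metric="correlation")

# Rank-correlate the two RDMs; higher values mean the MEG component carries a
# representational geometry closer to the template.
rho, p = spearmanr(rdm(meg_patterns), rdm(template_patterns))
print(f"MEG-template RDM similarity: rho = {rho:.3f}, p = {p:.3g}")
```

In the study's logic, this comparison would be repeated for every oscillatory component (power and phase, per frequency band and time point) and every template (V1, IT, or each DNN layer), yielding time-frequency maps of representational correspondence.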
