Body Form Modulates the Prediction of Human and Artificial Behaviour from Gaze Observation

[1] Emily S. Cross, et al. Human but not robotic gaze facilitates action prediction, 2022, iScience.

[2] Michael J. Richardson, et al. Gaze facilitates responsivity during hand coordinated joint attention, 2021, Scientific Reports.

[3] L. Fogassi, et al. Decoding grip type and action goal during the observation of reaching-grasping actions: A multivariate fMRI study, 2021, NeuroImage.

[4] Indrajeet Patil, et al. performance: An R Package for Assessment, Comparison and Testing of Statistical Models, 2021, Journal of Open Source Software.

[5] A. Bayliss, et al. From Gaze Perception to Social Cognition: The Shared-Attention System, 2021, Perspectives on Psychological Science.

[6] Mattan S. Ben-Shachar, et al. effectsize: Estimation of Effect Size Indices and Standardized Parameters, 2020, Journal of Open Source Software.

[7] C. Urgesi, et al. Autistic Traits Differently Account for Context-Based Predictions of Physical and Social Events, 2020, Brain Sciences.

[8] C. Urgesi, et al. Spatial frequency tuning of motor responses reveals differential contribution of dorsal and ventral systems to action comprehension, 2020, Proceedings of the National Academy of Sciences.

[9] Ramón D. Castillo, et al. How Do Object Shape, Semantic Cues, and Apparent Velocity Affect the Attribution of Intentionality to Figures With Different Types of Movements?, 2020, Frontiers in Psychology.

[10] Rossitza Setchi, et al. Explainable Robotics in Human-Robot Interactions, 2020, KES.

[11] G. Bird, et al. Conceptualizing and testing action understanding, 2019, Neuroscience & Biobehavioral Reviews.

[12] G. McArthur, et al. The mind minds minds: The effect of intentional stance on the neural encoding of joint attention, 2019, Cognitive, Affective, & Behavioral Neuroscience.

[13] J. Gray, et al. PsychoPy2: Experiments in behavior made easy, 2019, Behavior Research Methods.

[14] Richard Ramsey, et al. Neural Integration in Body Perception, 2018, Journal of Cognitive Neuroscience.

[15] Kirstie J. Whitaker, et al. Raincloud plots: a multi-platform tool for robust data visualization, 2018, PeerJ Preprints.

[16] Christian Keysers, et al. Where and how our brain represents the temporal structure of observed action, 2018, NeuroImage.

[17] Stefan Palan, et al. Prolific.ac—A subject pool for online experiments, 2017.

[18] J. Brock, et al. Human agency beliefs influence behaviour during virtual social interactions, 2017, PeerJ.

[19] Arielle R. Mandell, et al. Mind Perception in Humanoid Agents has Negative Effects on Cognitive Processing, 2017.

[20] Chen Yu, et al. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention, 2017, Cognitive Science.

[21] Patric Bach, et al. Looking ahead: Anticipatory cueing of attention to objects others will look at, 2016, Cognitive Neuroscience.

[22] Natalie A. Wyer, et al. The Things You Do: Internal Models of Others' Expected Behaviour Guide Action Observation, 2016, PLoS ONE.

[23] C. Becchio, et al. Altercentric interference in level 1 visual perspective taking reflects the ascription of mental states, not submentalizing, 2016, Journal of Experimental Psychology: Human Perception and Performance.

[24] Emily S. Cross, et al. The shaping of social perception by stimulus and knowledge cues to human animacy, 2016, Philosophical Transactions of the Royal Society B: Biological Sciences.

[25] Juan Manuel Contreras, et al. Neural evidence that three dimensions organize mental state representation: Rationality, social impact, and valence, 2015, Proceedings of the National Academy of Sciences.

[26] J. Hietanen, et al. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at, 2015, Visual Cognition.

[27] Hiroshi Ishiguro, et al. Robot Form and Motion Influences Social Attention, 2015, 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[28] C. Keysers, et al. The role of the theory of mind network in action observation - an rTMS study, 2015, Brain Stimulation.

[29] Timothy R. Brick, et al. Is This Car Looking at You? How Anthropomorphism Predicts Fusiform Face Area Activation when Seeing Cars, 2014, PLoS ONE.

[30] Emily S. Cross, et al. The Control of Automatic Imitation Based on Bottom–Up and Top–Down Cues to Animacy: Insights from Brain and Behavior, 2014, Journal of Cognitive Neuroscience.

[31] D. Bates, et al. Fitting Linear Mixed-Effects Models Using lme4, 2014, arXiv:1406.5823.

[32] Patric Bach, et al. The affordance-matching hypothesis: how objects guide action understanding and prediction, 2014, Frontiers in Human Neuroscience.

[33] E. Wiese, et al. Beliefs about the Minds of Others Influence How We Process Sensory Information, 2014, PLoS ONE.

[34] Luca Turella, et al. Corticospinal Facilitation during Observation of Graspable Objects: A Transcranial Magnetic Stimulation Study, 2012, PLoS ONE.

[35] Jan Zwickel, et al. I See What You Mean: How Attentional Selection Is Shaped by Ascribing Intentions to Others, 2012, PLoS ONE.

[36] Emily S. Cross, et al. Robotic movement preferentially engages the action observation network, 2012, Human Brain Mapping.

[37] Richard Ramsey, et al. Predicting others' actions via grasp and gaze: evidence for distinct brain networks, 2012, Psychological Research.

[38] V. Manera, et al. Grasping intentions: from thought experiments to empirical evidence, 2012, Frontiers in Human Neuroscience.

[39] Scott T. Grafton, et al. Decoding intention: A neuroergonomic perspective, 2012, NeuroImage.

[40] P. Fletcher, et al. Seeing other minds: attributed mental states influence perception, 2010, Trends in Cognitive Sciences.

[41] Dana Kulic, et al. Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots, 2009, International Journal of Social Robotics.

[42] James Stanley, et al. Effects of Agency on Movement Interference During Observation of a Moving Dot Stimulus, 2007, Journal of Experimental Psychology: Human Perception and Performance.

[43] Luca Turella, et al. When Gaze Turns into Grasp, 2006, Journal of Cognitive Neuroscience.

[44] P. Downing, et al. Why does the gaze of others direct visual attention?, 2004.

[45] C. Frith, et al. “Hey John”: Signals Conveying Communicative Intention toward the Self Activate Brain Regions Associated with “Mentalizing,” Regardless of Modality, 2003, The Journal of Neuroscience.

[46] Scott T. Grafton, et al. Graspable objects grab attention when the potential for action is recognized, 2003, Nature Neuroscience.

[47] Thomas W. Schubert, et al. Overlap of Self, Ingroup, and Outgroup: Pictorial Measures of Self-Categorization, 2002.

[48] S. Baron-Cohen, et al. Gaze Perception Triggers Reflexive Visuospatial Orienting, 1999.

[49] A. Raftery. Bayesian Model Selection in Social Research, 1995.

[50] J. Freyd, et al. Apparent Motion of the Human Body, 1990.