Group benefits in joint perceptual tasks—a review

In daily life, humans often perform perceptual tasks together to reach a shared goal. In these situations, individuals may collaborate (e.g., by distributing task demands) to perform the task better than either would alone, that is, to attain a group benefit. In this review, we identify the factors that influence whether, and to what extent, a group benefit is attained, and we provide a framework of measures for assessing group benefits in perceptual tasks. In particular, we integrate findings from two frequently investigated joint perceptual tasks: visuospatial tasks and perceptual decision-making tasks. For both task types, we find that an exchange of information between coactors is critical for improving joint performance. Yet, the type of information exchanged and the way coactors collaborate differ between tasks. In visuospatial tasks, coactors exchange information about their performed actions in order to distribute task demands. In perceptual decision-making tasks, coactors exchange their confidence in their individual perceptual judgments in order to negotiate a joint decision. We argue that these differences can be explained by the task structure: coactors distribute task demands when a joint task allows for a spatial division of labor and the stimuli can be processed accurately by a single individual; otherwise, they perform the task individually and then integrate their individual judgments.
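The integration of individual judgments into a joint decision, and the group-benefit measure itself, can be made concrete with a short simulation. The sketch below is a minimal illustration under simple signal-detection assumptions, not code from any of the studies reviewed: each observer's percept is taken to be the true signal plus Gaussian noise whose spread is the inverse of that observer's sensitivity, the dyad weights each percept by the observer's sensitivity as a stand-in for communicated confidence, and the group benefit is quantified as dyadic accuracy relative to the better individual's accuracy. All names and parameter values (simulate_dyad, s1, s2, n_trials) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(seed=1)

def simulate_dyad(s1, s2, n_trials=200_000):
    # Each trial presents a binary signal (+1 or -1).
    signal = rng.choice([-1.0, 1.0], size=n_trials)
    # Observer i perceives the signal corrupted by Gaussian noise;
    # a higher sensitivity s_i means less noisy percepts.
    x1 = signal + rng.normal(0.0, 1.0 / s1, n_trials)
    x2 = signal + rng.normal(0.0, 1.0 / s2, n_trials)
    acc1 = np.mean(np.sign(x1) == signal)
    acc2 = np.mean(np.sign(x2) == signal)
    # Joint decision: percepts weighted by sensitivity, standing in for
    # each coactor's communicated confidence in their own judgment.
    acc_dyad = np.mean(np.sign(s1 * x1 + s2 * x2) == signal)
    return acc1, acc2, acc_dyad

for s1, s2 in [(1.0, 1.0), (1.0, 0.4)]:
    acc1, acc2, acc_dyad = simulate_dyad(s1, s2)
    benefit = acc_dyad / max(acc1, acc2)  # >1 indicates a group benefit
    print(f"s1={s1:.1f} s2={s2:.1f}  best individual={max(acc1, acc2):.3f}  "
          f"dyad={acc_dyad:.3f}  group benefit={benefit:.3f}")

On this toy model, a dyad of equally sensitive observers outperforms the better individual (group benefit above 1), whereas a dyad with strongly mismatched sensitivities can fall below the better individual, mirroring the finding in this literature that similarities in individual performance predict collective benefits.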
