Assessing Collaborative Physical Tasks Via Gestural Analysis

Recent studies have shown that gestures are useful indicators of understanding, learning, and memory retention. However, especially in collaborative settings, current metrics for estimating task understanding often neglect the information expressed through gestures. This work introduces the physical instruction assimilation (PIA) metric, a novel approach that estimates task understanding by analyzing how collaborators use gestures to convey, assimilate, and execute physical instructions. PIA estimates task understanding by inspecting the number of gestures required to complete a shared task. PIA is calculated on top of the multiagent gestural instruction comparer (MAGIC) architecture, a previously proposed framework for representing, assessing, and comparing gestures. To evaluate our metric, we collected gestures from collaborators remotely completing three tasks: block assembly, origami, and ultrasound training. The PIA scores of these individuals are compared against two other metrics used to estimate task understanding: the number of errors and the amount of idle time during the task. Statistically significant correlations are found between PIA and both metrics. Additionally, a Taguchi design is used to evaluate PIA's sensitivity to changes in the MAGIC architecture; the design's factors capture changes in the timing, order, and motion trajectories of the collaborators' gestures. PIA is shown to be robust to these changes, exhibiting an average mean change of 0.45. These results suggest that gestures, in the form of the assimilation of physical instructions, can reveal insights into task understanding and complement other commonly used metrics.
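The abstract does not spell out PIA's formula, so the sketch below is only a plausible stand-in under assumed definitions: it scores each collaborator by the ratio of gestures required to gestures actually used, then correlates those scores with the two comparison metrics (error count and idle time) using Pearson's r, the classical correlation coefficient. The pia_score helper and all data are hypothetical; the actual metric is derived from the MAGIC architecture, whose details are outside this abstract.

```python
# Minimal sketch, NOT the paper's implementation. Assumes a simplified,
# efficiency-style PIA: 1.0 when a collaborator completes the task with no
# superfluous gestures, lower as extra gestures are needed. The real metric
# is computed via the MAGIC architecture.
from scipy.stats import pearsonr


def pia_score(gestures_required: int, gestures_used: int) -> float:
    """Hypothetical PIA-like score in [0, 1]."""
    if gestures_used == 0:
        return 0.0
    return min(gestures_required / gestures_used, 1.0)


# Toy per-collaborator data (fabricated for illustration only).
scores = [pia_score(req, used) for req, used in [(9, 10), (9, 16), (9, 9)]]
errors = [2, 6, 1]              # errors committed during the shared task
idle_time = [35.0, 80.0, 20.0]  # seconds spent idle during the task

# The paper compares PIA against these two metrics; Pearson's r tests for a
# (statistically significant) linear correlation.
r_err, p_err = pearsonr(scores, errors)
r_idle, p_idle = pearsonr(scores, idle_time)
print(f"PIA vs errors:    r={r_err:.2f} (p={p_err:.3f})")
print(f"PIA vs idle time: r={r_idle:.2f} (p={p_idle:.3f})")
```

Under this toy definition, a negative r would be the expected direction: collaborators with higher PIA (fewer superfluous gestures) commit fewer errors and idle less.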
