Behavioral informatics from multimodal human interaction cues

shri@sipi.usc.edu

Abstract

The confluence of advances in sensing, communication and computing technologies is allowing the capture of, and access to, data on human interaction and its context, in diverse forms and modalities, in ways that were unimaginable even a few years ago. Importantly, these data afford the analysis and interpretation of multimodal cues of verbal and non-verbal human behaviour. These signals carry crucial information not only about a person's intent and identity but also about underlying attitudes and emotions. Automatically capturing these cues, although vastly challenging, offers the promise not just of efficient data processing but of tools for discovery that enable hitherto unimagined insights. Recent computational approaches that leverage judicious use of both data and domain knowledge have shown promising advances in this regard, for example in deriving rich information about behavioural constructs. This talk will focus on some of the advances (and challenges) in gathering such data and creating algorithms for machine processing of such cues. It will highlight some of our ongoing efforts in Behavioural Signal Processing (BSP): technology and algorithms for quantitatively and objectively understanding typical, atypical and distressed human behaviour, with a specific focus on communicative, affective and social behaviour. The talk will illustrate Behavioural Informatics applications of these techniques that contribute to quantifying higher-level, often subjectively described, human behaviour in a domain-sensitive fashion, using examples from Autism, Couple therapy and Addiction counselling.
