How to Address Smart Homes with a Social Robot? A Multi-modal Corpus of User Interactions with an Intelligent Environment

In order to explore intuitive verbal and non-verbal interfaces in smart environments, we recorded user interactions with an intelligent apartment. Besides offering various interactive capabilities itself, the apartment is also inhabited by a social robot that serves as a humanoid interface. This paper presents a multi-modal corpus containing the goal-directed actions of naive users attempting to solve a number of predefined tasks. Alongside audio and video recordings, our dataset comprises a large amount of temporally aligned sensory data and system behavior provided by the environment and its interactive components. Non-verbal system responses, such as changes in lighting or display contents, as well as robot and apartment utterances and gestures, serve as a rich basis for later in-depth analysis. Manual annotations provide further information on metadata, such as the current course of the study, and on user behavior, including the modality used, all literal utterances, language features, emotional expressions, foci of attention, and addressees.
