Speech as a feedback modality for smart objects

Part of the vision of ubiquitous computing is the integration of sensing and actuation nodes into everyday objects, into clothing worn on the body, and, in large numbers, into the environment. Such augmented environments require novel interfaces that support naturalistic interaction and adapt to the user's context. In this paper, we investigate the use of speech synthesis on sensor nodes that may be integrated into smart objects. We evaluate the Wireless Voice Node, a small wireless sensor node capable of generating voice output, as a novel feedback modality for ambient intelligence applications. As an example, we present the design and implementation of a speaking doll integrating this node. Through the doll's synthesized speech, we aim to improve children's play experience.
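
To make the idea of speech as a feedback modality concrete, the following is a minimal, hypothetical sketch of how a smart-object node such as the speaking doll could map a sensor event to a spoken phrase. The function names, threshold, and phrases are illustrative assumptions, not the paper's actual API; the hardware calls are stubbed so the sketch runs on a desktop machine.

```c
/*
 * Hypothetical sketch: map a sensor event on a smart object to spoken
 * feedback, in the spirit of the Wireless Voice Node. All names are
 * illustrative; the hardware interface is stubbed for demonstration.
 */
#include <stdio.h>
#include <math.h>

/* Stub: pretend to read a 3-axis accelerometer inside the doll (in g). */
static void read_accelerometer(double *x, double *y, double *z) {
    *x = 0.1; *y = 0.2; *z = 1.8;   /* fake "the doll was shaken" sample */
}

/* Stub: on the real node this would drive the speech synthesizer. */
static void speak(const char *phrase) {
    printf("[TTS] %s\n", phrase);
}

int main(void) {
    double x, y, z;
    read_accelerometer(&x, &y, &z);

    /* Simple event detection: total acceleration well above 1 g is
       interpreted as the child shaking the doll. */
    double magnitude = sqrt(x * x + y * y + z * z);
    if (magnitude > 1.5)
        speak("Whee, that tickles!");
    else
        speak("Hello, let's play!");

    return 0;
}
```

In a deployment, the phrase selection would run on the node itself and the spoken output would replace or complement LED and beep feedback, which is the core argument of the paper.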
