Touch Versus In-Air Hand Gestures: Evaluating the Acceptance by Seniors of Human-Robot Interaction

Do elderly people prefer performing in-air gestures or pressing on-screen buttons when interacting with an assistive robot? This study addresses this question by measuring the acceptance, performance, and knowledge of both interaction modalities in a scenario in which elderly participants interacted with an assistive robot. Two interaction modalities were compared: in-air gestures and touch. A scenario was chosen in which the seniors perform exercises aimed at improving their lifestyle behavior. In this scenario, the senior stands in front of the assistive robot, which displays a series of exercises on its screen; after successfully completing an exercise, the senior navigates to the next or previous one. No significant differences were found between the interaction modalities on the technology-acceptance measures of effort, ease of use, anxiety, performance, and attitude. Scores on these measures were high for both modalities, indicating that both were accepted by the elderly participants. In a final interview, participants reacted more positively to the use of in-air gestures.
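The navigation flow described above can be illustrated with a minimal sketch. The Python snippet below is not the authors' implementation; the gesture labels (swipe_left, swipe_right), button identifiers, and the ExerciseNavigator class are hypothetical names chosen only to show how both modalities could map onto the same next/previous exercise navigation logic.

```python
from enum import Enum, auto


class Command(Enum):
    """Navigation commands a senior can issue in the exercise scenario."""
    NEXT = auto()
    PREVIOUS = auto()
    NONE = auto()


def command_from_touch(button_id: str) -> Command:
    """Map a pressed on-screen button to a navigation command (hypothetical IDs)."""
    mapping = {"next_button": Command.NEXT, "previous_button": Command.PREVIOUS}
    return mapping.get(button_id, Command.NONE)


def command_from_gesture(gesture_label: str) -> Command:
    """Map a recognized in-air hand gesture to a navigation command (hypothetical labels)."""
    mapping = {"swipe_left": Command.NEXT, "swipe_right": Command.PREVIOUS}
    return mapping.get(gesture_label, Command.NONE)


class ExerciseNavigator:
    """Keeps track of which exercise is currently shown on the robot's screen."""

    def __init__(self, exercises: list[str]) -> None:
        self.exercises = exercises
        self.index = 0

    @property
    def current(self) -> str:
        return self.exercises[self.index]

    def apply(self, command: Command) -> str:
        # Move forward or backward through the exercise list,
        # clamping at the first and last exercise.
        if command is Command.NEXT:
            self.index = min(self.index + 1, len(self.exercises) - 1)
        elif command is Command.PREVIOUS:
            self.index = max(self.index - 1, 0)
        return self.current


if __name__ == "__main__":
    navigator = ExerciseNavigator(["arm raise", "knee lift", "side bend"])
    # The same navigation logic serves both modalities: only the
    # front-end event (touch press vs. recognized gesture) differs.
    print(navigator.apply(command_from_touch("next_button")))    # knee lift
    print(navigator.apply(command_from_gesture("swipe_left")))   # side bend
    print(navigator.apply(command_from_gesture("swipe_right")))  # knee lift
```

The point of the sketch is that the comparison in the study concerns the input modality alone: once a touch press or a recognized gesture is translated into a navigation command, the rest of the interaction is identical for both conditions.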
