Personalized Robot Assistant for Support in Dressing

Robot-assisted dressing involves close physical interaction with users who may have a wide range of physical characteristics and abilities. The design of user-adaptive and personalized robots in this context still shows limited, or no, consideration of specific user-related issues. This paper describes the development of a multimodal robotic system for a specific dressing scenario, putting on a shoe, in which users' personalized inputs contribute to a much improved task success rate. We have developed: 1) user-tracking, gesture-recognition, and posture-recognition algorithms relying on images provided by a depth camera; 2) a shoe-recognition algorithm based on RGB and depth images; and 3) speech-recognition and text-to-speech algorithms that enable verbal interaction between the robot and the user. The interaction is further enhanced by calibrated recognition of the user's pointing gestures and by an adjusted shoe-delivery position. A series of shoe-fitting experiments was performed on two groups of users, with and without prior robot personalization, to assess how personalization affects interaction performance. Our results show that the personalized robot completes the shoe-fitting task in a shorter time, with fewer user commands and a reduced workload.
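A common way to turn a depth-camera pointing gesture into a delivery position is to cast a ray through the user's elbow and wrist joints and intersect it with the floor plane. The sketch below illustrates this idea under assumed conventions (3D joint coordinates in meters, z as the vertical axis, floor at z = 0); the function name and interface are hypothetical, not the paper's implementation.

```python
def pointing_target(elbow, wrist):
    """Estimate the floor point a user is pointing at.

    elbow, wrist: (x, y, z) joint positions from a depth-camera
    skeleton tracker, with z measured upward from the floor.
    Returns the (x, y, z) intersection of the elbow->wrist ray
    with the floor plane z = 0, or None if the arm points upward.
    """
    # Direction of the pointing ray, from elbow through wrist.
    dx = wrist[0] - elbow[0]
    dy = wrist[1] - elbow[1]
    dz = wrist[2] - elbow[2]
    if dz >= 0:
        return None  # ray never reaches the floor
    # Solve wrist_z + t * dz = 0 for the ray parameter t.
    t = -wrist[2] / dz
    return (wrist[0] + t * dx, wrist[1] + t * dy, wrist[2] + t * dz)
```

In practice such an estimate would be refined by the per-user calibration the paper describes, since individual pointing styles introduce systematic offsets between the geometric ray and the intended target.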
