Advanced Usability Through Constrained Multi Modal Interactive Strategies: The CookieBot

Service robots can now perform a variety of tasks and are used in many different applications. As a result, people with different backgrounds, including those without any robotics experience, need to interact with them. To enable the user to control the motion of the robot end-effector, it is important to provide an easy and intuitive interface. In this work we propose an intuitive method for controlling the position and orientation of a robot TCP. The method takes the robot kinematics into account in order to avoid dangerous configurations and defines rotational constraints. The user interacts with the robot and controls its end-effector through a set of objects tracked by a camera system. The autonomy level of the robot changes across the different phases of the interaction to improve efficiency. An intuitive GUI eases the interaction and helps the user achieve better precision in the control, also by scaling the tracked motion, which is presented to the user as visual feedback. We tested the system in multiple experiments that assessed both how people with no robotics experience interact with the robot and the precision of the method.
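
To make the scaling and constraint ideas concrete, the sketch below shows one possible way to map a tracked object displacement to a scaled, workspace-limited TCP position command and to clamp the commanded orientation. It is a minimal illustration only: the function names, the scaling factor, the axis-aligned workspace bounds, and the clamping strategy are assumptions for this example, not the implementation described in the paper.

import numpy as np

def scaled_tcp_command(tracked_delta, current_tcp, scale=0.25,
                       workspace_min=None, workspace_max=None):
    # Map a tracked object displacement (m) to a scaled TCP position command.
    # A scale below 1 trades range of motion for precision, as in the GUI feedback.
    target = current_tcp + scale * np.asarray(tracked_delta, dtype=float)
    if workspace_min is not None and workspace_max is not None:
        # Clamp the command inside an allowed workspace (a crude stand-in for
        # kinematics-aware safety checks) so the user keeps continuous control.
        target = np.clip(target, workspace_min, workspace_max)
    return target

def constrained_orientation(tracked_rpy, allowed_axes=(False, False, True),
                            limit=np.pi / 4):
    # Apply rotational constraints: only axes enabled in allowed_axes follow the
    # tracked orientation, and each is clamped to +/- limit radians.
    rpy = np.asarray(tracked_rpy, dtype=float)
    mask = np.asarray(allowed_axes, dtype=float)
    return np.clip(rpy * mask, -limit, limit)

# Example: a 10 cm hand motion becomes a 2.5 cm TCP motion, and only yaw is followed.
tcp_goal = scaled_tcp_command([0.10, 0.0, 0.0], np.array([0.4, 0.0, 0.3]),
                              scale=0.25,
                              workspace_min=np.array([0.2, -0.3, 0.1]),
                              workspace_max=np.array([0.6, 0.3, 0.5]))
rpy_goal = constrained_orientation([0.2, -0.1, 0.5])
print(tcp_goal, rpy_goal)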
