Investigation of human-robot interface performance in household environments

Today, assistive robots are being introduced into human environments at an increasing rate. These environments are highly cluttered and dynamic, making it difficult to foresee all necessary capabilities and to pre-program all desirable future skills of a robot. One approach to increasing robot performance is semi-autonomous operation, which allows users to intervene and guide the robot through difficult tasks. To this end, robots need intuitive Human-Machine Interfaces (HMIs) that support fine motion control without overwhelming the operator. In this study, we evaluate the performance of several interfaces that balance autonomy and teleoperation of a mobile manipulator across a set of household tasks. Our proposed HMI framework includes teleoperation devices such as a tablet, as well as physical interfaces in the form of piezoresistive pressure sensor arrays. Mobile manipulation experiments were performed with a sensorized KUKA youBot, an omnidirectional platform with a 5 degree-of-freedom (DOF) arm. The pick-and-place tasks involved navigation and manipulation of objects in household environments. Performance metrics included task completion time and position accuracy.
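To make the semi-autonomous teleoperation concrete, the sketch below shows a minimal ROS node that relays tablet velocity commands to the youBot's omnidirectional base while saturating them for safety. This is an illustration only, assuming a ROS-based setup like the one the platform implies; the topic names (/tablet/cmd, /cmd_vel) and the velocity caps are hypothetical, not details taken from the study.

    #!/usr/bin/env python
    # Minimal teleoperation relay sketch (hypothetical topics and gains).
    # Subscribes to velocity commands from a tablet interface and republishes
    # them to the mobile base, clamped so operator input stays within safe limits.
    import rospy
    from geometry_msgs.msg import Twist

    MAX_LINEAR = 0.3   # m/s   -- assumed safety cap for the omnidirectional base
    MAX_ANGULAR = 0.5  # rad/s -- assumed safety cap for rotation

    def clamp(value, limit):
        """Saturate a command so it cannot exceed the configured cap."""
        return max(-limit, min(limit, value))

    def on_tablet_cmd(msg):
        """Relay one tablet velocity command to the base, with saturation."""
        out = Twist()
        out.linear.x = clamp(msg.linear.x, MAX_LINEAR)
        out.linear.y = clamp(msg.linear.y, MAX_LINEAR)  # sideways motion: omni base
        out.angular.z = clamp(msg.angular.z, MAX_ANGULAR)
        pub.publish(out)

    if __name__ == "__main__":
        rospy.init_node("tablet_teleop_relay")
        pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/tablet/cmd", Twist, on_tablet_cmd)
        rospy.spin()

The saturation step reflects the design goal stated above: the operator can command fine motions directly, while the relay keeps inputs bounded so guidance never drives the platform outside safe limits.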