An empirical study with simulated ADL tasks using a vision-guided assistive robot arm

In this paper, we describe an empirical study in which healthy subjects performed simulated activities of daily living (ADL) tasks using UCF-ARM, a 6-DOF assistive robot that is visually guided through a calibrated stereo camera system fitted in the gripper of Exact Dynamics' Manus ARM. The goal of the research is to reduce the time to task completion and the cognitive burden on users interacting with an unstructured environment via a Wheelchair-Mounted Robotic Arm (WMRA). Our WMRA, UCF-ARM, provides access to a multimodal, user-customizable human-computer interface, accomplishes visual servoing through an in-hand stereo rig, and applies adaptive grip force through force-sensing resistors embedded in the fingers of the gripper. The choice of user interface depends on the user's functional level and injury. Two-level object-grasping tasks are used to assess time to task completion and cognitive burden as users evaluate the system under different tasks, control modes, and interface modalities. Results of the statistical analysis are provided to compare the advantages and disadvantages of the evaluated options.
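
As a concrete illustration of the adaptive grip behavior mentioned above, the sketch below shows a minimal force-feedback loop driven by a fingertip force-sensing resistor (FSR). The hardware interface functions (`read_fsr_force`, `set_gripper_velocity`), the target force, gain, and loop rate are hypothetical placeholders, not the actual UCF-ARM API or parameters.

```python
import time

# Assumed constants; the study's actual force target, gain, and loop rate
# are not given in the abstract.
FORCE_TARGET_N = 2.0     # desired fingertip force (newtons)
DEADBAND_N = 0.1         # acceptable error band around the target
KP = 0.5                 # proportional gain from force error to closing speed
LOOP_PERIOD_S = 0.02     # 50 Hz control loop


def read_fsr_force() -> float:
    """Hypothetical placeholder: return the fingertip force sensed by the FSR."""
    raise NotImplementedError("replace with the real sensor interface")


def set_gripper_velocity(v: float) -> None:
    """Hypothetical placeholder: command gripper closing (+) or opening (-) speed."""
    raise NotImplementedError("replace with the real gripper interface")


def adaptive_grip() -> None:
    """Close the gripper until the sensed force reaches the target, then stop."""
    while True:
        error = FORCE_TARGET_N - read_fsr_force()
        if abs(error) < DEADBAND_N:
            set_gripper_velocity(0.0)     # target force reached; hold the object
            return
        set_gripper_velocity(KP * error)  # close harder while under the target
        time.sleep(LOOP_PERIOD_S)
```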
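
The abstract also refers to a statistical comparison of task-completion times across conditions. The snippet below sketches one such paired comparison with a Wilcoxon signed-rank test; the numbers are made-up placeholders included only to make the example runnable and do not reflect the study's data or results.

```python
import numpy as np
from scipy.stats import wilcoxon

# Illustrative per-subject completion times (seconds) under two control modes.
# These values are placeholders, not data from the study.
manual_mode   = np.array([182.0, 240.5, 210.3, 198.7, 225.1])
vision_guided = np.array([120.4, 150.2, 138.9, 142.6, 131.0])

# Paired, non-parametric comparison of the two modes.
statistic, p_value = wilcoxon(manual_mode, vision_guided)
print(f"Wilcoxon signed-rank: W = {statistic:.1f}, p = {p_value:.4f}")
```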
