Continuous vocalization control of a full-scale assistive robot

We present a physical robotic arm performing real-world tasks using continuous non-verbal vocalizations for control. Vocalization control offers fewer degrees of control freedom than are needed to directly command a complex robotic platform. To bridge this gap, we evaluated three control methods: direct joint-angle control of a selectable subset of joints, inverse kinematics control of the end effector, and control in a reduced-dimensionality synergy space. The synergy method is inspired by neural solutions to the redundancy problem of biological bodies. We conducted several evaluations of the three methods on the real-world tasks of recycling water bottles and moving grocery bags. Users with no prior exposure to the system were able to perform these tasks effectively and learned to become more efficient. This study demonstrates the feasibility of continuous non-verbal vocalizations for controlling a full-scale assistive robot in a realistic context.
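To make the reduced-dimensionality synergy idea concrete, the sketch below shows one plausible way a low-dimensional continuous vocal command (for example, a 2D vowel-space position scaled by loudness) could be mapped to joint velocities of a redundant arm through a synergy basis learned from sampled postures. The PCA-based synergy construction, the function names, and the 7-DOF toy data are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch (assumptions noted above): mapping a low-DOF continuous
# vocal command onto high-DOF joint velocities via a learned synergy basis.
import numpy as np


def learn_synergies(posture_samples: np.ndarray, n_synergies: int = 2) -> np.ndarray:
    """Extract a synergy basis from sampled joint postures.

    posture_samples: (n_samples, n_joints) array of joint angles.
    Returns an (n_synergies, n_joints) basis of principal joint-space directions.
    """
    centered = posture_samples - posture_samples.mean(axis=0)
    # SVD of the centered data yields the dominant directions of variation.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:n_synergies]


def vocal_to_joint_velocity(vocal_cmd: np.ndarray, synergy_basis: np.ndarray,
                            gain: float = 0.5) -> np.ndarray:
    """Project a low-DOF vocal command onto joint velocities along the synergies."""
    return gain * synergy_basis.T @ vocal_cmd


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo_postures = rng.normal(size=(200, 7))   # toy data for a 7-DOF arm
    basis = learn_synergies(demo_postures, n_synergies=2)
    cmd = np.array([0.8, -0.3])                 # 2D continuous vocal input
    print(vocal_to_joint_velocity(cmd, basis))  # 7-vector of joint velocities
```

Under this framing, the user only ever steers two continuous values, while the synergy basis decides how that motion is distributed across the arm's redundant joints.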
