From Bot to Bot: Using a Chat Bot to Synthesize Robot Motion

We present Bot to Bot, a system that lets developers write voice-controlled applications in a high-level language while retaining portability across a variety of robot hardware platforms. In this paper we describe how Bot to Bot leverages advances in natural language processing and robotic control to take a user's voice command and translate it into a structured intent for the robot through the following intermediate representations: verbal bites, robot assembly, and robot control primitives. Our long-term goal is to identify a verbal instruction set for human-robot interaction. We provide our software as open source to encourage future research.
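The translation pipeline described above can be sketched in code. This is a minimal illustrative sketch, not the authors' actual implementation: all function names, the keyword-based parsing, and the primitive tables are assumptions introduced here to show how an utterance might flow through the three intermediate representations.

```python
# Hypothetical sketch of the Bot to Bot pipeline: voice command ->
# verbal bites -> robot assembly -> robot control primitives.
# All names and mappings below are illustrative assumptions.

def parse_verbal_bites(utterance: str) -> list:
    """Split a voice command into short actionable phrases ("verbal bites")."""
    return [bite.strip() for bite in utterance.lower().split("then")]

def bites_to_assembly(bites: list) -> list:
    """Map each verbal bite to a hardware-independent assembly operation."""
    verbs = {"move": "MOVE", "grab": "GRASP", "release": "RELEASE"}
    ops = []
    for bite in bites:
        for word, op in verbs.items():
            if word in bite:
                ops.append((op, bite))
    return ops

def assembly_to_primitives(ops: list) -> list:
    """Expand assembly operations into platform-level control primitives."""
    table = {
        "MOVE": ["plan_trajectory", "execute_trajectory"],
        "GRASP": ["open_gripper", "approach", "close_gripper"],
        "RELEASE": ["open_gripper", "retract"],
    }
    return [prim for op, _ in ops for prim in table[op]]

command = "move to the cup then grab it"
bites = parse_verbal_bites(command)
primitives = assembly_to_primitives(bites_to_assembly(bites))
```

In a real system each stage would be backed by a speech-recognition front end and a per-platform driver; the point of the layering is that only the last stage (assembly to primitives) needs to change when the robot hardware changes.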
