Alternative Modes of Interaction in Proximal Human-in-the-Loop Operation of Robots

Ambiguity and noise in natural language instructions create a significant barrier to adopting autonomous systems into safety-critical workflows involving humans and machines. In this paper, we propose to build on recent advances in electrophysiological monitoring methods and augmented reality technologies to develop alternative modes of communication between humans and robots involved in large-scale proximal collaborative tasks. We will first introduce augmented reality techniques for projecting a robot's intentions to its human teammate, who can interact with these cues to engage in real-time collaborative plan execution with the robot. We will then look at how electroencephalographic (EEG) feedback can be used to monitor the human's response to discrete events as well as longer-term affective states during plan execution. These signals can be used by a learning agent, i.e. an affective robot, to modify its policy. We will present an end-to-end system capable of demonstrating these modalities of interaction. We hope that the proposed system will inspire research in augmenting human-robot interactions with alternative forms of communication in the interests of safety, productivity, and fluency of teaming, particularly in engineered settings such as the factory floor or the assembly line in the manufacturing industry, where the use of such wearables can be enforced.
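One way to picture the closing idea, EEG feedback driving a learning agent's policy, is as reward shaping: a detected error-related potential (ErrP) after a robot action is treated as a negative reward in an otherwise standard Q-learning update. The Python sketch below is purely illustrative and not drawn from the paper; the AffectiveAgent class, the reward values, and the assumption of a binary single-trial ErrP detector are all hypothetical.

import numpy as np

class AffectiveAgent:
    # Tabular Q-learning agent whose reward comes from EEG feedback:
    # a detected ErrP is read as "the human judged the last action a mistake".
    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = np.zeros((n_states, n_actions))
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def act(self, state):
        # Epsilon-greedy action selection over the current Q-estimates.
        if np.random.rand() < self.epsilon:
            return np.random.randint(self.q.shape[1])
        return int(np.argmax(self.q[state]))

    def update(self, state, action, next_state, errp_detected):
        # Hypothetical reward mapping: ErrP -> penalty, no ErrP -> small bonus.
        reward = -1.0 if errp_detected else 0.1
        td_target = reward + self.gamma * np.max(self.q[next_state])
        self.q[state, action] += self.alpha * (td_target - self.q[state, action])

In a full system, errp_detected would come from a single-trial ErrP classifier running on the EEG stream, and the state would encode the robot's progress through the shared plan; longer-term affective state could modulate the learning rate or exploration rather than the per-step reward.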
