Virtual Reality Study of Human Adaptability in Industrial Human-Robot Collaboration

Modern industrial automation may benefit from humans and robots collaborating in a shared workspace. Although collaborative robots are often designed to be physically safe, the mental and emotional well-being of the humans working with them, as well as the fluency of the collaboration, are rarely considered. This study uses a Pimax 5K+ virtual reality headset to examine human behaviour in a prospective collaborative task in which a human and a robot work simultaneously on the same workpiece. The human's motion and physiological responses were collected from the VR equipment, a wearable Zephyr BioModule sensor, and a subjective questionnaire. The results show that some people adapt easily to the robot and work fluently even as it speeds up, while others fail to keep up with it and abandon any attempt to collaborate. Participants who fail to keep up with the robot can often be detected before they give up. This study demonstrates not only the need to adapt the robot's behaviour (especially its speed) to each worker individually, but also the possibility of using human motion and physiological data to predict which workers will require additional support to improve the collaboration.
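As an illustration of the kind of physiological feature such a prediction could build on, the sketch below computes RMSSD (root mean square of successive differences between RR intervals), a standard short-term heart-rate-variability measure in which lower values are commonly associated with higher acute stress. This is a generic, minimal example, not the study's actual analysis pipeline; the function name and the sample RR values are illustrative.

```python
import math

def rmssd(rr_intervals_ms):
    """RMSSD over a series of RR intervals in milliseconds.

    Computes the root mean square of successive differences, a
    widely used short-term HRV feature for acute stress assessment.
    """
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    # Successive differences between consecutive beat-to-beat intervals.
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A perfectly regular RR series yields RMSSD of 0;
# more beat-to-beat variability yields a larger value.
print(rmssd([800, 810, 790, 805, 795]))
```

In a monitoring setting, a feature like this would be computed over a sliding window of beats and fed, together with motion features, into a classifier that flags workers likely to fall behind the robot.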
