Advancing Anticipatory Behaviors in Dyadic Human-Robot Collaboration: The AnDy project

In the near future, robots collaborating with human operators in industry will need increasing anticipation capabilities to react properly to human actions and collaborate efficiently. To achieve this goal, new technologies are needed that not only estimate human motion, but fully describe the whole-body dynamics of the interaction and predict its outcome. These hardware and software technologies are the goal of the European project AnDy. AnDy leverages existing technologies to endow robots with the ability to control physical collaboration through intentional interaction, in order to maximize ergonomics for the user. To this end, AnDy relies on three technological and scientific breakthroughs. First, AnDy innovates the measurement of human whole-body motion by developing a wearable AnDySuit, which tracks motions and records forces. Second, AnDy develops AnDyModel, which combines ergonomic models with cognitive predictive models of human dynamic behavior in collaborative tasks, learned from data acquired with the AnDySuit. Third, AnDy proposes AnDyControl, an innovative technology for assisting humans through predictive physical control based on AnDyModel. By measuring and modeling human whole-body dynamics, AnDy will provide robots with a new level of awareness of human intentions and ergonomics. By incorporating this awareness on-line in the robot's controllers, AnDy paves the way for novel applications of physical human-robot collaboration in manufacturing, health care, and assisted living. We present the goals and methods of the AnDy project, as well as first-year results. Technical advances on the AnDySuit – using inertial motion capture and sensorized shoes – together with the development of an on-line inverse dynamics software tool now allow real-time monitoring of human dynamics.
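To make the inverse dynamics idea concrete, the sketch below computes joint torques tau = M(q)·q̈ + C(q, q̇)·q̇ + g(q) for a 2-link planar arm. This is a minimal textbook illustration of the technique, not the AnDy software: the model and all parameter values are illustrative assumptions, whereas the project's tool operates on a whole-body human model fed by AnDySuit measurements.

```python
import numpy as np

# Illustrative parameters for a 2-link planar arm (assumed values,
# NOT the AnDy whole-body human model):
M1, M2 = 3.0, 2.0          # link masses [kg]
L1, L2 = 0.4, 0.35         # link lengths [m]
LC1, LC2 = 0.2, 0.175      # distances from joint to center of mass [m]
I1, I2 = 0.05, 0.03        # link inertias about the center of mass [kg m^2]
G = 9.81                   # gravitational acceleration [m/s^2]

def mass_matrix(q):
    """Joint-space inertia matrix M(q) of the 2-link planar arm."""
    c2 = np.cos(q[1])
    m11 = M1*LC1**2 + M2*(L1**2 + LC2**2 + 2*L1*LC2*c2) + I1 + I2
    m12 = M2*(LC2**2 + L1*LC2*c2) + I2
    m22 = M2*LC2**2 + I2
    return np.array([[m11, m12], [m12, m22]])

def coriolis_torque(q, dq):
    """Coriolis/centrifugal torques C(q, dq) @ dq."""
    h = -M2*L1*LC2*np.sin(q[1])
    C = np.array([[h*dq[1], h*(dq[0] + dq[1])],
                  [-h*dq[0], 0.0]])
    return C @ dq

def gravity_torque(q):
    """Gravity torques g(q)."""
    g1 = (M1*LC1 + M2*L1)*G*np.cos(q[0]) + M2*LC2*G*np.cos(q[0] + q[1])
    g2 = M2*LC2*G*np.cos(q[0] + q[1])
    return np.array([g1, g2])

def inverse_dynamics(q, dq, ddq):
    """tau = M(q) ddq + C(q, dq) dq + g(q): torques from measured motion."""
    return mass_matrix(q) @ ddq + coriolis_torque(q, dq) + gravity_torque(q)
```

Run on-line over streamed joint positions, velocities, and accelerations, such a computation yields the joint-torque estimates that real-time monitoring of human dynamics requires; for a static posture (zero velocity and acceleration) the result reduces to the gravity term alone.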
The information thereby provided is used in the controller of a robot physically interacting with the human, so that the robot reactively adapts its movement to the human's movement. Experiments with a real robot have shown promising results. The next step is to couple the robot controller with an automatic ergonomic assessment tool, so that the robot can detect and anticipate critical situations and react in order to optimize the ergonomics of the human's movement.
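One standard way such reactive adaptation is realized in physical human-robot interaction is admittance control, sketched minimally below for a single degree of freedom. This is an illustrative scheme of ours, not the AnDy controller: the measured interaction force f drives a virtual mass-damper, M_d·a + D_d·v = f, and the resulting velocity is sent to the robot as a reference, so the robot yields in proportion to how hard the human pushes. All gains are made-up values.

```python
# Minimal 1-DoF admittance control sketch (illustrative assumptions,
# not the AnDy controller).
M_D = 1.0    # virtual mass [kg]
D_D = 10.0   # virtual damping [N s/m]
DT = 0.01    # control period [s]

def admittance_step(v, f):
    """One control cycle: integrate the virtual dynamics forward by DT."""
    a = (f - D_D * v) / M_D    # virtual acceleration from measured force
    return v + DT * a          # new reference velocity for the robot

def simulate(force, steps):
    """Apply a constant interaction force; return the reference velocity."""
    v = 0.0
    for _ in range(steps):
        v = admittance_step(v, force)
    return v
```

At steady state the reference velocity settles at f / D_d, so a constant 5 N push yields 0.5 m/s with these gains; retuning the virtual mass and damping (for instance from an on-line ergonomic assessment) changes how compliantly the robot gives way.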