Trends and challenges in robot manipulation

Hand it to you

Our ability to grab, hold, and manipulate objects involves our dexterous hands, our sense of touch, and feedback from our eyes and muscles that allows us to maintain a controlled grip. Billard and Kragic review the progress made in robotics to emulate these functions. Systems have developed from simple, pinching grippers operating in a fully defined environment, to robots that can identify, select, and manipulate objects from a random collection. Further developments are emerging from advances in computer vision, computer processing capabilities, and tactile materials that give feedback to the robot.

Science, this issue p. eaat8414

BACKGROUND

Humans have a fantastic ability to manipulate objects of various shapes, sizes, and materials and can control the objects' position in confined spaces with the advanced dexterity capabilities of our hands. Building machines inspired by human hands, with the functionality to autonomously pick up and manipulate objects, has always been an essential component of robotics. The first robot manipulators date back to the 1960s and are some of the first robotic devices ever constructed. In these early days, robotic manipulation consisted of carefully prescribed movement sequences that a robot would execute with no ability to adapt to a changing environment. As time passed, robots gradually gained the ability to automatically generate movement sequences, drawing on artificial intelligence and automated reasoning. Robots would stack boxes according to size, weight, and so forth, extending beyond geometric reasoning. This task also required robots to handle errors and uncertainty in sensing at run time, given that the slightest imprecision in the position and orientation of stacked boxes might cause the entire tower to topple.

Methods from control theory also became instrumental for enabling robots to comply with the environment's natural uncertainty by empowering them to adapt exerted forces upon contact. The ability to stably vary forces upon contact expanded robots' manipulation repertoire to more-complex tasks, such as inserting pegs in holes or hammering. However, none of these actions truly demonstrated fine or in-hand manipulation capabilities, and they were commonly performed using simple two-fingered grippers. To enable multipurpose fine manipulation, roboticists focused their efforts on designing humanlike hands capable of using tools. Wielding a tool in-hand became a problem of its own, and a variety of advanced algorithms were developed to facilitate stable holding of objects and provide optimality guarantees. Because optimality was difficult to achieve in a stochastic environment, from the 1990s onward researchers aimed to increase the robustness of object manipulation at all levels. These efforts initiated the design of sensors and hardware for improved control of hand–object contacts. Studies that followed were focused on robust perception for coping with object occlusion and noisy measurements, as well as on adaptive control approaches to infer an object's physical properties, so as to handle objects whose properties are unknown or change as a result of manipulation.
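The force adaptation upon contact described above is commonly realized with an impedance-style control law that makes the end effector behave like a virtual spring–damper around a reference pose, so that unexpected contact deflects the arm rather than driving forces upward. The following is a minimal sketch of that idea in Python with NumPy; it is not code from the review, and the ImpedanceController class, its gains, and the example numbers are hypothetical choices for illustration.

```python
import numpy as np

class ImpedanceController:
    """Minimal Cartesian impedance law: F = K (x_ref - x) + D (v_ref - v).

    The commanded force behaves like a spring-damper anchored at the
    reference pose, so unexpected contact deflects the end effector
    instead of producing ever-growing forces. Gains are illustrative.
    """

    def __init__(self, stiffness, damping):
        self.K = np.diag(stiffness)   # N/m per Cartesian axis
        self.D = np.diag(damping)     # N*s/m per Cartesian axis

    def command(self, x_ref, v_ref, x, v, f_ff=None):
        """Return the force to apply at the end effector.

        x_ref, v_ref : desired position and velocity (3-vectors)
        x, v         : measured position and velocity (3-vectors)
        f_ff         : optional feedforward term, e.g. a desired contact force
        """
        f = self.K @ (x_ref - x) + self.D @ (v_ref - v)
        if f_ff is not None:
            f = f + f_ff
        return f

# Example: soft lateral axes for compliance, a stiffer vertical axis for pressing.
ctrl = ImpedanceController(stiffness=[200.0, 200.0, 800.0],
                           damping=[30.0, 30.0, 60.0])
force = ctrl.command(x_ref=np.array([0.40, 0.00, 0.10]),
                     v_ref=np.zeros(3),
                     x=np.array([0.41, 0.00, 0.09]),
                     v=np.zeros(3))
print(force)   # small restoring force toward the reference pose
```

Choosing a low stiffness along the axes where contact is expected (here the lateral axes) lets the hand yield during insertion-style tasks such as peg-in-hole, while the stiffer vertical axis still tracks the commanded motion.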
ADVANCES

Roboticists are still working to develop robots capable of sorting and packaging objects, chopping vegetables, and folding clothes in unstructured and dynamic environments. Robots used for modern manufacturing have accomplished some of these tasks in structured settings that still require fences between the robots and human operators to ensure safety. Ideally, robots should be able to work side by side with humans, offering their strength to carry heavy loads while presenting no danger. Over the past decade, robots have gained new levels of dexterity. This enhancement is due to breakthroughs in mechanics, with sensors for perceiving touch along a robot's body and new mechanisms for soft actuation that offer natural compliance. Most notably, this development leverages the immense progress in machine learning to encapsulate models of uncertainty and support further advances in adaptive and robust control.

Learning to manipulate in real-world settings is costly in terms of both time and hardware. To further develop data-driven methods without generating examples on real, physical systems, many researchers use simulation environments. Still, grasping and dexterous manipulation require a level of physical realism that existing simulators are not yet able to deliver, for example when modeling contacts with soft and deformable objects. Two roads are hence pursued. The first draws inspiration from the way humans acquire interaction skills and prompts robots to learn skills from observing humans performing complex manipulation. This allows robots to acquire manipulation capabilities in only a few trials, although generalizing the acquired knowledge to actions that differ from those previously demonstrated remains difficult. The second road constructs databases of real object manipulation, with the goal of better informing the simulators and generating examples that are as realistic as possible. Yet achieving realistic simulation of friction, material deformation, and other physical properties may not be possible anytime soon, and real experimental evaluation will remain unavoidable for learning to manipulate highly deformable objects.
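One concrete instance of the first road, learning from human demonstration, is to encode a recorded trajectory as a dynamic movement primitive (DMP): a spring–damper system plus a learned forcing term that can be replayed toward new goals. The sketch below is a minimal one-dimensional version in Python with NumPy; it is not code from the review, and the function names, basis-function heuristics, and parameter values are assumptions made for illustration.

```python
import numpy as np

def fit_dmp(y_demo, dt, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=4.0):
    """Fit a 1-D dynamic movement primitive to one demonstrated trajectory.

    alpha_z, beta_z shape the spring-damper transformation system; alpha_x
    drives the phase variable x from 1 toward 0 over the demo duration.
    """
    T = len(y_demo)
    tau = (T - 1) * dt                       # movement duration
    yd = np.gradient(y_demo, dt)             # demonstrated velocity
    ydd = np.gradient(yd, dt)                # demonstrated acceleration
    y0, g = y_demo[0], y_demo[-1]

    t = np.arange(T) * dt
    x = np.exp(-alpha_x * t / tau)           # phase variable, 1 -> ~0
    # Forcing term implied by the demo, given the spring-damper dynamics.
    f_target = tau**2 * ydd - alpha_z * (beta_z * (g - y_demo) - tau * yd)

    # Gaussian basis functions spaced along the phase variable.
    centers = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
    widths = 1.0 / (np.diff(centers, append=centers[-1] / 2) ** 2 + 1e-8)
    psi = np.exp(-widths * (x[:, None] - centers) ** 2)       # (T, n_basis)

    xi = x * (g - y0 + 1e-8)                 # scaling of the forcing term
    # Locally weighted regression for each basis weight.
    w = (psi * (xi * f_target)[:, None]).sum(0) / ((psi * (xi**2)[:, None]).sum(0) + 1e-8)
    return dict(w=w, centers=centers, widths=widths, y0=y0, g=g,
                tau=tau, alpha_z=alpha_z, beta_z=beta_z, alpha_x=alpha_x)

def rollout_dmp(p, dt, new_goal=None, n_steps=None):
    """Reproduce the learned motion, optionally toward a new goal."""
    g = p["g"] if new_goal is None else new_goal
    n_steps = n_steps or int(p["tau"] / dt) + 1
    y, v, x = p["y0"], 0.0, 1.0
    traj = []
    for _ in range(n_steps):
        psi = np.exp(-p["widths"] * (x - p["centers"]) ** 2)
        f = psi @ p["w"] / (psi.sum() + 1e-8) * x * (g - p["y0"])
        vd = (p["alpha_z"] * (p["beta_z"] * (g - y) - v) + f) / p["tau"]
        v += vd * dt
        y += v / p["tau"] * dt
        x += -p["alpha_x"] * x / p["tau"] * dt
        traj.append(y)
    return np.array(traj)

# Demonstration: a smooth reach from 0 to 0.3 m; reproduce it toward 0.5 m.
dt = 0.01
demo = 0.3 * (1 - np.cos(np.pi * np.linspace(0, 1, 200))) / 2
params = fit_dmp(demo, dt)
print(rollout_dmp(params, dt, new_goal=0.5)[-1])   # ends near the new goal
```

Because the forcing term is scaled by the phase variable and fades as the movement completes, the reproduction still settles at the (possibly new) goal even though only a single demonstration was used; generalizing to motions that differ substantially from the demonstration, however, remains difficult, as noted above.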
OUTLOOK

Despite many years of software and hardware development, achieving dexterous manipulation capabilities in robots remains an open problem, albeit an interesting one, given that it necessitates improved understanding of human grasping and manipulation techniques. We build robots to automate tasks but also to provide tools that let humans easily perform repetitive and dangerous tasks without coming to harm. Achieving robust and flexible collaboration between humans and robots is hence the next major challenge. Fences that currently separate humans from robots will gradually disappear, and robots will start manipulating objects jointly with humans. To achieve this objective, robots must become smooth and trustworthy partners that interpret humans' intentions and respond accordingly. Furthermore, robots must acquire a better understanding of how humans interact and must attain real-time adaptation capabilities. There is also a need to develop robots that are safe by design, with an emphasis on soft and lightweight structures as well as control and planning methodologies based on multisensory feedback.

Figure: Holding two objects in one hand requires dexterity. Whereas a human can grab multiple objects at the same time (top), a robot (bottom) cannot yet achieve such dexterity. In this example, a human has placed the objects in the robot's hand. Photos: Learning Algorithms and Systems Laboratory, EPFL.

ABSTRACT

Dexterous manipulation is one of the primary goals in robotics. Robots with this capability could sort and package objects, chop vegetables, and fold clothes. As robots come to work side by side with humans, they must also become human-aware. Over the past decade, research has made strides toward these goals. Progress has come from advances in visual and haptic perception and in mechanics, in the form of soft actuators that offer natural compliance. Most notably, immense progress in machine learning has been leveraged to encapsulate models of uncertainty and to support improvements in adaptive and robust control. Open questions remain in terms of how to enable robots to deal with the most unpredictable agent of all: the human.
