Towards a Cooperative Robotic System for Autonomous Pipe Cutting in Nuclear Decommissioning

A mobile camera is used to support an assisted-teleoperation pipe-cutting system for nuclear decommissioning. The base system consists of dual manipulators with a single mounted Kinect camera. The user selects the object from an on-screen image, whilst the computer control system automatically grasps the pipe with one end-effector and positions the second for cutting. However, the system fails in some cases because of data limitations, for example a partially obscured pipe in a challenging decommissioning scenario (simulated in the laboratory). Hence, the present article develops a new method to extend the range of use cases by introducing mobile cameras, e.g. mounted on a drone. This is a non-trivial problem, and SLAM and ArUco fiducials are introduced to locate the cameras, with a novel error-correction method proposed for finding the ArUco markers. Preliminary results demonstrate the validity of the approach, but improvements will be required for robust autonomous cutting. To reduce the pipe position estimation errors, suggestions are therefore made for various algorithmic and hardware refinements.
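For context, the following is a minimal, illustrative sketch (not the authors' implementation) of how a fiducial-based camera localisation step of this kind can be realised with OpenCV's aruco module. The camera intrinsics, marker dictionary and marker size are assumed values chosen for illustration only; the detected marker corners are passed to a planar PnP solver and the resulting transform is inverted to give the mobile camera's pose in the marker frame.

```python
# Minimal sketch of ArUco-based camera localisation with OpenCV (assumed setup).
import cv2
import numpy as np

# Assumed intrinsics; in practice these come from a standard checkerboard
# calibration of the mobile camera.
camera_matrix = np.array([[615.0,   0.0, 320.0],
                          [  0.0, 615.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)

MARKER_SIDE = 0.05  # marker edge length in metres (assumed)

# 3D corners of a single marker in its own frame, ordered to match the
# detector output (top-left, top-right, bottom-right, bottom-left).
obj_points = np.array([[-MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
                       [ MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
                       [ MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
                       [-MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0]],
                      dtype=np.float32)

def locate_camera(frame):
    """Return the camera pose (R, t) relative to the first detected marker,
    or None if no marker is visible (e.g. the marker is occluded)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None
    # Marker pose in the camera frame via planar PnP on the four corners.
    image_points = corners[0].reshape(4, 1, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_points, image_points,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    # Invert the transform to obtain the camera pose in the marker frame.
    R, _ = cv2.Rodrigues(rvec)
    return R.T, -R.T @ tvec
```

In a complete system, marker-derived poses of this kind would be combined with the SLAM estimate of the mobile camera to register its view with the manipulator workspace, which is where the reported error-correction step becomes relevant.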
