FLAP for CAOS: Forward-Looking Active Perception for Clutter-Aware Object Search

Abstract: In this paper, we present a system for autonomous object search and exploration in cluttered environments. The system shortens the average time needed to complete search tasks by continually planning multiple perception actions ahead using probabilistic prior knowledge. Useful sensing actions are found with a frontier-based view-sampling technique in a continuously updated 3D map. We demonstrate the system on real hardware, investigate the planner's performance in three simulation experiments, and show that our approach achieves shorter overall search-task run times than a greedy strategy.
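The abstract's core loop can be illustrated with a minimal sketch: find frontier voxels (known-free voxels bordering unknown space), sample candidate views around them, score each view by the unknown volume it would reveal weighted by a prior over the target's location, and chain several views into a forward-looking plan. This is a hypothetical illustration, not the authors' implementation: it assumes a dense voxel grid rather than a probabilistic octree map, a fixed-radius sensing model, and illustrative names such as `find_frontiers`, `score_view`, and `plan_views`.

```python
import numpy as np

# Voxel states in a toy dense 3D grid (a real system would use a
# probabilistic octree map; this is a simplified stand-in).
UNKNOWN, FREE, OCCUPIED = 0, 1, 2

def find_frontiers(grid):
    """Return FREE voxels that touch at least one UNKNOWN voxel."""
    frontiers = []
    dims = grid.shape
    for x, y, z in np.argwhere(grid == FREE):
        for dx, dy, dz in [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]:
            nx, ny, nz = x + dx, y + dy, z + dz
            if (0 <= nx < dims[0] and 0 <= ny < dims[1] and 0 <= nz < dims[2]
                    and grid[nx, ny, nz] == UNKNOWN):
                frontiers.append((x, y, z))
                break
    return frontiers

def score_view(view, grid, prior, r=3):
    """Expected utility of a view: unknown volume within a fixed sensing
    radius, weighted by the prior probability of the target being there."""
    x, y, z = view
    xs, ys, zs = (slice(max(0, x - r), x + r + 1),
                  slice(max(0, y - r), y + r + 1),
                  slice(max(0, z - r), z + r + 1))
    unknown_mask = grid[xs, ys, zs] == UNKNOWN
    return float((prior[xs, ys, zs] * unknown_mask).sum())

def plan_views(grid, prior, horizon=3, samples=20, r=3,
               rng=np.random.default_rng(0)):
    """Chain `horizon` views sampled from the frontier set, marking each
    selected view's neighbourhood as observed before choosing the next
    one (a simple forward-looking roll-out rather than a greedy one-step plan)."""
    grid = grid.copy()
    plan = []
    for _ in range(horizon):
        frontiers = find_frontiers(grid)
        if not frontiers:
            break
        picks = rng.choice(len(frontiers),
                           size=min(samples, len(frontiers)), replace=False)
        candidates = [frontiers[i] for i in picks]
        best = max(candidates, key=lambda v: score_view(v, grid, prior, r))
        plan.append(best)
        # Pretend the view has been taken: mark its neighbourhood FREE so
        # the next view is planned against the predicted map state.
        x, y, z = best
        grid[max(0, x - r):x + r + 1,
             max(0, y - r):y + r + 1,
             max(0, z - r):z + r + 1] = FREE
    return plan

if __name__ == "__main__":
    grid = np.full((20, 20, 10), UNKNOWN, dtype=np.int8)
    grid[:10, :, :] = FREE                    # already-explored half of the room
    prior = np.ones(grid.shape) / grid.size   # uniform prior over target location
    print(plan_views(grid, prior))
```

In the usage example at the bottom, half of a small room is already explored with a uniform target prior, so the planner returns a short sequence of viewpoints along the frontier between free and unknown space.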
