Toward the automatic detection of access holes in disaster rubble

The collapse of buildings and other structures in heavily populated areas often results in multiple human victims becoming trapped within the resulting rubble. This rubble is often unstable, difficult to traverse, and dangerous for the first responders tasked with finding and extricating victims through access holes in the rubble. Recent work in scene mapping and reconstruction using RGB-D data collected by unmanned aerial vehicles (UAVs) suggests the possibility of automatically identifying potential access holes into the interior of rubble. This capability would allow critical, limited search capacity to be concentrated in areas where potential access holes can be verified as useful entry points. In this paper, we present a system that automatically identifies access holes in rubble. Our investigation begins by defining a hole in terms of its functionality as a potential means of accessing the interior of rubble. From this definition, we propose a set of discriminative geometric and photometric features for detecting “access holes”. We conducted experiments on RGB-D data collected with a UAV over several disaster training facilities. Our empirical evaluation indicates the potential of the proposed approach for successfully identifying access holes in disaster rubble scenes.
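
As a rough illustration of how geometric and photometric cues from a single RGB-D frame might be combined to flag candidate access holes, consider the minimal Python sketch below. It is not the feature set or classifier proposed in the paper: the function name candidate_hole_regions, the global-median surface estimate, the depth-jump and darkness thresholds, and the minimum-area filter are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def candidate_hole_regions(depth, rgb, depth_jump=0.5, min_area=200):
    """Label candidate 'access hole' regions in a single RGB-D frame.

    Illustrative heuristic only: a pixel is a candidate if it lies well
    below the dominant rubble surface (geometric cue) and appears dark
    in the colour image (photometric cue).
    """
    # Crude surface estimate: the median depth of the frame. A real system
    # would fit a local surface model or use superpixel-level statistics.
    surface = np.median(depth)

    deep = (depth - surface) > depth_jump        # geometric cue: depth drop
    dark = rgb.mean(axis=2) / 255.0 < 0.25       # photometric cue: low intensity
    candidates = deep & dark

    # Group candidate pixels into connected regions and drop tiny ones.
    labels, n = ndimage.label(candidates)
    sizes = ndimage.sum(candidates, labels, range(1, n + 1))
    for region_id, size in zip(range(1, n + 1), sizes):
        if size < min_area:
            labels[labels == region_id] = 0
    return labels

# Usage with synthetic data: a flat surface containing one deep, dark cavity.
depth = np.ones((240, 320), dtype=np.float32)      # depths in metres
rgb = np.full((240, 320, 3), 180, dtype=np.uint8)  # bright rubble surface
depth[100:160, 120:200] += 1.2                     # cavity ~1.2 m deeper
rgb[100:160, 120:200] = 20                         # and much darker
regions = candidate_hole_regions(depth, rgb)
print("candidate hole pixels:", int((regions > 0).sum()))
```

In practice, regions produced by a cue-based pass like this would only be candidates; verifying that a candidate is a usable entry point is exactly the discrimination problem the proposed features are meant to address.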
