Automated sensory data alignment for environmental and epidermal change monitoring

In this paper we present research adapting a state-of-the-art, condition-invariant robotic place recognition algorithm to the task of automated inter- and intra-image alignment of sensor observations of environmental and skin change over time. Our approach inverts the typical criteria placed on navigation algorithms in robotics: rather than attempting to fix the limited camera-viewpoint invariance of such algorithms, we exploit it, showing that approximate viewpoint repetition is realistic in a wide range of environments and medical applications. We demonstrate the algorithm automatically aligning challenging visual data from a range of real-world applications: ecological monitoring of environmental change; aerial observation of natural disasters including flooding, tsunamis, and bushfires; and tracking wound recovery and sun damage over time. We also present a prototype active guidance system for enforcing viewpoint repetition. We hope to provide an interesting case study of how traditional research criteria in robotics can be inverted to yield useful outcomes in applied settings.
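The condition-invariant matching at the heart of this class of place recognition algorithm typically compares heavily downsampled, locally contrast-normalized frames rather than sparse features. The sketch below is a minimal, hypothetical illustration of that idea (patch normalization followed by a mean absolute difference), not the authors' implementation; the function names and the patch size are assumptions for illustration.

```python
import numpy as np

def patch_normalize(img, patch=8):
    """Locally contrast-normalize an image, patch by patch.

    Each non-overlapping patch is shifted to zero mean and scaled to
    unit variance, discarding global illumination and contrast so that
    two frames of the same scene under different conditions compare well.
    """
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            p = img[y:y + patch, x:x + patch].astype(float)
            std = p.std()
            out[y:y + patch, x:x + patch] = (p - p.mean()) / (std if std > 0 else 1.0)
    return out

def frame_difference(a, b, patch=8):
    """Mean absolute difference between two patch-normalized frames.

    Lower values indicate a better match; a query frame can be aligned
    against a set of candidates by taking the argmin of this score.
    """
    return float(np.mean(np.abs(patch_normalize(a, patch) - patch_normalize(b, patch))))
```

Because the normalization is purely local, a global brightness or contrast shift between two observations of the same viewpoint leaves the score essentially unchanged, which is the property being exploited for alignment across changing conditions.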
