Analysis of the Impact of Google Maps’ Level on Object Detection

Remote sensing images are provided at different levels corresponding to different spatial resolutions, which strongly affects object detection performance. This paper quantitatively analyses the impact of the Google Maps level on object detection, taking the transmission tower as an example. An object area proportion (OAP) index is defined to help select the data used for rapid detection, making it possible to obtain optimal results under given speed and accuracy requirements when observing a specific object in remote sensing images.
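To illustrate how the map level drives the OAP, the sketch below assumes OAP is the ratio of an object's pixel area to the image's pixel area; the paper's exact definition is not given in the abstract, so this formulation, the tower footprint, the crop size, and the latitude are illustrative assumptions. It uses the standard Web Mercator ground-resolution formula for Google Maps zoom levels.

```python
import math

def ground_resolution(latitude_deg: float, zoom: int) -> float:
    """Web Mercator ground resolution (metres per pixel) at a given latitude and zoom level.

    156543.03392 m/px is the resolution of a single 256-px tile spanning the equator at zoom 0.
    """
    return 156543.03392 * math.cos(math.radians(latitude_deg)) / (2 ** zoom)

def object_area_proportion(object_size_m: tuple, image_size_px: tuple,
                           latitude_deg: float, zoom: int) -> float:
    """Assumed OAP: object pixel area divided by image pixel area (hypothetical definition)."""
    res = ground_resolution(latitude_deg, zoom)              # metres per pixel
    obj_px_area = (object_size_m[0] / res) * (object_size_m[1] / res)
    img_px_area = image_size_px[0] * image_size_px[1]
    return obj_px_area / img_px_area

# Example: a roughly 10 m x 10 m transmission-tower footprint in a 512x512 crop (illustrative values).
for z in range(15, 21):
    oap = object_area_proportion((10.0, 10.0), (512, 512), latitude_deg=40.0, zoom=z)
    print(f"zoom {z}: OAP = {oap:.5f}")
```

Because the ground resolution halves with each additional zoom level, the assumed OAP grows roughly fourfold per level, which is why the choice of Google Maps level has such a strong effect on how prominent a fixed-size object appears to the detector.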
