A Semisupervised Siamese Network for Efficient Change Detection in Heterogeneous Remote Sensing Images

Change detection in heterogeneous remote sensing images is crucial in emergency scenarios such as disaster assessment. Existing methods based on homogeneous transformation suffer from high computational cost, which makes change detection time-consuming. To solve this problem, this article presents a new semisupervised Siamese network (S3N) based on transfer learning. In the proposed S3N, the low-level and deep-level features are separated and treated differently for transfer learning. By incorporating two identical subnetworks that are both pretrained on natural images, the proposed S3N eliminates the computational cost of learning low-level features, which are common to both remote sensing images and natural images. Because deep-level features carry different semantics in remote sensing images than in natural images, a novel transfer learning strategy is presented that trains only the weights of the deep-level layers in the proposed S3N. The reduced number of trainable parameters lowers the demand for training samples, leading to a significant decrease in computational cost. Afterward, Otsu's thresholding method is applied to the difference map derived by the proposed S3N to obtain the final binary change map. Three data sets containing different types of heterogeneous remote sensing images are employed to evaluate the performance of the proposed S3N. The experimental results demonstrate that the proposed S3N achieves detection performance comparable to state-of-the-art change detection algorithms at a much lower computational cost.
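The following is a minimal sketch, not the authors' implementation, of the idea summarized above: two weight-sharing subnetworks pretrained on natural images, with the early (low-level) layers frozen and only the deeper layers left trainable, followed by Otsu thresholding of the feature-difference map. The backbone choice (VGG-16), the layer split point, and the distance measure are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models
from skimage.filters import threshold_otsu


class SiameseBranch(nn.Module):
    """One branch of the Siamese network; both branches share these weights."""

    def __init__(self):
        super().__init__()
        # ImageNet-pretrained backbone (VGG-16 here is an assumption, not the paper's choice)
        backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features
        self.low = backbone[:10]    # low-level layers: frozen, transferred as-is
        self.deep = backbone[10:]   # deep-level layers: the only trainable part
        for p in self.low.parameters():
            p.requires_grad = False  # no gradients -> far fewer parameters to train

    def forward(self, x):
        return self.deep(self.low(x))


class S3NSketch(nn.Module):
    """Weight-sharing Siamese network producing a per-pixel difference map."""

    def __init__(self):
        super().__init__()
        self.branch = SiameseBranch()  # same module applied to both input images

    def forward(self, img_t1, img_t2):
        f1, f2 = self.branch(img_t1), self.branch(img_t2)
        # Euclidean distance between deep features as the change magnitude
        diff = torch.linalg.vector_norm(f1 - f2, dim=1, keepdim=True)
        # Upsample the difference map back to the input resolution
        diff = nn.functional.interpolate(
            diff, size=img_t1.shape[-2:], mode="bilinear", align_corners=False
        )
        return diff.squeeze(1)


def binary_change_map(model, img_t1, img_t2):
    """Threshold the difference map with Otsu's method to get a binary change map."""
    model.eval()
    with torch.no_grad():
        diff = model(img_t1, img_t2).squeeze(0).cpu().numpy()
    return diff > threshold_otsu(diff)
```

In this sketch, only `self.deep` contributes trainable parameters, which mirrors the abstract's claim that restricting training to the deep-level layers reduces both the number of required training samples and the overall computational cost.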