Bipartite Graph Attention Autoencoders for Unsupervised Change Detection Using VHR Remote Sensing Images
Detecting land cover change is an essential task in very-high-spatial-resolution (VHR) remote sensing applications. However, because VHR images capture the fine details of ground objects, the scenes they depict are usually complex: for example, the same object can show distinct appearances or features across images because of noise, climate conditions, imaging angles, and other factors. To address this issue, this article proposes a novel unsupervised approach named bipartite graph attention autoencoders (BGAAEs) for VHR image change detection. BGAAE builds on dual convolutional autoencoders arranged in an image-translation architecture and equips the encoder layers with a graph attention mechanism (GAM). To generate an effective difference image, the model is trained with two loss terms in addition to the reconstruction loss: a domain correlation loss and a semantic consistency loss. The domain correlation loss is defined on the encoder layers and enforces spatial alignment of the deep feature representations of unchanged objects while mitigating the influence of changed pixels on the learning objective. The semantic consistency loss ensures that the semantic features of the bitemporal images remain consistent after transcoding, allowing for more flexible transformations. The experimental results on four VHR image datasets demonstrate the superiority of the proposed method.
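For readers who want a concrete picture of how the pieces fit together, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a simple attention layer over pixel features as a stand-in for the paper's graph attention mechanism, dual autoencoders for the two acquisition dates, and one possible way to realize the reconstruction, domain correlation, and semantic consistency terms described above. All class, function, and variable names (e.g. `GraphAttention`, `bgaae_losses`, `change_prob`) are hypothetical.

```python
# Illustrative sketch only; details such as layer depths, the exact graph
# construction, and loss weighting differ from the published BGAAE.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttention(nn.Module):
    """Toy attention over pixel features of one image (stand-in for the GAM)."""
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Conv2d(dim, dim, 1)
        self.key = nn.Conv2d(dim, dim, 1)
        self.value = nn.Conv2d(dim, dim, 1)

    def forward(self, x):                       # x: (B, C, H, W)
        b, c, h, w = x.shape
        q = self.query(x).flatten(2)             # (B, C, HW)
        k = self.key(x).flatten(2)
        v = self.value(x).flatten(2)
        attn = torch.softmax(q.transpose(1, 2) @ k / c ** 0.5, dim=-1)  # (B, HW, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return x + out                            # residual connection

class Autoencoder(nn.Module):
    """One branch of the dual-autoencoder image-translation architecture."""
    def __init__(self, in_ch, feat=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat, 3, padding=1), nn.ReLU(),
            GraphAttention(feat),
        )
        self.decoder = nn.Conv2d(feat, in_ch, 3, padding=1)

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def bgaae_losses(ae1, ae2, x1, x2, change_prob):
    """Reconstruction + domain correlation + semantic consistency (one assumed form).

    change_prob: (B, 1, H, W) soft estimate of changed pixels, used here to
    down-weight changed regions in the feature-alignment term (an assumption
    about how the influence of pixel changes could be mitigated).
    """
    z1, rec1 = ae1(x1)
    z2, rec2 = ae2(x2)
    w = 1.0 - change_prob                                    # emphasize unchanged pixels
    loss_rec = F.mse_loss(rec1, x1) + F.mse_loss(rec2, x2)   # reconstruction loss
    loss_dom = (w * (z1 - z2) ** 2).mean()                   # align features of unchanged objects
    # Transcode each image into the other domain, re-encode it, and ask the
    # semantic (latent) features to stay consistent with the original content.
    x1_to_2 = ae2.decoder(z1)
    x2_to_1 = ae1.decoder(z2)
    loss_sem = F.mse_loss(ae2.encoder(x1_to_2), z1.detach()) + \
               F.mse_loss(ae1.encoder(x2_to_1), z2.detach())
    return loss_rec + loss_dom + loss_sem
```

In this sketch the difference image would be derived from the discrepancy between an image and its transcoded counterpart (or between the aligned latent features), with changed pixels standing out because only unchanged regions are encouraged to align; the specific post-processing used by the paper is not reproduced here.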