Class Prior-Free Positive-Unlabeled Learning with Taylor Variational Loss for Hyperspectral Remote Sensing Imagery

Positive-unlabeled (PU) learning for hyperspectral remote sensing imagery (HSI) aims to learn a binary classifier from positive and unlabeled data, and has broad prospects in a variety of Earth-vision applications. However, when PU learning is applied to HSI with limited labels, the unlabeled data may dominate the optimization process, causing the neural network to overfit the unlabeled data. In this paper, a Taylor variational loss is proposed for HSI PU learning, which reduces the weight of the gradient of the unlabeled data via a Taylor series expansion, enabling the network to find a balance between overfitting and underfitting. In addition, a self-calibrated optimization strategy is designed to stabilize the training process. Experiments on 7 benchmark datasets (21 tasks in total) validate the effectiveness of the proposed method. Code is available at: https://github.com/Hengwei-Zhao96/T-HOneCls.
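The abstract describes the idea only at a high level. Below is a minimal, hypothetical PyTorch sketch of how a variational PU objective (in the spirit of the variational approach of Chen et al., NeurIPS 2019) can be "Taylor-ized": the logarithm acting on the unlabeled term is replaced by a truncated Taylor expansion around 1, which bounds the gradient contributed by the unlabeled data. The expansion order, the clamping constant, and the exact placement of the expansion are illustrative assumptions rather than the authors' exact formulation; the reference implementation is in the linked T-HOneCls repository.

```python
import torch


def taylor_log(z: torch.Tensor, order: int = 2) -> torch.Tensor:
    """Truncated Taylor series of log(z) around z = 1:
        log(z) ~= -sum_{k=1..order} (1 - z)^k / k.
    Unlike log, the truncated series has a bounded derivative as z -> 0,
    so samples with very small predicted scores contribute bounded gradients."""
    approx = torch.zeros_like(z)
    for k in range(1, order + 1):
        approx = approx - (1.0 - z) ** k / k
    return approx


def taylor_variational_pu_loss(scores_pos: torch.Tensor,
                               scores_unl: torch.Tensor,
                               order: int = 2) -> torch.Tensor:
    """Illustrative class-prior-free variational PU objective,
        L = log E_U[f(x)] - E_P[log f(x)],
    with the log on the unlabeled side replaced by its truncated Taylor
    expansion (assumption: this is where the expansion is applied)."""
    unl_term = taylor_log(scores_unl.mean(), order=order)
    pos_term = torch.log(scores_pos.clamp_min(1e-8)).mean()
    return unl_term - pos_term


if __name__ == "__main__":
    # Toy usage: f(x) in (0, 1) from a sigmoid head on positive / unlabeled pixels.
    f_pos = torch.sigmoid(torch.randn(32, requires_grad=True))
    f_unl = torch.sigmoid(torch.randn(4096, requires_grad=True))
    loss = taylor_variational_pu_loss(f_pos, f_unl, order=2)
    loss.backward()
    print(float(loss))
```

The intuition behind this sketch is that d/dz log(z) = 1/z explodes as the average unlabeled score approaches zero, letting the far more numerous unlabeled pixels dominate the gradients; the derivative of the truncated expansion is bounded by the expansion order, which keeps the positive and unlabeled contributions in balance.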
