Multispectral Person Re-Identification Using GAN for Color-to-Thermal Image Translation
Abstract We propose a ThermalGAN framework for cross-modality color-thermal person re-identification (ReID). We use a stack of generative adversarial networks (GANs) to translate a single color probe image to a multimodal thermal probe set. We use thermal histograms and feature descriptors as a thermal signature. We collected a large-scale multispectral ThermalWorld dataset for extensive training of our GAN model. In total, the dataset includes 20,216 color-thermal image pairs, 516 person IDs, and ground-truth pixel-level object annotations. We made the dataset freely available (see http://www.zefirus.org/ThermalGAN/). We evaluate our framework on the ThermalWorld dataset and show that it delivers robust matching that competes with and surpasses the state of the art in cross-modality color-thermal ReID.
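The sketch below is a minimal, illustrative rendition of the matching pipeline the abstract describes (translate a color probe to the thermal domain, compute a thermal-histogram signature, and rank a thermal gallery by signature distance). It is not the authors' implementation: the `color_to_thermal` stand-in, the 64-bin histogram signature, and the L1 ranking are assumptions made for illustration only.

```python
# Minimal sketch (not the authors' code) of cross-modality ReID matching:
# color probe -> thermal translation -> thermal-histogram signature ->
# nearest-signature search over a thermal gallery.

import numpy as np


def color_to_thermal(color_img: np.ndarray) -> np.ndarray:
    """Placeholder for the GAN color-to-thermal translation step.
    Here we simply use the mean over color channels as a stand-in."""
    return color_img.mean(axis=2)


def thermal_signature(thermal_img: np.ndarray, bins: int = 64) -> np.ndarray:
    """Simplified thermal signature: a normalized intensity histogram
    (the paper additionally uses feature descriptors)."""
    hist, _ = np.histogram(thermal_img, bins=bins, range=(0.0, 255.0), density=True)
    return hist


def match_probe(color_probe: np.ndarray, thermal_gallery: list) -> int:
    """Return the index of the gallery image whose signature is closest
    (L1 distance) to the signature of the translated probe."""
    probe_sig = thermal_signature(color_to_thermal(color_probe))
    dists = [np.abs(probe_sig - thermal_signature(g)).sum() for g in thermal_gallery]
    return int(np.argmin(dists))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probe = rng.integers(0, 256, size=(128, 64, 3)).astype(np.float32)
    gallery = [rng.integers(0, 256, size=(128, 64)).astype(np.float32) for _ in range(5)]
    print("Best gallery match index:", match_probe(probe, gallery))
```

In the full framework, the translation step would be a trained GAN generator producing a multimodal thermal probe set rather than a single image, and matching would aggregate distances over that set; the sketch keeps only the overall flow.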