Wavelength-independent texture for multispectral scene simulation

In recent years there has been a growing need for accurate, high-fidelity scene simulations in the visible, infrared, microwave, and other wavelength bands. Based on a rigorous material classification and incorporating material attribute information, we generate wavelength-independent texture maps for multispectral scene simulation. We calculate the sensor radiance value of every pixel and convert it into a color or gray value. If a single pixel in the texture contains more than one material, we mix them according to their radiative contributions. Exploiting region consistency and coherence across scan lines, an extended seed-filling algorithm is applied to areas with the same or similar materials. These steps are repeated until a satisfactory classification and mixture is found and the texture maps for a given waveband are obtained. In this way we generate infrared textures from visible maps, and simulation scene textures at different times of day and under different environmental conditions can also be obtained. Finally, we give several examples of multispectral scene simulation whose results agree well with measured images.
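The following is a minimal sketch of the two core per-pixel operations described above, assuming a simple linear mixing model and a hypothetical per-material radiance table (the names `material_radiance`, `pixel_radiance`, and `radiance_to_gray` are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical per-material in-band radiance values for one waveband;
# in practice these come from the material attribute database.
material_radiance = {0: 4.2, 1: 7.8, 2: 12.5}   # e.g. soil, vegetation, metal roof

def pixel_radiance(fractions):
    """Mix the radiance of all materials present in a pixel, weighted by
    their area fractions (a simple linear mixing assumption)."""
    return sum(material_radiance[m] * f for m, f in fractions.items())

def radiance_to_gray(radiance_map, lo=None, hi=None):
    """Linearly map a radiance image to 8-bit gray levels for the texture."""
    lo = radiance_map.min() if lo is None else lo
    hi = radiance_map.max() if hi is None else hi
    scaled = (radiance_map - lo) / max(hi - lo, 1e-12)
    return np.clip(scaled * 255.0, 0, 255).astype(np.uint8)

# Example: a 2x2 texture in which one pixel contains a 60/40 material mixture.
fractions_per_pixel = [
    [{0: 1.0},         {1: 1.0}],
    [{0: 0.6, 1: 0.4}, {2: 1.0}],
]
radiance = np.array([[pixel_radiance(f) for f in row] for row in fractions_per_pixel])
gray_texture = radiance_to_gray(radiance)
print(gray_texture)
```

A scan-line seed fill over same-material regions, as referenced in the abstract, might look roughly as follows; this is an assumption about the general form of such an algorithm, not the paper's specific extension:

```python
def seed_fill_same_material(material_map, seed, region_id, region_map):
    """Scan-line seed fill: grow a region of identical material labels
    from a seed pixel, exploiting coherence across adjacent scan lines."""
    h, w = material_map.shape
    target = material_map[seed]
    stack = [seed]
    while stack:
        r, c = stack.pop()
        if region_map[r, c] != 0 or material_map[r, c] != target:
            continue
        # Expand left and right along the current scan line.
        left = c
        while left > 0 and region_map[r, left - 1] == 0 and material_map[r, left - 1] == target:
            left -= 1
        right = c
        while right < w - 1 and region_map[r, right + 1] == 0 and material_map[r, right + 1] == target:
            right += 1
        region_map[r, left:right + 1] = region_id
        # Push candidate seeds from the scan lines above and below the filled span.
        for rr in (r - 1, r + 1):
            if 0 <= rr < h:
                for cc in range(left, right + 1):
                    if region_map[rr, cc] == 0 and material_map[rr, cc] == target:
                        stack.append((rr, cc))
```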