Perception Based Color Image Difference

A good image metric is often needed in digital image synthesis: it can be used to check convergence in progressive methods, to compare images produced by different rendering algorithms, and so on. Since images are rendered to be observed by humans, an image metric should also correspond to human perception. We propose a new algorithm that operates directly in the original image space; no Fourier or wavelet transforms are needed. Furthermore, the new metric is viewing-distance dependent and uses the contrast sensitivity function. The main idea is to place a number of rectangles of various sizes in both images and to compute the average CIE LUV color difference between corresponding rectangles. These errors are then weighted according to rectangle size and the contrast sensitivity function.
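The idea above can be sketched in code. This is only an illustrative implementation under stated assumptions, not the paper's exact algorithm: the rectangle placement (uniform random), the CSF model (a Mannos-Sakrison-style curve), and all function names here are assumptions, and the images are assumed to be already converted to CIE LUV.

```python
import math
import random

def csf(f):
    # Assumed contrast sensitivity model (Mannos-Sakrison form);
    # f is spatial frequency in cycles per degree.
    return 2.6 * (0.0192 + 0.114 * f) * math.exp(-(0.114 * f) ** 1.1)

def mean_luv(img, x, y, w, h):
    # Average CIE LUV color over a rectangle; img is a row-major grid
    # of (L, u, v) triples, assumed already converted from RGB.
    sums = [0.0, 0.0, 0.0]
    for j in range(y, y + h):
        for i in range(x, x + w):
            for c in range(3):
                sums[c] += img[j][i][c]
    n = w * h
    return [s / n for s in sums]

def perceptual_difference(img_a, img_b, n_rects=500,
                          view_distance_px=1000.0, seed=0):
    # Perception-based difference of two same-sized LUV images (sketch).
    rng = random.Random(seed)
    height, width = len(img_a), len(img_a[0])
    total, weight_sum = 0.0, 0.0
    for _ in range(n_rects):
        # Random rectangle of random size and position (assumption).
        w = rng.randint(1, width)
        h = rng.randint(1, height)
        x = rng.randint(0, width - w)
        y = rng.randint(0, height - h)
        ca = mean_luv(img_a, x, y, w, h)
        cb = mean_luv(img_b, x, y, w, h)
        # Euclidean color difference between the average colors.
        delta_e = math.sqrt(sum((a - b) ** 2 for a, b in zip(ca, cb)))
        # Spatial frequency (cycles/degree) corresponding to the
        # rectangle's smaller side at the given viewing distance,
        # with the distance expressed in pixel units.
        deg = math.degrees(math.atan(min(w, h) / view_distance_px))
        weight = csf(1.0 / (2.0 * deg))
        total += weight * delta_e
        weight_sum += weight
    return total / weight_sum
```

Larger rectangles correspond to lower spatial frequencies, so the CSF weight makes the metric sensitive to viewing distance: the same pixel-size rectangle covers a smaller visual angle from farther away.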