Video image retrieval on the basis of subregional co-occurrence matrix texture features and normalised correlation

This paper proposes a simple and efficient image retrieval algorithm based on subregional texture features. Retrieving images by their content normally requires precise segmentation, which is difficult and computationally expensive. This paper therefore proposes a simple segmentation method that divides an image into high- and low-entropy regions using the picture information measure (PIM). To describe the texture of each region, six texture features derived from the co-occurrence matrix are introduced. For the retrieval system itself, normalised correlation is adopted as the similarity function, since it is insensitive to the differing value ranges of the texture features. Finally, the proposed algorithm is applied to a variety of images and produces competitive results.
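The abstract gives no formulas, so the following is a minimal sketch of the main ingredients rather than the authors' implementation. It assumes PIM is the block's pixel count minus the count of its most frequent grey level, splits blocks into high- and low-entropy regions at the median PIM (the paper's actual thresholding rule is not stated), computes a handful of Haralick-style co-occurrence statistics in place of the paper's exact six features, and scores similarity with normalised cross-correlation. The function names, block size, grey-level quantisation and displacement are illustrative choices.

import numpy as np

def pim(block, levels=256):
    # Picture information measure of a grey-level block: total pixel count
    # minus the count of the most frequent grey level. PIM is 0 for a
    # uniform block and grows with grey-level diversity (assumed definition).
    hist, _ = np.histogram(block, bins=levels, range=(0, levels))
    return int(hist.sum() - hist.max())

def split_by_pim(image, block=16, threshold=None):
    # Label each block as high- or low-entropy by its PIM. The median PIM
    # is used as the default threshold, which is an assumption here.
    h, w = image.shape
    pims, coords = [], []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            pims.append(pim(image[y:y + block, x:x + block]))
            coords.append((y, x))
    pims = np.asarray(pims, dtype=float)
    if threshold is None:
        threshold = np.median(pims)
    labels = pims >= threshold  # True = high-entropy block
    return coords, labels

def cooccurrence(block, levels=32, dx=1, dy=0):
    # Normalised grey-level co-occurrence matrix for one displacement (dx, dy).
    q = (block.astype(np.int64) * levels) // 256  # quantise 8-bit grey levels
    glcm = np.zeros((levels, levels), dtype=float)
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1.0
    s = glcm.sum()
    return glcm / s if s > 0 else glcm

def texture_features(glcm):
    # A few Haralick-style statistics of a normalised GLCM (energy, contrast,
    # entropy, homogeneity); these stand in for the paper's six features.
    i, j = np.indices(glcm.shape)
    eps = 1e-12
    energy = np.sum(glcm ** 2)
    contrast = np.sum(((i - j) ** 2) * glcm)
    entropy = -np.sum(glcm * np.log(glcm + eps))
    homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
    return np.array([energy, contrast, entropy, homogeneity])

def normalised_correlation(a, b):
    # Similarity between two feature vectors that is insensitive to the
    # differing value ranges of the individual features.
    denom = np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))
    return float(np.sum(a * b) / denom) if denom > 0 else 0.0

In a retrieval setting, a query frame and each database frame would each be described by texture feature vectors for their high- and low-entropy regions, and database frames would be ranked by the normalised correlation score against the query; pooling block-level features into one descriptor per region is a further simplification not specified by the abstract.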
