Integrated Browsing and Searching of Large Image Collections

Current image retrieval systems offer either an exploratory search method through browsing and navigation or a direct search method based on specific queries. Combining both methods in a uniform framework allows users to formulate queries more naturally, since they are already acquainted with the contents of the database and with the notion of matching the machine uses to return results. We propose a multi-mode, integrated image retrieval system that offers the user quick and effective previewing of the collection, intuitive and natural navigation to any part of it, and query by example or by composition for more specific and clearly defined retrieval goals.
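
To make the query-by-example mode concrete, the sketch below shows one minimal way such a search could be realized: precomputed per-image feature vectors ranked by distance to the query's features. The color-histogram representation, the Euclidean distance, and all function names here are illustrative assumptions, not the system's actual implementation.

# Illustrative sketch only: query-by-example over color-histogram features.
# The feature choice (RGB histograms) and the Euclidean distance are
# assumptions for illustration, not the authors' method.
import numpy as np
from PIL import Image

def color_histogram(path, bins=8):
    """Compute a normalized per-channel RGB histogram as a feature vector."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    feats = []
    for channel in range(3):
        hist, _ = np.histogram(img[..., channel], bins=bins, range=(0, 255))
        feats.append(hist)
    vec = np.concatenate(feats).astype(np.float32)
    return vec / (vec.sum() + 1e-9)  # normalize so image size does not matter

def query_by_example(query_path, database_paths, k=5):
    """Rank database images by Euclidean distance to the query's features."""
    query_vec = color_histogram(query_path)
    db_vecs = np.stack([color_histogram(p) for p in database_paths])
    dists = np.linalg.norm(db_vecs - query_vec, axis=1)
    order = np.argsort(dists)[:k]
    return [(database_paths[i], float(dists[i])) for i in order]

# Example usage (paths are placeholders):
# results = query_by_example("query.jpg", ["img1.jpg", "img2.jpg", "img3.jpg"])

In the full system described above, such low-level matching would sit alongside the previewing and navigation modes rather than replace them.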
