We have developed a user interface for similarity-based image retrieval, in which the distribution of retrieved data in a high-dimensional feature space is represented as a dynamic scatter diagram of thumbnail images in a 3-dimensional visualization space, and similarities between data are represented as sizes in that space. Coordinate systems in the visualization space are obtained by statistical calculations on the distribution of the feature vectors of the retrieved images. Our system provides several different transformations from the high-dimensional feature space to the 3-dimensional space, each of which gives a different coordinate system to the visualization space. By switching coordinates automatically at regular intervals, a spatio-temporal pattern of the image distribution is generated. Furthermore, a hierarchical coordinate system, consisting of several local coordinate systems based on key images, can be defined in the visualization space. These methods present a large number of retrieval results in a way that users can grasp intuitively.
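The abstract does not specify the statistical transformation, but a projection derived from the statistics of the retrieved feature vectors, with similarity shown as thumbnail size, can be sketched as a PCA-style mapping. The following is a minimal illustration under that assumption; the function names (`project_to_3d`, `thumbnail_sizes`) and parameters are hypothetical and not taken from the paper.

```python
import numpy as np

def project_to_3d(features: np.ndarray) -> np.ndarray:
    """Map high-dimensional feature vectors to 3-D coordinates via PCA.

    features: (n_images, n_dims) array of image feature vectors.
    Returns an (n_images, 3) array of visualization-space coordinates.
    """
    centered = features - features.mean(axis=0)
    # Principal axes of the retrieved set define the visualization axes
    # (one possible "statistical calculation on the distribution").
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:3].T

def thumbnail_sizes(similarities: np.ndarray,
                    min_size: float = 16.0,
                    max_size: float = 64.0) -> np.ndarray:
    """Scale similarity scores (higher = more similar) to thumbnail sizes."""
    s = (similarities - similarities.min()) / (np.ptp(similarities) + 1e-12)
    return min_size + s * (max_size - min_size)

# Example: 200 retrieved images with 128-dimensional features (synthetic data).
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 128))
sims = rng.random(200)
coords = project_to_3d(feats)   # positions in the 3-D visualization space
sizes = thumbnail_sizes(sims)   # per-thumbnail display size
```

Switching among several such projections (e.g., different principal-axis subsets) at intervals would produce the spatio-temporal pattern described above.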