Media-based navigation for hypermedia systems

In this paper, we present the concept and general framework of a new navigation style for hypermedia systems: media-based navigation. The user browses through a hypermedia system using media-specific clues such as shape, color, and composition for still images, motion for movies, and tone or melody for auditory data. In this navigation style, the user and the system interact with each other without translating into a textual representation. We describe visual-based navigation and present its algorithms, which are implemented on an experimental hypermedia database system called “Miyabi.” We show some experimental results and our current evaluation. We also describe an implementation of the algorithms for large-scale hypermedia systems and show that they can be applied effectively to systems containing more than 10,000 images. We further describe auditory media-based navigation. Media-based navigation is a useful interface for hypermedia systems that improves human-machine interaction.
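To make the idea concrete, the following is a minimal sketch of visual-based navigation using coarse color histograms as the visual clue: the user selects a query image, and the system ranks stored images by color similarity without any textual translation. All function names and the histogram-intersection measure are illustrative assumptions, not the actual Miyabi algorithms.

```python
def color_histogram(pixels, bins=4):
    """Quantize a list of RGB pixels into a coarse color histogram
    with bins**3 buckets, normalized to sum to 1."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for (r, g, b) in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def similarity(h1, h2):
    """Histogram intersection: 1.0 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def navigate(query_pixels, database, top_k=3):
    """Rank database images (name -> pixel list) by visual similarity
    to the query image and return the top_k names."""
    q = color_histogram(query_pixels)
    ranked = sorted(database.items(),
                    key=lambda kv: similarity(q, color_histogram(kv[1])),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]
```

For example, a query dominated by red pixels would rank a mostly red image above a mostly blue one, letting the user follow visual rather than textual links.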