Gesture-enhanced information retrieval and presentation in a distributed learning environment

We describe a novel multimodal, gesture-enhanced interface that provides immersive as well as non-immersive, non-intrusive, and natural interaction between the user and the information system in a distributed learning environment. Different types of gestures are dynamically transformed into cues for spatial/temporal queries over multimedia information sources and databases. The transformation can depend on the location, the time, the task, and the user profile. Using various transcoding schemes, the retrieved information can be presented to the user in a multi-resolution, multi-dimensional, and multimodal manner.
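As a minimal sketch of the gesture-to-query transformation described above, the following code (all names and gesture types are illustrative assumptions, not from the paper) maps a recognized gesture and its interaction context onto a spatial or temporal query template:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Context:
    """Hypothetical interaction context influencing query construction."""
    location: str       # e.g. room or map region the user is in
    time: datetime      # when the gesture was made
    task: str           # current learning task
    user_profile: dict  # e.g. preferred media types, expertise level

def gesture_to_query(gesture: str, target: str, ctx: Context) -> dict:
    """Turn a gesture type plus context into a query dict (illustrative only)."""
    if gesture == "point":
        # Pointing at an object -> spatial lookup near the indicated target.
        return {"type": "spatial", "near": target, "region": ctx.location}
    if gesture == "circle":
        # Circling a region -> retrieve media within it, filtered by profile.
        return {"type": "spatial", "within": target,
                "media": ctx.user_profile.get("preferred_media", "any")}
    if gesture == "swipe":
        # Swiping along a timeline -> temporal range query around ctx.time.
        return {"type": "temporal", "around": ctx.time.isoformat(),
                "task": ctx.task}
    # Unrecognized gesture -> fall back to a keyword query.
    return {"type": "keyword", "terms": [target]}

ctx = Context("lab-3", datetime(2003, 5, 1, 10, 30), "circuit-design",
              {"preferred_media": "video"})
query = gesture_to_query("circle", "oscilloscope", ctx)
```

In such a design, the same gesture yields different queries for different users or tasks, which is one way the context dependence mentioned in the abstract could be realized.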