Using Eye Tracking as Human Computer Interaction Interface
In the AAMS project, we developed ALM, an e-learning platform for Ilias, as a technical basis for research in education. ALM uses eye tracking data to analyze a learner's gaze movement at runtime in order to adapt the learning content. To extend the platform's current capabilities, we plan to implement and evaluate a framework for advanced eye tracking analysis techniques. This framework focuses on two main concepts. The first enables real-time analysis of a user's text reading status through artificial intelligence techniques at any point in the learning process, extending and enriching the adaptive behavior of our platform. The second is an interface framework that lets multimedia applications connect to whatever eye tracking hardware is available at runtime and use it as an input device for user interaction. Since accuracy can be an issue with low-cost eye trackers, we use an object-specific relevance factor for the detection of selectable or related content.
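The abstract does not detail how the object-specific relevance factor is applied; the following is a minimal sketch, not the authors' implementation, of one way such a factor could compensate for gaze inaccuracy when mapping a noisy gaze point to selectable on-screen content. All class, field, and function names (SelectableObject, pick_target, relevance) are hypothetical.

```python
# Hypothetical sketch: weighting gaze-to-object selection with an
# object-specific relevance factor to tolerate low eye-tracker accuracy.

from dataclasses import dataclass
import math


@dataclass
class SelectableObject:
    name: str
    x: float          # centre x of the object's bounding box (pixels)
    y: float          # centre y of the object's bounding box (pixels)
    radius: float     # approximate extent of the object (pixels)
    relevance: float  # object-specific relevance factor, 1.0 = neutral


def pick_target(gaze_x: float, gaze_y: float,
                objects: list[SelectableObject],
                max_distance: float = 150.0) -> SelectableObject | None:
    """Return the object the user most likely looked at.

    Each object's score grows with its relevance factor and shrinks with
    the distance between the (noisy) gaze point and the object, so a
    highly relevant object can still win even if the reported gaze
    position misses it slightly.
    """
    best, best_score = None, 0.0
    for obj in objects:
        dist = math.hypot(gaze_x - obj.x, gaze_y - obj.y)
        if dist > max_distance:
            continue
        # Distance weight: close to 1.0 on the object, falling off with distance.
        proximity = max(0.0, 1.0 - dist / (obj.radius + max_distance))
        score = obj.relevance * proximity
        if score > best_score:
            best, best_score = obj, score
    return best


if __name__ == "__main__":
    targets = [
        SelectableObject("next-page button", 800, 600, 40, relevance=1.5),
        SelectableObject("decorative image", 760, 580, 120, relevance=0.3),
    ]
    # A noisy gaze sample landing between both objects still selects
    # the more relevant, interactive target.
    chosen = pick_target(790, 590, targets)
    print(chosen.name if chosen else "no target")
```

In this sketch the relevance factor is a static per-object weight; in an adaptive platform such as ALM it could just as well be derived from the learning context at runtime.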