User Modelling for Adaptive Human-Computer Interaction
An important factor for the next generation of Human-Computer Interaction (HCI) is an interaction style in which the system reasons about the user's goals, attitudes, plans and capabilities in context, and adapts itself accordingly. Such a more intuitive form of interaction could foster more effective task completion than conventional non-adaptive applications afford. Moreover, the recent availability of unobtrusive input modalities such as eye trackers and web cameras has made it viable to detect a user's emotions and mental states in real time. A key challenge, however, is exploiting these modalities so that a computer can actively interact with users on the basis of their emotions and mental states. Consequently, the aim of this research is to design and develop a user model for inferring a user's emotional and cognitive state, which is then exploited to adapt the interface, or more generally the user experience, in order to maximise the performance of the system and 'guarantee' task completion. This research proposes a framework for Adaptive HCI comprising two components: firstly, a Perception Component that acquires user data from a range of input modalities (e.g. eye trackers and web cameras) and models the affective and cognitive aspects of the user; secondly, an Adaptation Component that adapts the interaction based on the model generated by the Perception Component. It is anticipated that this will yield a novel interaction style that takes the user's affective and cognitive aspects into account.
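To make the two-component framework concrete, the following is a minimal sketch of how a Perception Component and an Adaptation Component might be wired together. All names here (UserState, PerceptionComponent, AdaptationComponent), the naive averaging fusion, and the adaptation thresholds are illustrative assumptions for exposition, not details taken from the proposed system.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class UserState:
    """Inferred affective and cognitive state of the user (hypothetical)."""
    frustration: float     # 0.0 (calm) .. 1.0 (highly frustrated)
    cognitive_load: float  # 0.0 (idle) .. 1.0 (overloaded)


class PerceptionComponent:
    """Fuses readings from input modalities (e.g. an eye tracker or a
    web camera) into a single UserState estimate."""

    def __init__(self) -> None:
        # Each registered source returns partial state estimates.
        self._sources: List[Callable[[], Dict[str, float]]] = []

    def register_modality(self, source: Callable[[], Dict[str, float]]) -> None:
        self._sources.append(source)

    def infer_state(self) -> UserState:
        # Naive fusion: average the estimates contributed by each modality.
        readings = [source() for source in self._sources]

        def mean(key: str) -> float:
            values = [r[key] for r in readings if key in r]
            return sum(values) / len(values) if values else 0.0

        return UserState(frustration=mean("frustration"),
                         cognitive_load=mean("cognitive_load"))


class AdaptationComponent:
    """Maps the inferred UserState to an interface adaptation."""

    def adapt(self, state: UserState) -> str:
        if state.cognitive_load > 0.7:
            return "simplify_interface"  # e.g. hide advanced controls
        if state.frustration > 0.7:
            return "offer_assistance"    # e.g. proactive help dialogue
        return "no_change"


# Usage: wire a stubbed eye-tracking modality into the perception-adaptation loop.
perception = PerceptionComponent()
perception.register_modality(lambda: {"cognitive_load": 0.8})  # e.g. from pupil dilation
adaptation = AdaptationComponent()
print(adaptation.adapt(perception.infer_state()))  # -> simplify_interface
```

The separation mirrors the framework's division of labour: modality-specific sensing stays inside the Perception Component, so adaptation policies can be changed or extended without touching the sensing pipeline.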