Finding the user's interest level from their eyes

An innovative model is proposed that augments semi-automatic image annotation algorithms with implicit feedback from the user's eyes. This framework extracts features from the user's gaze pattern over an image, captured with an eye tracker, and combines them with the low-level features of that image. The resulting feature vector is fed to a fuzzy inference system that grades the user's interest in the visited images. By placing a threshold at the middle of the interest scale, the images can be classified when the user is searching for a target concept. Beyond classification, the graded interest level enables clustering of the visited images according to the user's concerns. Preliminary results show that this model can classify the images with an F1 measure above 0.52.
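The fuzzy grading and thresholding step described above can be sketched as follows. This is a minimal illustration, not the paper's actual system: the choice of gaze features (normalized fixation duration and fixation count), the triangular membership partitions, and the two-rule base are all hypothetical assumptions, since the abstract does not specify them.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def interest_level(fix_duration, fix_count):
    """Grade interest in [0, 1] from two gaze features (assumed normalized
    to [0, 1]) with a tiny Sugeno-style fuzzy inference step.

    Hypothetical rule base:
      R1: IF duration is HIGH AND count is HIGH THEN interest = 1.0
      R2: IF duration is LOW  AND count is LOW  THEN interest = 0.0
    """
    # Input memberships (illustrative partitions of the normalized range).
    dur_low  = tri(fix_duration, -0.5, 0.0, 0.6)
    dur_high = tri(fix_duration,  0.4, 1.0, 1.5)
    cnt_low  = tri(fix_count,    -0.5, 0.0, 0.6)
    cnt_high = tri(fix_count,     0.4, 1.0, 1.5)

    # Rule firing strengths: fuzzy AND realized as min.
    w_high = min(dur_high, cnt_high)   # long, frequent fixations -> interested
    w_low  = min(dur_low, cnt_low)     # short, sparse fixations  -> not interested

    total = w_high + w_low
    if total == 0.0:
        return 0.5                     # no rule fires: neutral interest
    # Weighted average of the crisp rule outputs (1.0 and 0.0).
    return w_high / total

def is_interesting(fix_duration, fix_count, threshold=0.5):
    """Classify an image by thresholding the interest level at mid-scale."""
    return interest_level(fix_duration, fix_count) > threshold
```

A usage example: an image viewed with long, repeated fixations (e.g. `interest_level(0.9, 0.8)`) is graded above the mid-scale threshold and classified as relevant to the target concept, while a briefly glanced image (e.g. `interest_level(0.1, 0.1)`) falls below it.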