Interactive interface with evolutionary eye sensing and physiological knowledge (Special Issue: 2008 IEEJ Electronics, Information and Systems Society Conference)

The purpose of this study is to develop an interactive interface that uses eye movement to operate a welfare apparatus, such as a feeding device for an orthopedically impaired person. A further goal is to eliminate special calibration and re-calibration during operation. The original features of the proposed system are twofold: eye sensing with evolutionary processing, and an interactive operation screen designed from physiological knowledge. The system provides a non-contact interface driven by eye movement: the iris is tracked and eye movement is measured by the evolutionary eye sensing (EES) method. The operation screen is divided into 9 areas, each presenting a visual stimulus based on physiological knowledge. A user selects an area by eye movement and confirms the selection by eye fixation. The effectiveness of the proposed system is evaluated through comparison experiments with 20 subjects. The results indicate that the system is easy for first-time users, who become proficient in its operation after only a few practice trials.
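The selection scheme described above — a 3x3 operation screen, area selection by gaze position, and confirmation by fixation — can be sketched as follows. This is a minimal illustration assuming a stream of timestamped gaze coordinates from some tracker; the EES method itself is not reproduced, and the screen resolution and 500 ms dwell threshold are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of 9-area gaze selection with dwell-time confirmation.
# SCREEN_W/SCREEN_H and DWELL_MS are assumed values for illustration only.

SCREEN_W, SCREEN_H = 1024, 768   # assumed screen resolution (pixels)
DWELL_MS = 500                   # assumed fixation threshold (milliseconds)

def area_of(x, y):
    """Return the 3x3 grid cell index (0..8) containing gaze point (x, y)."""
    col = min(2, max(0, int(3 * x / SCREEN_W)))
    row = min(2, max(0, int(3 * y / SCREEN_H)))
    return row * 3 + col

def select_by_fixation(gaze_samples):
    """gaze_samples: iterable of (timestamp_ms, x, y) tuples.

    Returns the first area the gaze stays in for at least DWELL_MS
    (interpreted as a fixation-based decision), or None if no area
    is held long enough.
    """
    current, start = None, None
    for t, x, y in gaze_samples:
        a = area_of(x, y)
        if a != current:
            current, start = a, t      # gaze moved to a new area
        elif t - start >= DWELL_MS:
            return current             # dwell threshold reached
    return None
```

For example, a gaze held near the screen center for 600 ms would select the middle cell (index 4), while a gaze that keeps moving between areas selects nothing.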
