Development of an intuitive interface based on facial orientations and gazing actions for auto-wheelchair operation

This paper proposes an intuitive interface based on facial orientations and gazing actions for auto-wheelchair operation. A real-time image of the operator's face was captured by a USB camera connected to the control computer to observe the operator's operational intention. Changes in the dark areas of the two nostrils were used to recognize the face orientation. When the operator faced upward or downward, the dark areas of both nostrils increased or decreased, respectively. On the other hand, a difference between the two nostril areas arose when the face was turned to the side. These characteristics were exploited to recognize the face orientation. Moreover, a gazing action was recognized from the curvature ratio of the operator's eye lines. Only when the operator gazed at the control computer was the facial orientation reflected in the operation of the auto-wheelchair, replacing a joystick interface.
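The orientation rules described above (both nostril areas grow when the face tilts up, shrink when it tilts down, and diverge when the face turns sideways) can be sketched as a simple classifier. This is a minimal illustration, not the paper's implementation: the threshold ratios, the function name, and the mapping of the area-difference sign to left versus right are all assumptions for the sake of the example.

```python
def classify_orientation(left_area, right_area, baseline_area,
                         tilt_ratio=0.3, turn_ratio=0.3):
    """Classify face orientation from the dark-pixel areas of the two nostrils.

    baseline_area is the per-nostril area measured while the operator
    faces the camera frontally. The ratio thresholds and the sign-to-
    direction mapping are illustrative assumptions, not paper values.
    """
    total = left_area + right_area
    diff = left_area - right_area

    # Side turn: one nostril area dominates the other.
    if abs(diff) > turn_ratio * baseline_area:
        return "left" if diff > 0 else "right"
    # Up/down tilt: both areas grow (face up) or shrink (face down) together.
    if total > (1 + tilt_ratio) * 2 * baseline_area:
        return "up"
    if total < (1 - tilt_ratio) * 2 * baseline_area:
        return "down"
    return "neutral"
```

In a full system, a gaze check based on the eye-line curvature would gate this output, so the wheelchair only responds while the operator is looking at the control computer.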