Window Manager Control by Extracting Hand Regions and Analysing Their Trajectory

This paper describes a method for recognizing gestures by extracting hand regions and for controlling a window manager from the recognition results. Color image processing based on a skin color model and a labeling technique is used to extract hand regions. The trajectory of the hand is obtained by connecting the hand regions extracted from each frame of an input image sequence, and this trajectory is analyzed to recognize a gesture. Four gestures (up, down, left, and right) are recognized using direction property normalization and a syntactic method. According to predefined window management rules, the window manager uses the gesture recognition results to control menu display, tracking, and execution. A real-time online gesture recognition system that processes more than five frames per second is constructed by reducing the processing time per frame to below 200 msec. This result can serve as a basis for implementing a next-generation intelligent user interface.
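As an illustration of the pipeline the abstract outlines (skin-color extraction, labeling of hand regions, connecting per-frame regions into a trajectory, and classifying the trajectory into one of four directions), the following is a minimal sketch in Python using OpenCV. It is not the authors' implementation: the YCrCb skin-color bounds, the jitter threshold `min_move`, and the use of net displacement in place of the paper's direction property normalization and syntactic method are assumptions made here purely for illustration.

```python
import numpy as np
import cv2

# Assumed skin-color bounds in YCrCb space; the paper's actual skin
# color model and thresholds are not given in the abstract.
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)


def hand_centroid(frame_bgr):
    """Return the centroid (x, y) of the largest skin-colored region,
    or None if no such region is found in the frame."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    # Labeling step: keep only the largest connected component,
    # which is taken to be the hand region.
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if num < 2:  # label 0 is the background
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return tuple(centroids[largest])


def classify_trajectory(points, min_move=40):
    """Classify a hand trajectory as 'up', 'down', 'left', or 'right'
    from the net displacement between the first and last centroid.
    min_move is an assumed pixel threshold that rejects small jitters."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_move:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # image y grows downward


def recognize(frames):
    """Connect per-frame hand centroids into a trajectory and
    recognize one of the four directional gestures."""
    trajectory = [c for c in (hand_centroid(f) for f in frames) if c is not None]
    return classify_trajectory(trajectory)
```

The recognized direction would then be mapped, through the predefined window management rules, to an action such as displaying, tracking, or executing a menu item.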