Manipulating simulated objects with real-world gestures using a force and position sensitive screen

Gestural input can provide a flexible interface to computing environments. We describe a prototype system that recognizes several types of single-finger gestures and uses them to manipulate displayed objects. An experimental gesture input device reports single-finger gestures in terms of position, pressure, and shear forces on a screen. The gestures are classified by a “gesture parser” and used to control actions in a fingerpainting program, an interactive computing system designed for young children, and an interactive digital logic simulation.
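To make the sensor-to-parser pipeline concrete, here is a minimal sketch of what such a “gesture parser” might look like. The gesture classes, sample fields, and thresholds below are illustrative assumptions, not taken from the system described above: a stroke is classified by how far the finger travelled and how hard it pressed.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Sample:
    """One reading from a hypothetical force- and position-sensitive screen."""
    x: float         # screen position (arbitrary units)
    y: float
    pressure: float  # normal force on the screen, normalized to [0, 1]

def parse_gesture(samples, move_thresh=5.0, press_thresh=0.8):
    """Classify a finished single-finger stroke as 'drag', 'press', or 'tap'.

    Thresholds are placeholders; a real parser would be tuned to the device.
    """
    if not samples:
        return "none"
    # Net displacement from first to last sample.
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    if hypot(dx, dy) > move_thresh:
        return "drag"    # finger travelled across the screen
    if max(s.pressure for s in samples) > press_thresh:
        return "press"   # stationary but firm
    return "tap"         # stationary and light
```

An application would feed each completed stroke to `parse_gesture` and dispatch on the returned label, e.g. mapping `"drag"` to moving a displayed object and `"press"` to selecting it.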