Gesture recognition for manipulation in artificial realities

In [1], we conclude that a gesture interface enhances the flexible manipulation of virtual objects by a human operator in artificial realities. Such an interface is described here. It can recognise static gestures, posture-based dynamic gestures, pose-based dynamic gestures, a "virtual control panel" combining posture and pose, and simple pose-based trajectory analysis of postures. The interface is based on a novel, application-independent technique for recognising gestures. Gestures are represented by what we term approximate splines: sequences of critical points (local minima and maxima) in the motion of each degree of freedom of the hand and wrist. Compared with a full spline curve-fitting approach, this scheme allows greater spatial and temporal flexibility in matching a gesture performance and reduces the required computation. The gesture set is trained through the interactive presentation of a small number of samples of each gesture.
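To make the representation concrete, the Python sketch below extracts the critical points of a single degree-of-freedom trajectory, one hypothetical reading of the approximate-spline idea. The function name, the uniform-sampling assumption, and the noise tolerance `tol` are our own illustrative choices, not details from the paper.

from typing import List, Tuple

def critical_points(samples: List[float], tol: float = 1e-3) -> List[Tuple[int, float]]:
    """Return (index, value) pairs at the local minima and maxima of one
    degree-of-freedom trajectory, plus its endpoints.

    A hypothetical sketch of the approximate-spline representation: a
    gesture is reduced to the critical points of each DOF's motion, so
    matching compares short point sequences instead of fitting full
    spline curves.
    """
    if not samples:
        return []
    points = [(0, samples[0])]                      # always keep the start
    for i in range(1, len(samples) - 1):
        left = samples[i] - samples[i - 1]
        right = samples[i + 1] - samples[i]
        # A critical point is where the slope changes sign; tol suppresses
        # spurious extrema from sensor noise (an assumed threshold).
        if abs(left) > tol and abs(right) > tol and (left > 0) != (right > 0):
            points.append((i, samples[i]))
    points.append((len(samples) - 1, samples[-1]))  # always keep the end
    return points

# Example: one flexion DOF rising, falling, then rising again.
trajectory = [0.0, 0.2, 0.5, 0.9, 0.6, 0.3, 0.4, 0.8]
print(critical_points(trajectory))
# [(0, 0.0), (3, 0.9), (5, 0.3), (7, 0.8)]

Matching a performed gesture against a trained template would then compare these short critical-point sequences per degree of freedom, tolerating variation in amplitude and timing; the details of that comparison are beyond this sketch.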