AirPincher: a handheld device for recognizing delicate mid-air hand gestures

We propose AirPincher, a handheld device for recognizing delicate mid-air hand gestures. AirPincher is designed to overcome the disadvantages of the two existing classes of hand gesture-aware techniques: wearable sensor-based and external vision-based. Wearable sensor-based techniques require the user to put on sensors every time, while external vision-based techniques suffer from performance that depends on the distance between the user and the remote display. AirPincher lets a user hold the device in one hand and generate several delicate mid-air finger gestures, which are captured by sensors embedded in AirPincher in close proximity to the fingers. These features allow AirPincher to avoid the aforementioned disadvantages of the existing techniques. AirPincher supports several delicate finger gestures, for example, rubbing a thumb against a middle finger, swiping a thumb along an index finger, and pinching with a thumb and an index finger. Because these gestures provide inherent haptic feedback, AirPincher also supports eyes-free interaction. To validate AirPincher's feasibility, we implemented two use cases: controlling a pointing cursor and moving a virtual 3D object on a remote screen.
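To make the two use cases concrete, the sketch below shows how recognized finger gestures could be dispatched to cursor and 3D-object actions on the remote screen. It is a minimal illustration only: AirPincher's actual sensing pipeline, gesture labels, and remote-display protocol are not specified in the abstract, so all names, event fields, and handlers here are assumptions.

```python
# Illustrative sketch only: gesture labels, event fields, and handlers are
# hypothetical; the paper does not specify AirPincher's actual APIs.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    kind: str         # assumed labels: "rub", "swipe", "pinch"
    magnitude: float  # assumed normalized gesture displacement in [0, 1]


class RemoteScreenController:
    """Hypothetical receiver on the remote-display side."""

    def __init__(self) -> None:
        self.cursor_x = 0.0
        self.object_depth = 0.0

    def move_cursor(self, event: GestureEvent) -> None:
        # Swiping the thumb along the index finger could pan the pointing cursor.
        self.cursor_x += event.magnitude
        print(f"cursor x -> {self.cursor_x:.2f}")

    def push_pull_object(self, event: GestureEvent) -> None:
        # Rubbing the thumb against the middle finger could move a 3D object in depth.
        self.object_depth += event.magnitude
        print(f"object depth -> {self.object_depth:.2f}")


def dispatch(event: GestureEvent,
             handlers: Dict[str, Callable[[GestureEvent], None]]) -> None:
    # Route each recognized gesture to its registered action, ignoring unknown kinds.
    handler = handlers.get(event.kind)
    if handler is not None:
        handler(event)


if __name__ == "__main__":
    screen = RemoteScreenController()
    handlers = {"swipe": screen.move_cursor, "rub": screen.push_pull_object}
    dispatch(GestureEvent("swipe", 0.3), handlers)
    dispatch(GestureEvent("rub", 0.5), handlers)
```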
