ThunderPunch: A bare-hand, gesture-based, large interactive display interface with upper-body-part detection in a top view

We present a new bare-hand gesture interface for large-screen interaction in which multiple users can participate simultaneously and interact with virtual content directly. To realize this interface, we built a new hardware system with a large hybrid display, named ThunderPunch. Unlike the conventional setup, in which the camera is positioned in front of the users, the cameras are mounted on the ceiling so that they do not occlude the large screen. To enable bare-hand interaction with this hardware configuration, we propose real-time algorithms that detect multiple body poses and recognize punching and touching gestures from top-view depth images. A pointing and touching test shows that the proposed algorithm is usable and outperforms competing algorithms. In addition, we developed a game that makes full use of the proposed system.
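The paper does not give implementation details in the abstract, so the following is only a minimal sketch of how person detection from a ceiling-mounted (top-view) depth camera is commonly done, not the authors' actual algorithm. The function name, thresholds, and the floor-depth calibration input are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage


def detect_heads_topview(depth, floor_depth, min_height_m=1.2, min_area_px=200):
    """Illustrative top-view person/head detection from a ceiling-mounted depth camera.

    depth       : HxW array of metric depths (meters) measured from the camera
    floor_depth : scalar (or HxW array) giving the depth of the empty floor
    Returns a list of (row, col) head positions, one per detected person.
    """
    # Height above the floor: pixels close to the camera correspond to tall objects.
    height = floor_depth - depth

    # Keep only regions tall enough to be standing people.
    person_mask = height > min_height_m

    # Label connected blobs; each sufficiently large blob is treated as one person.
    labels, n = ndimage.label(person_mask)
    heads = []
    for i in range(1, n + 1):
        blob = labels == i
        if blob.sum() < min_area_px:
            continue
        # The head is approximately the highest point within the blob.
        r, c = np.unravel_index(
            np.argmax(np.where(blob, height, -np.inf)), height.shape
        )
        heads.append((r, c))
    return heads
```

In a full pipeline of this kind, arm and hand positions would then be estimated relative to each detected head/torso blob, and a punching or touching gesture would be triggered when the hand approaches the screen plane; the paper's own method for these steps is described in the body of the article.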
