HumanTop: a multi-object tracking tabletop

In this paper, a computer vision-based interactive multi-touch tabletop system called HumanTop is introduced. HumanTop implements a stereo camera vision subsystem that supports both an accurate fingertip tracking algorithm and a precise method for detecting touches over the working surface. Based on a pair of visible spectrum cameras, a novel synchronization circuit decouples camera capture from image projection, providing the minimum basis for computer vision analysis with visible spectrum cameras free from interference from the projector. The assembly of the two cameras and the synchronization circuit not only acts as an ad hoc depth camera, but also enables the recognition and tracking of textured planar objects, even when content is projected onto them. HumanTop also supports the tracking of sheets of paper and ID-code markers. This set of features makes HumanTop a comprehensive, intuitive and versatile augmented tabletop that provides multi-touch interaction with projective augmented reality on any flat surface. As an example exploiting the capabilities of HumanTop, an educational application has been developed that uses an augmented book as a launcher for different didactic contents. A pilot study in which 28 fifth graders participated is presented, with results on efficiency, usability/satisfaction and motivation. These results suggest that HumanTop is an interesting platform for the development of educational contents.
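
To make the stereo touch-detection idea concrete, the following is a minimal sketch, not the authors' actual pipeline: it assumes rectified stereo frames and a reprojection matrix Q obtained from standard OpenCV calibration, and declares a touch when the fingertip's reconstructed depth is within a few millimetres of the table plane. The function name fingertip_touch, the table_z parameter, and the matcher settings are illustrative assumptions.

```python
# Minimal sketch (assumed, not the HumanTop implementation): stereo-based
# touch detection by thresholding fingertip height above a known table depth.
import cv2
import numpy as np

def fingertip_touch(left, right, Q, tip_px, table_z, touch_mm=5.0):
    """Return True if the fingertip at pixel tip_px lies within touch_mm of the
    table plane, assumed parallel to the image plane at depth table_z (mm)."""
    # Semi-global block matching on the rectified pair; parameters are
    # illustrative placeholders, not tuned values from the paper.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0
    # Reproject disparity to metric 3-D coordinates using the calibration matrix Q.
    points_3d = cv2.reprojectImageTo3D(disparity, Q)
    u, v = tip_px
    z = points_3d[v, u, 2]
    # A touch is declared when the fingertip depth reaches the table depth.
    return bool(np.isfinite(z) and abs(z - table_z) < touch_mm)
```

In a real system the table depth would itself be estimated per pixel during calibration rather than taken as a single constant, and the fingertip pixel would come from the tracking stage rather than being passed in by hand.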
