Two-Camera Synchronization and Trajectory Reconstruction for a Touch Screen Usability Experiment

This paper considers the usability of stereoscopic 3D touch displays. For this purpose, extensive subjective experiments were carried out in which the hand movements of test subjects were recorded with a two-camera setup consisting of a high-speed camera and a standard RGB video camera viewing the scene from different angles. This produced a large amount of video data that is laborious to analyze manually, which motivates the development of automated methods. In this paper, we propose a method for automatically synchronizing the videos of the two cameras to enable 3D trajectory reconstruction. Together with appropriate finger tracking and trajectory post-processing techniques, this forms a fully automated measurement framework for hand movements. We evaluated the proposed method on a large set of hand movement videos and demonstrated its accuracy in 3D trajectory reconstruction. Finally, we computed a set of hand trajectory features from the data and showed that certain features, such as the mean and maximum velocity, differ statistically significantly between target object disparity categories. With small modifications, the framework can be utilized in other similar HCI studies.
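
To illustrate the kind of trajectory processing described above, the sketch below shows one plausible way to triangulate a tracked fingertip from two calibrated, synchronized cameras and to compute the mean and maximum velocity features. It is a minimal sketch under assumed inputs (the projection matrices `P1`/`P2`, the per-frame 2D tracks, and the frame rate are placeholders), not the paper's implementation.

```python
# Minimal sketch (assumptions: calibrated cameras, frame-synchronized videos,
# fingertip already tracked in both views). Not the paper's implementation.
import numpy as np
import cv2

def reconstruct_trajectory(P1, P2, pts_cam1, pts_cam2):
    """Triangulate per-frame fingertip positions into 3D.

    P1, P2: 3x4 camera projection matrices.
    pts_cam1, pts_cam2: (N, 2) arrays of synchronized 2D fingertip positions.
    Returns an (N, 3) array of 3D points.
    """
    pts1 = np.asarray(pts_cam1, dtype=np.float64).T   # shape (2, N)
    pts2 = np.asarray(pts_cam2, dtype=np.float64).T
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # (4, N) homogeneous points
    return (X_h[:3] / X_h[3]).T                       # (N, 3) Euclidean points

def velocity_features(trajectory, fps):
    """Mean and maximum speed along a 3D trajectory sampled at a fixed frame rate."""
    diffs = np.diff(trajectory, axis=0)               # per-frame displacement vectors
    speeds = np.linalg.norm(diffs, axis=1) * fps      # speed in spatial units per second
    return speeds.mean(), speeds.max()
```

In practice, the 3D trajectory would typically be smoothed (e.g., with local regression) before differentiation, since frame-to-frame tracking noise is amplified by the velocity computation.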
