Estimating a virtual touchscreen for fingertip interaction with large displays

Large displays are everywhere. However, the computer mouse remains the most common interaction tool for such displays. We propose a new approach for fingertip interaction with large display systems using monocular computer vision. By taking into account the location of the user and the available interaction area, we can estimate an interaction surface, a virtual touchscreen, between the display and the user. Users can use their pointing finger to interact with the display as if it were brought forward and presented directly in front of them, while preserving the viewing angle. An interaction model based on the head-hand line method is presented to describe interaction with the virtual touchscreen. Initial results, in the form of a work-in-progress prototype, demonstrate the feasibility of this concept.
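As a rough illustration of the head-hand line idea (a sketch only, not the paper's implementation), the snippet below intersects the ray from the estimated head position through the fingertip with a plane standing in for the virtual touchscreen. The function name, coordinate values, and plane placement are all hypothetical assumptions for illustration.

```python
import numpy as np

def head_hand_intersection(head, fingertip, plane_point, plane_normal):
    """Intersect the head-fingertip ray with a plane (e.g. the virtual touchscreen).

    All arguments are 3-D points/vectors in the same world coordinate frame.
    Returns the 3-D intersection point, or None if no usable intersection exists.
    """
    head = np.asarray(head, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)

    direction = fingertip - head                  # pointing direction along the head-hand line
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:                         # ray parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - head) / denom
    if t < 0:                                     # plane lies behind the user
        return None
    return head + t * direction

# Example (hypothetical values): a virtual touchscreen about half a metre
# in front of the user, facing along +z toward the display.
cursor = head_hand_intersection(
    head=[0.0, 1.7, 2.0],         # estimated head position (x, y, z) in metres
    fingertip=[0.1, 1.4, 1.6],    # estimated fingertip position
    plane_point=[0.0, 1.5, 1.5],  # a point on the virtual touchscreen plane
    plane_normal=[0.0, 0.0, 1.0], # plane normal
)
```

Mapping the resulting 3-D point to display coordinates would then reduce to expressing it in the plane's own 2-D coordinate frame and scaling to the display resolution.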