Human gestures are typical examples of non-verbal communication and help people communicate smoothly [1]. However, camera-based gesture recognition requires high processing power and suffers from recognition delays [2]. The distance between a large screen and the user can also be a problem: in pen-based interaction, for example, the user must stay close to the screen. Our main motivation is therefore to design a user interface that uses the Cookie wireless sensor [3] as an input device. In this paper we describe the interface setup and a method for extracting motion and direction from a 3D accelerometer using tilting gestures. We then propose a method that allows users to define their own tilting positions and map them to specific directions. We also describe a menu selection interface, based on a pie menu, for interacting with remote displays. Finally, we evaluate the proposed interface in terms of accuracy, selection time, and the object to which the sensor is attached.
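The tilt-to-direction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, axis conventions, the four-way direction mapping, and the 20° dead-zone threshold are all illustrative assumptions.

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (degrees) from a static 3-axis accelerometer
    reading, assuming gravity dominates (the device is held roughly still)."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def tilt_direction(ax, ay, az, threshold_deg=20.0):
    """Map a tilt to one of four pie-menu directions, or None if the
    device is near level (inside the dead zone)."""
    pitch, roll = tilt_angles(ax, ay, az)
    if max(abs(pitch), abs(roll)) < threshold_deg:
        return None  # near level: no menu direction selected
    # Pick whichever axis is tilted further from level.
    if abs(pitch) >= abs(roll):
        return "forward" if pitch > 0 else "backward"
    return "right" if roll > 0 else "left"
```

A user-defined tilting position, as proposed in the paper, could then be supported by storing a per-user reference orientation and subtracting it from the measured angles before thresholding.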
[1] D. A. Bowman et al., "Literature Survey on Interaction Techniques for Large Displays," 2006.
[2] K. Tsukada et al., "Ubi-Finger: Gesture Input Device for Mobile Use," 2002.
[3] J. Mäntyjärvi et al., "Accelerometer-based gesture control for a design environment," Personal and Ubiquitous Computing, 2006.
[4] M. Odendahl et al., "The BlueWand as interface for ubiquitous and wearable computing environments," 2003.
[5] H. Kimura et al., "CookieFlavors: easy building blocks for wireless tangible input," CHI EA '06, 2006.