Movement-based interaction in camera spaces: a conceptual framework

In this paper we present three concepts for addressing movement-based interaction using camera tracking. Drawing on our work on several movement-based projects, we present four selected applications and use them to ground our discussion and to describe our three main concepts: space, relations, and feedback. We see these concepts as central to describing and analysing movement-based systems that use camera tracking, and we show how they can be used to analyse other camera-tracking applications.