The role of physical controllers in motion video gaming

Systems that detect the unaugmented human body allow players to interact with video games without a physical controller. But how does the absence of a physical input device alter interaction? What is its impact on game performance, on players' expectations of their ability to control the game, and on their game experience? In this study, we investigate these questions in the context of a table tennis video game. The results show that the impact of holding a physical controller, or indeed of the fidelity of that controller, does not appear in simple measures of performance. Rather, the difference between controllers is a function of the responsiveness of the game being controlled, together with players' expectations, their real-world experience of the sport, and the social context of play.