TAOs - Tangible Active Objects for table-top interaction

Personal computers have found their way into almost every part of our lives, helping us work faster and better. They are used for writing texts, creating music or drawings, or simply organizing and guiding everyday tasks. Nearly all of these tasks are performed on computers operated with screen, keyboard and mouse, even though this mode of interaction can be cumbersome or even unsuitable for some tasks. Human-Computer Interaction (HCI) analyzes the way people use computers and proposes new methods of interaction. One area of this research field is called "Tangible Interaction": it uses everyday objects as tangible representations of digital data. The hope is that pulling data into the tangible real world (in contrast to the virtual world) makes it more vivid and graspable, and thereby easier to understand. These real-world representations are called Tangible User Interface Objects (TUIOs), and the systems they are used in are called Tangible User Interfaces (TUIs).

The main goal of this work is to create active objects as a new kind of TUIO. Active objects extend the concept of TUIOs in that they can be manipulated not only by the user but also by the computer. Many kinds of manipulation are possible, e.g. adding LEDs or liquid crystal displays, sound output, or tactile and haptic feedback through vibration. One of the most challenging possibilities is computer-controlled planar movement, for instance on a desk surface, which is developed in this work. The objects are constructed to be as modular as possible, leaving them open for future extensions and modifications. A software structure for coordinating the objects is implemented. Furthermore, some applications are presented to illustrate the potential of this novel technique.
