Supplementary Material for "Embodied Social Networking with Gesture-enabled Tangible Active Objects"

<img src="https://pub.uni-bielefeld.de/download/2696631/2702772" width="200" style="float:right;"> In this paper we present a novel approach to Tangible User Interfaces (TUIs) that incorporates small mobile platforms to actuate Tangible User Interface Objects (TUIOs). We propose an application of Tangible Active Objects (TAOs), combined with gestural interaction, for social networking: the TUIOs represent messages, while gestures performed with these objects trigger actions on those messages. We conducted a case study, present its results, and demonstrate interaction with a working social networking client.
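The core idea, mapping a gesture recognized on a message-bearing tangible object to an action on that message, can be sketched as a simple dispatch table. This is a minimal illustration only; the object IDs, gesture names, and action vocabulary below are invented for the example and are not taken from the paper.

```python
# Hypothetical gesture-to-action mapping for TAO-based messaging.
# Each tangible object stands for one message; a recognized gesture
# performed with the object selects the action applied to that message.
GESTURE_ACTIONS = {
    "shake": "discard",   # e.g. shaking the object discards its message
    "tap": "open",        # tapping opens the message for reading
    "push": "reply",      # pushing it toward another object starts a reply
}

def dispatch(message_id: str, gesture: str) -> str:
    """Resolve a recognized gesture on a TAO to an action on its message."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return f"ignored unknown gesture '{gesture}'"
    return f"{action} message {message_id}"

# Usage: a "shake" recognized on the object holding message "msg-42"
print(dispatch("msg-42", "shake"))  # discard message msg-42
```

A table-driven dispatcher like this keeps the gesture vocabulary easy to extend: adding a new gesture is one dictionary entry, with unknown gestures ignored rather than raising errors.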

[1] Masanori Sugimoto, et al. RoboTable: a tabletop framework for tangible interaction with robots in a mixed reality, 2009, Advances in Computer Entertainment Technology.

[2] Ken Perlin, et al. Physical objects as bidirectional user interface elements, 2004, IEEE Computer Graphics and Applications.

[3] Jan O. Borchers, et al. Madgets: actuating widgets on interactive tabletops, 2010, UIST.

[4] Hiroshi Ishii, et al. mediaBlocks: physical containers, transports, and controls for online media, 1998, SIGGRAPH.

[5] Hiroshi Ishii, et al. The actuated workbench: computer-controlled actuation in tabletop tangible interfaces, 2002, UIST '02.

[6] Mark Weiser. The Computer for the 21st Century, 1991, Scientific American.

[7] R. Bencina, et al. Improved Topological Fiducial Tracking in the reacTIVision System, 2005, IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) Workshops.

[8] Sebastian Wrede, et al. An Integration Framework for Developing Interactive Robots, 2005, PPSDR@ICRA.

[9] Jean-Claude Latombe. Robot Motion Planning, 1991, The Kluwer International Series in Engineering and Computer Science.

[10] Meredith Ringel Morris, et al. Understanding users' preferences for surface gestures, 2010, Graphics Interface.

[11] Libor Preucil, et al. European Robotics Symposium 2008, 2008.

[12] Meredith Ringel Morris, et al. User-defined gestures for surface computing, 2009, CHI.

[13] Gerrit C. van der Veer, et al. CHI '05 Extended Abstracts on Human Factors in Computing Systems, 2005, CHI 2005.

[14] Brad A. Myers, et al. Maximizing the guessability of symbolic input, 2005, CHI Extended Abstracts.

[15] Helge J. Ritter, et al. Gesture Desk - An Integrated Multi-modal Gestural Workplace for Sonification, 2003, Gesture Workshop.

[16] Sriram Subramanian, et al. Talking about tactile experiences, 2013, CHI.

[17] Hiroshi Ishii, et al. Tangible bits: towards seamless interfaces between people, bits and atoms, 1997, CHI.

[18] Antonio Camurri, et al. Gesture-Based Communication in Human-Computer Interaction, 2003, Lecture Notes in Computer Science.

[19] Eckard Riedenklau. TAOs - Tangible Active Objects for table-top interaction, 2009.

[20] Hiroshi Ishii, et al. Bricks: laying the foundations for graspable user interfaces, 1995, CHI '95.

[21] Nelson H. C. Yung, et al. Corner detector based on global and local curvature properties, 2008.