SenScreen: A Toolkit for Supporting Sensor-enabled Multi-Display Networks

Over the past few years, a number of sensors have emerged that enable gesture-based interaction with public display applications, including the Microsoft Kinect, Asus Xtion, and Leap Motion. Such sensors can make interaction with displays more attractive, particularly when applications are deployed across multiple displays and thus involve many users. However, interactive applications are still scarce, which can be attributed to the fact that developers usually need to implement a low-level connection to the sensor. In this work, we tackle this issue by presenting a toolkit, called SenScreen, consisting of (a) easy-to-install adapters that handle the low-level connection to sensors and provide the data via (b) an API that allows developers to write their applications in JavaScript. We evaluate our approach by letting two groups of developers each create an interactive game using our toolkit. Observations, interviews, and a questionnaire indicate that our toolkit simplifies the implementation of interactive applications and may hence serve as a first step towards more widespread use of interactive public displays.
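To make the described architecture concrete, the following JavaScript sketch illustrates how an application might consume gesture data through such an adapter-plus-API setup. The abstract does not specify the actual SenScreen API, so every name here (the SenScreen object, connect, onGesture, the gesture fields, and the WebSocket address) is a hypothetical assumption for illustration only.

```javascript
// Hypothetical sketch of a SenScreen-style JavaScript API.
// All identifiers (SenScreen, connect, onGesture, gesture.type, gesture.position)
// are assumptions for illustration and are not taken from the paper.

// Connect to a locally running sensor adapter that hides the low-level
// sensor protocol (e.g. Kinect skeleton tracking) behind a simple interface.
var senscreen = SenScreen.connect("ws://localhost:8080");

// React to high-level gesture events instead of raw sensor frames.
senscreen.onGesture(function (gesture) {
  if (gesture.type === "swipe-left") {
    showNextSlide();                            // application-specific handler
  } else if (gesture.type === "wave") {
    highlightNearestItem(gesture.position);     // feedback at the user's position
  }
});

// Placeholder application handlers.
function showNextSlide() { /* advance the display content */ }
function highlightNearestItem(position) { /* draw visual feedback at position */ }
```

The intent of such an API is that the adapter does the heavy lifting of talking to the sensor hardware, while the web application only subscribes to events, which matches the division of labor the abstract describes.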
