Creepy Tracker Toolkit for Context-aware Interfaces

Context-aware pervasive applications can improve user experiences by tracking people in their surroundings. Such systems use multiple sensors to gather information about people and devices. However, when developing novel user experiences, researchers are left building foundation code to support multiple network-connected sensors, a major hurdle to rapidly developing and testing new ideas. We introduce Creepy Tracker, an open-source toolkit that eases prototyping with multiple commodity depth cameras. It automatically selects the best sensor to follow each person, handling occlusions and maximizing the interaction space, while providing full-body tracking in a scalable and extensible manner. It also keeps track of the position and orientation of stationary interactive surfaces and offers continuously updated point-cloud representations of users that combine depth and color data. Our performance evaluation shows that, although slightly less precise than marker-based optical systems, Creepy Tracker provides reliable multi-joint tracking without any wearable markers or special devices. Furthermore, representative scenarios implemented with the toolkit show that it is well suited for deploying spatial and context-aware interactive experiences.
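
As a rough illustration of the per-person sensor-selection idea described in the abstract, the Python sketch below picks, for each tracked person, the depth camera whose skeleton observation has the highest mean joint confidence, so that occluded views are naturally demoted. This is a minimal sketch of one plausible heuristic, not the actual Creepy Tracker implementation; all names and data structures here are hypothetical.

```python
# Hypothetical sketch of best-sensor selection across multiple depth cameras.
# Not the actual Creepy Tracker heuristic; for illustration only.

from dataclasses import dataclass

@dataclass
class SkeletonObservation:
    sensor_id: str
    # joint name -> (x, y, z, confidence in [0, 1])
    joints: dict

def mean_confidence(obs: SkeletonObservation) -> float:
    """Average tracking confidence across all joints of one observation."""
    confidences = [c for (_, _, _, c) in obs.joints.values()]
    return sum(confidences) / len(confidences) if confidences else 0.0

def select_best_sensor(observations: list) -> SkeletonObservation:
    """Pick the observation from the sensor that currently sees the person best."""
    return max(observations, key=mean_confidence)

# Usage: two cameras observe the same person; the less-occluded view wins.
front = SkeletonObservation("kinect-front", {"head": (0.0, 1.7, 2.0, 0.95),
                                             "hand_l": (-0.3, 1.1, 2.0, 0.90)})
side = SkeletonObservation("kinect-side", {"head": (0.1, 1.7, 2.1, 0.60),
                                           "hand_l": (-0.2, 1.1, 2.1, 0.20)})
best = select_best_sensor([front, side])
print(best.sensor_id)  # -> kinect-front
```

In practice, a selection heuristic like this would run continuously as people move, with hysteresis to avoid rapid switching between sensors; the confidence-based scoring above is one simple proxy for the occlusion handling the abstract mentions.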
