Tangible tabletops for emergency response: an exploratory study

Effective handling of location-based data is important to emergency response management (ERM). Expert team members co-located around maps typically discuss events while drawing freeform areas or using physical placeholders to represent incidents. Key ERM functions include filtering data, selecting information recipients, searching datasets, drawing time-dependent freeform areas, and zooming in on one region while leaving others unchanged. Under time pressure, mouse and keyboard may be insufficient; intuitive graspable solutions, such as tangible user interfaces (TUIs), may be better suited for ERM. We present CoTracker, a tangible tabletop system designed to support ERM teamwork. On an interactive map, expert team members can discuss an operational picture using TUIs such as bricks, frames, and pens. Through cognitive walkthrough studies with domain experts, we examined how generic and specialized TUIs can support ERM-related functions. We present insights into the design of ERM-focused tangible tabletops.
