Applied Informatics

After years of dominance, classic desktop-based WIMP (Windows, Icons, Menus, Pointer) systems are slowly being replaced by modern post-WIMP systems. Such systems do not adhere to a single user interface or interaction paradigm, but instead combine a heterogeneous set of characteristics stemming from multiple fields of research. These characteristics pose a variety of challenges that designers and developers of post-WIMP systems have to face and overcome. This thesis argues that formal methods, and in particular finite-state machines, are an important means of tackling some of these challenges. To adapt finite-state machines to the requirements of post-WIMP systems and to improve their expressivity, two additions to their default notation are proposed: one allows the specification of animated transitions, the other introduces a notation for differentiating multiple input points. Although there is a consensus among many researchers and developers that finite-state machines are a rather natural formalism for specifying complex interactive systems, their implementation is not yet supported appropriately by user interface toolkits or programming languages. The main part of this thesis therefore presents a finite-state machine framework for post-WIMP interaction design, the Reactive State Machine framework. Due to its declarative nature, it greatly facilitates the transformation of a graphical state machine model into code. The framework supports all important state machine concepts, such as states and transitions. What sets it apart from similar frameworks is its full support for input events, for animated transitions, and for the multi-point notation introduced in this thesis. To demonstrate the utility and value of the Reactive State Machine framework, its application in three assorted use cases is shown.
For one of these use cases, the Facet-Streams system, a comparison is conducted between the original naive implementation, based on low-level implementation techniques, and a revised implementation based on the Reactive State Machine framework. Finally, the threshold and ceiling of the Reactive State Machine framework are assessed in a brief informal evaluation. The thesis concludes with a summary of its main contributions and an outlook on potential future work.
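The core idea of a declarative finite-state machine for interaction design can be illustrated with a minimal sketch. The snippet below is not the Reactive State Machine framework's actual API; all names (`StateMachine`, `add_transition`, the touch event strings) are hypothetical, and the example merely shows how a press-drag-release touch gesture can be specified as states and event-triggered transitions rather than as scattered callback logic:

```python
# Illustrative sketch only: a minimal declarative finite-state machine for a
# press-drag-release touch gesture. Names and events are hypothetical and do
# not reflect the Reactive State Machine framework's real API.

class StateMachine:
    def __init__(self, initial):
        self.state = initial
        self.transitions = {}  # maps (state, event) -> (target state, action)

    def add_transition(self, source, event, target, action=None):
        self.transitions[(source, event)] = (target, action)

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:  # events without a transition are ignored
            target, action = self.transitions[key]
            if action:
                action()
            self.state = target

# Declarative specification of a simple drag interaction:
sm = StateMachine("idle")
sm.add_transition("idle", "touch_down", "pressed")
sm.add_transition("pressed", "touch_move", "dragging")
sm.add_transition("dragging", "touch_move", "dragging")
sm.add_transition("pressed", "touch_up", "idle")
sm.add_transition("dragging", "touch_up", "idle")

for ev in ["touch_down", "touch_move", "touch_move", "touch_up"]:
    sm.handle(ev)
print(sm.state)  # -> "idle"
```

Because the gesture is specified as data (a transition table) rather than as control flow, a graphical state machine diagram maps onto it almost one-to-one, which is the property the thesis exploits.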
