Tailoring User Interfaces to Include Gesture-Based Interaction with gestUI

The development of custom gesture-based user interfaces requires software engineers to be skilled in the tools and languages needed to implement them. gestUI, a model-driven method, helps them acquire these skills by supporting the definition of custom gestures and the inclusion of gesture-based interaction in existing user interfaces. Until now, gestUI has used the same gesture catalogue for all software users, with gestures that could not subsequently be redefined. In this paper, we extend gestUI by including a user profile in the metamodel, which permits individual users to define custom gestures and to include gesture-based interaction in user interfaces. Using tailoring mechanisms, each user can redefine their custom gestures at runtime. Although both features are supported by models, the gestUI tool hides this technical complexity from users. We validated these features through technical action research in an industrial context. The results show that the features were perceived as both useful and easy to use for defining and redefining custom gestures and including them in a user interface.
