Gesture ontology for informing Service-oriented Architecture

We present a systems engineering approach for designing Service-oriented Architectures (SOA) for software applications that use gesture commands. The approach employs an ontology for gesture-based interaction designed on three levels: user execution, system implementation, and gesture reflection. The ontology borrows concepts from several research communities interested in gestures, such as human-computer interaction, pattern recognition, and cognitive psychology. We show how the ontology can inform the design of Service-oriented Architectures when engineering new systems and applications, and we describe a software architecture design for controlling smart homes with gesture commands.
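To make the idea concrete, the following is a minimal sketch of how a gesture-driven SOA for smart-home control might look in code. It is not the paper's implementation: the class and label names (`Gesture`, `GestureServiceRegistry`, `"swipe-up"`, `"circle"`) are hypothetical, and the sketch only mirrors two of the ontology's levels, user execution (raw trajectory) and system implementation (a recognized label bound to a service).

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class Gesture:
    # User-execution level: raw trajectory sampled from a sensor.
    trajectory: List[Tuple[float, float]]
    # System-implementation level: label assigned by a recognizer.
    label: str = ""


class GestureServiceRegistry:
    """Maps recognized gesture labels to smart-home services.

    Each service stands in for an SOA endpoint (e.g. a lighting or
    heating service); here they are plain callables for illustration.
    """

    def __init__(self) -> None:
        self._services: Dict[str, Callable[[], str]] = {}

    def register(self, label: str, service: Callable[[], str]) -> None:
        self._services[label] = service

    def dispatch(self, gesture: Gesture) -> str:
        # Route the recognized gesture to its bound service, if any.
        service = self._services.get(gesture.label)
        if service is None:
            return f"no service bound to gesture '{gesture.label}'"
        return service()


registry = GestureServiceRegistry()
registry.register("swipe-up", lambda: "lights: ON")
registry.register("circle", lambda: "thermostat: 21C")

g = Gesture(trajectory=[(0.0, 0.0), (0.0, 1.0)], label="swipe-up")
print(registry.dispatch(g))  # -> lights: ON
```

In a full system, the recognizer that assigns `label` would sit between the sensing layer and the registry, and the registry entries would resolve to actual service invocations rather than lambdas.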
