Involving users in the gestural language definition process for the NInA framework

Over the past years, many interaction devices have been introduced to support the interaction between humans and computers, such as the digital pen, touch-sensitive devices and, more recently, gesture recognition devices. The latter allows human-computer interaction using only the user's body movements, making the interaction more "natural". However, these natural user interfaces (NUI) are still based on a pre-defined set of commands and are not yet as natural as we would like. In this work, we questioned the benefits of the NUI over the mouse-and-keyboard combination. This work also presents improvements to a NUI system for Geographic Information Systems (GIS), the NInA framework. Results showed that the NUI approach demands a short learning time: after just a few interactions and instructions, the user is ready to go. Moreover, users demonstrated greater satisfaction, leading to improved productivity.