The multimodal edge of human aerobotic interaction

This paper presents the idea of multimodal human aerobotic interaction. An overview of the aerobotic system and its applications is given. The joystick-based controller interface and its limitations are discussed. Two techniques are suggested as emerging alternatives to the joystick-based controller interface used in human aerobotic interaction. The first technique is a multimodal combination of speech, gaze, gesture, and other non-verbal cues already used in regular human-human interaction. The second is telepathic interaction via brain-computer interfaces. The potential limitations of these alternatives are highlighted, and considerations for further work are presented.
