Multimodal Human Aerobotic Interaction

This chapter discusses the HCI interfaces used to control aerial robotic systems (also known as aerobots), together with the levels of autonomy at which aerobots can be controlled. Because existing models have limitations, a novel autonomy classification model designed specifically for multirotor aerial robots, called the navigation control autonomy (nCA) model, is developed. Unlike existing models such as those of the AFRL and the ONR, this model is presented in tiers and has a two-dimensional pyramidal structure. It identifies the control void that exists beyond the tier-one autonomy component modes, and it maps the upper and lower limits of the available control interfaces. Two solutions are suggested for addressing this control void and the limitations of the RC joystick controller: a multimodal, HHI-like (human-human interaction) interface and a unimodal BCI (brain-computer interface) interface. In addition, human-factors-based performance measurements are recommended, and plans for further work are presented.
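As a purely illustrative sketch of how such a tiered classification could be operationalised (the tier names, tier count, and interface ranges below are hypothetical and are not taken from the chapter), each control interface can be modelled as covering a span of autonomy tiers, and the "control void" then falls out as the set of tiers no interface reaches:

    from dataclasses import dataclass

    # Hypothetical tier labels; the actual nCA tiers are defined in the chapter.
    TIERS = {
        1: "direct manual control",
        2: "assisted / stabilised control",
        3: "supervised autonomy",
        4: "full navigation autonomy",
    }

    @dataclass
    class ControlInterface:
        name: str
        lowest_tier: int   # lowest autonomy tier the interface can address
        highest_tier: int  # highest autonomy tier the interface can address

    def control_void(interfaces: list[ControlInterface]) -> set[int]:
        """Return the tiers not reachable by any interface: the control void."""
        covered: set[int] = set()
        for iface in interfaces:
            covered.update(range(iface.lowest_tier, iface.highest_tier + 1))
        return set(TIERS) - covered

    # Example: an RC joystick spanning only the lowest tiers leaves the
    # upper tiers uncovered, exposing the void the chapter describes.
    rc_joystick = ControlInterface("RC joystick", lowest_tier=1, highest_tier=2)
    print(control_void([rc_joystick]))  # -> {3, 4}

Mapping each interface to a lower and upper tier limit mirrors the model's stated purpose of bounding what existing controllers (such as the RC joystick) can and cannot reach; the gap computation is one plausible way to make that boundary explicit.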
