Implicit Intention Communication in Human–Robot Interaction Through Visual Behavior Studies
[1] B. Goldwater. Psychological significance of pupillary movements, 1972, Psychological Bulletin.
[2] Dave M. Stampe, et al. Heuristic filtering and reliable calibration methods for video-based pupil-tracking systems, 1993.
[3] Arne John Glenstrup, et al. Eye Controlled Media: Present and Future State, 1995.
[4] Jitendra Malik, et al. Normalized cuts and image segmentation, 1997, Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
[5] David G. Lowe, et al. Object recognition from local scale-invariant features, 1999, Proceedings of the Seventh IEEE International Conference on Computer Vision.
[6] Joseph H. Goldberg, et al. Identifying fixations and saccades in eye-tracking protocols, 2000, ETRA.
[7] Carlos Hitoshi Morimoto, et al. Pupil detection and tracking using multiple light sources, 2000, Image Vis. Comput.
[8] Terrence Fong, et al. Collaboration, Dialogue, Human-Robot Interaction, 2001, ISRR.
[9] Myung Jin Chung, et al. A human-robot interface using vision-based eye gaze estimation system, 2002, IEEE/RSJ International Conference on Intelligent Robots and Systems.
[10] G. Ballantyne. Robotic surgery, telerobotic surgery, telepresence, and telementoring, 2002, Surgical Endoscopy and Other Interventional Techniques.
[11] Manuel Mazo, et al. Electro-Oculographic Guidance of a Wheelchair Using Eye Movements Codification, 2003, Int. J. Robotics Res.
[12] Jean Scholtz, et al. Human-robot interaction: development of an evaluation methodology for the bystander role of interaction, 2003, SMC'03 Conference Proceedings, 2003 IEEE International Conference on Systems, Man and Cybernetics.
[13] Alexander Zelinsky, et al. Intuitive Human-Robot Interaction Through Active 3D Gaze Tracking, 2003, ISRR.
[14] Monica N. Nicolescu, et al. Natural methods for robot task learning: instructive demonstrations, generalization and practice, 2003, AAMAS '03.
[15] Ted Selker, et al. Visual Attentive Interfaces, 2004.
[16] Nir Friedman, et al. Bayesian Network Classifiers, 1997, Machine Learning.
[17] Zhiwei Zhu, et al. Eye and gaze tracking for interactive graphic display, 2002, SMARTGRAPH '02.
[18] Carlos Hitoshi Morimoto, et al. Eye gaze tracking techniques for interactive applications, 2005, Comput. Vis. Image Underst.
[19] Richard Wright, et al. The Vocal Joystick: A Voice-Based Human-Computer Interface for Individuals with Motor Impairments, 2005, HLT.
[20] Satoshi Kagami, et al. Motion Control System that Realizes Physical Interaction between Robot's Hands and Environment during Walk, 2006, 2006 6th IEEE-RAS International Conference on Humanoid Robots.
[21] Chern-Sheng Lin, et al. Powered Wheelchair Controlled by Eye-Tracking System, 2006.
[22] Kenji Kawashima, et al. Development of a Master Slave System with Force Sensing Using Pneumatic Servo System for Laparoscopic Surgery, 2007, Proceedings 2007 IEEE International Conference on Robotics and Automation.
[23] Vincenzo Lippiello, et al. Human-robot interaction control using force and vision, 2007.
[24] Ahmad Lotfi, et al. Remote control of mobile robots through human eye gaze: the design and evaluation of an interface, 2008, Security + Defence.
[25] Armando Barreto, et al. Integrated electromyogram and eye-gaze tracking cursor control system for computer users with motor disabilities, 2008, Journal of Rehabilitation Research and Development.
[26] Advait Jain, et al. A clickable world: Behavior selection through pointing and context for mobile manipulation, 2008, 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[27] Desney S. Tan, et al. Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces, 2008, CHI.
[28] Luís Paulo Reis, et al. IntellWheels MMI: A Flexible Interface for an Intelligent Wheelchair, 2009, RoboCup.
[29] Jeff A. Bilmes, et al. The VoiceBot: a voice controlled robot arm, 2009, CHI.
[30] Brett Browning, et al. A survey of robot learning from demonstration, 2009, Robotics Auton. Syst.
[31] John Paulin Hansen, et al. Gaze-controlled driving, 2009, CHI Extended Abstracts.
[32] J. Gilbert. The EndoAssist robotic camera holder as an aid to the introduction of laparoscopic colorectal surgery, 2009, Annals of the Royal College of Surgeons of England.
[33] Scott T. Grafton, et al. Spatio-Temporal Dynamics of Human Intention Understanding in Temporo-Parietal Cortex: A Combined EEG/fMRI Repetition Suppression Paradigm, 2009, PLoS ONE.
[34] F. Jatene, et al. Robotic versus human camera holding in video-assisted thoracic sympathectomy: a single blind randomized trial of efficacy and safety, 2008, Interactive Cardiovascular and Thoracic Surgery.
[35] Sharda A. Chhabria, et al. Eye Motion Tracking for Wheelchair Control, 2010.
[36] Qiang Ji, et al. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze, 2010, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[37] Chih-Hung King, et al. Towards an assistive robot that autonomously performs bed baths for patient hygiene, 2010, 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[38] Desney S. Tan, et al. Making muscle-computer interfaces more practical, 2010, CHI.
[39] Jason Weston, et al. A user's guide to support vector machines, 2010, Methods in Molecular Biology.
[40] R. Chavarriaga, et al. Learning From EEG Error-Related Potentials in Noninvasive Brain-Computer Interfaces, 2010, IEEE Transactions on Neural Systems and Rehabilitation Engineering.
[41] Andrea Vedaldi, et al. Vlfeat: an open and portable library of computer vision algorithms, 2010, ACM Multimedia.
[42] Guang-Zhong Yang, et al. Gaze contingent control for an articulated mechatronic laparoscope, 2010, 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics.
[43] J. Stolzenburg, et al. Comparison of the FreeHand® robotic camera holder with human assistants during endoscopic extraperitoneal radical prostatectomy, 2011, BJU International.
[44] U. Castiello, et al. Cues to intention: The role of movement information, 2011, Cognition.
[45] Ronnie Cann, et al. Incrementality and intention-recognition in utterance processing, 2011, Dialogue Discourse.
[46] Csaba Antonya, et al. Attentive User Interface for Interaction within Virtual Reality Environments Based on Gaze Analysis, 2011, HCI.
[47] Emilio Frazzoli, et al. Intention-Aware Motion Planning, 2013, WAFR.
[48] Guillaume Doisy. Sensorless collision detection and control by physical interaction for wheeled mobile robots, 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[49] Kees M. van Hee, et al. Tele-operated service robots for household and care, 2012.
[50] Christopher C. Cummins, et al. A model of intentional communication: AIRBUS (Asymmetric Intention Recognition with Bayesian Updating of Signals), 2012.
[51] A. Knoll, et al. Human-computer interfaces for interaction with surgical tools in robotic surgery, 2012, 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob).
[52] Minho Lee, et al. Probing of human implicit intent based on eye movement and pupillary analysis for augmented cognition, 2013, Int. J. Imaging Syst. Technol.
[53] Minho Lee, et al. Intention Recognition and Object Recommendation System using Deep Auto-encoder Based Affordance Model, 2013.
[54] Yu Wang, et al. Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation, 2014.
[55] Minho Kim, et al. Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking, 2014, Comput. Biol. Medicine.
[56] Jan-Louis Kruger, et al. Attention distribution and cognitive load in a subtitled academic lecture: L1 vs. L2, 2014.
[57] Minho Lee, et al. Human intention recognition based on eyeball movement pattern and pupil size variation, 2014, Neurocomputing.
[58] Songpo Li, et al. Implicit human intention inference through gaze cues for people with limited motion ability, 2014, 2014 IEEE International Conference on Mechatronics and Automation.
[59] Songpo Li, et al. Attention-Aware Robotic Laparoscope Based on Fuzzy Interpretation of Eye-Gaze Patterns, 2015.