Combining gaze and AI planning for online human intention recognition
Liz Sonenberg | Tim Miller | Frank Vetere | Eduardo Velloso | Ronal Singh | Joshua Newn