Point-and-Command Paradigm for Interaction with Assistive Robots
[1] Doru Talaba, et al. Learning new skills by a humanoid robot through imitation, 2013, 2013 IEEE 14th International Symposium on Computational Intelligence and Informatics (CINTI).
[2] Simon J. Julier, et al. Towards a situated, multimodal interface for multiple UAV control, 2010, 2010 IEEE International Conference on Robotics and Automation.
[3] Milad Alemzadeh, et al. Human-Computer Interaction: Overview on State of the Art, 2008.
[4] Allison Sauppé, et al. Robot Deictics: How Gesture and Context Shape Referential Communication, 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[5] Jochen Triesch, et al. A gesture interface for human-robot-interaction, 1998, Proceedings Third IEEE International Conference on Automatic Face and Gesture Recognition.
[6] Min Jiang, et al. A developmental approach to robotic pointing via human-robot interaction, 2014, Inf. Sci.
[7] Doru Talaba, et al. P300-Based Brain-Neuronal Computer Interaction for Spelling Applications, 2013, IEEE Transactions on Biomedical Engineering.
[8] Brian Scassellati, et al. Investigating models of social development using a humanoid robot, 2003, Proceedings of the International Joint Conference on Neural Networks, 2003.
[9] Takayuki Kanda, et al. Design patterns for sociality in human-robot interaction, 2008, 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[10] Tapio Heikkilä, et al. Designing Autonomous Robot Systems - Evaluation of the R3-COP Decision Support System Approach, 2013, DECS@SAFECOMP.
[11] Glenn Taylor, et al. Behavior Design Patterns: Engineering Human Behavior Models, 2004.
[12] Silviu Butnariu, et al. The Command of a Virtual Industrial Robot Using a Dedicated Haptic Interface, 2013.
[13] Illah R. Nourbakhsh, et al. Planning for Human–Robot Interaction in Socially Situated Tasks, 2013, Int. J. Soc. Robotics.
[14] Philip Chan, et al. Toward accurate dynamic time warping in linear time and space, 2007, Intell. Data Anal.
[15] Matthew Turk, et al. Multimodal interaction: A review, 2014, Pattern Recognit. Lett.
[16] Gerhard K. Kraetzschmar, et al. Precise Pointing Target Recognition for Human-Robot Interaction, 2010.
[17] Wolff-Michael Roth. Gestures: Their Role in Teaching and Learning, 2001.
[18] Allison Sauppé, et al. Design patterns for exploring and prototyping human-robot interactions, 2014, CHI.
[19] C. Creider. Hand and Mind: What Gestures Reveal about Thought, 1994.
[20] Emer Gilmartin, et al. Multimodal conversational interaction with a humanoid robot, 2012, 2012 IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom).
[21] Frédéric Kaplan, et al. Learning to Interpret Pointing Gestures: Experiments with Four-Legged Autonomous Robots, 2005, Biomimetic Neural Learning for Intelligent Robots.
[22] Takayuki Kanda, et al. Natural deictic communication with humanoid robots, 2007, 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[23] Richard A. Bolt, et al. "Put-that-there": Voice and gesture at the graphics interface, 1980, SIGGRAPH '80.
[24] Horst-Michael Groß, et al. An approach to multi-modal human-machine interaction for intelligent service robots, 2003, Robotics Auton. Syst.
[25] F. T. Romero, et al. Speech-Based Human and Service Robot Interaction: An Application for Mexican Dysarthric People, 2013.
[26] Stefan Wermter, et al. Real-world reinforcement learning for autonomous humanoid robot docking, 2012, Robotics Auton. Syst.
[27] H. Yanco, et al. Automation as Caregiver: A Survey of Issues and Technologies, 2003.
[28] Ingo Lütkebohle, et al. Where is this? - Gesture based multimodal interaction with an anthropomorphic robot, 2008, Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots.
[29] Susan Goldin-Meadow, et al. Language and Gesture: Gesture and the transition from one- to two-word speech: when hand and mouth come together, 2000.
[30] Alexander Ferrein, et al. Caesar: an intelligent domestic service robot, 2012, Intell. Serv. Robotics.
[31] S. Goldin-Meadow, et al. Pointing sets the stage for learning language - and creating language, 2007, Child Development.
[32] Philippe Bidaud, et al. Human activity analysis: a personal robot integrating a framework for robust person detection and tracking and physical based motion analysis, 2013, Paladyn J. Behav. Robotics.
[33] A. Corradini, et al. Dynamic time warping for off-line recognition of a small gesture vocabulary, 2001, Proceedings IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems.
[34] Jaime Valls Miró, et al. Dynamic Bayesian Networks for Learning Interactions between Assistive Robotic Walker and Human Users, 2010, KI.
[35] Martin Buss, et al. Information retrieval system for human-robot communication - Asking for directions, 2009, 2009 IEEE International Conference on Robotics and Automation.
[36] James Rossiter, et al. Multimodal intent recognition for natural human-robotic interaction, 2011.
[37] Sotaro Kita, et al. Pointing: A foundational building block in human communication, 2003.
[38] Nicu Sebe, et al. Multimodal Human Computer Interaction: A Survey, 2005, ICCV-HCI.
[39] R. V. Dubey, et al. Integration of an Intelligent Decision Support System and a Robotic Haptic Device for Eye-Hand Coordination Therapy, 2007, 2007 IEEE 10th International Conference on Rehabilitation Robotics.
[40] Ella M. Atkins, et al. Human Intent Prediction Using Markov Decision Processes, 2012, J. Aerosp. Inf. Syst.
[41] Ludwig Wittgenstein. The Blue and Brown Books: Preliminary Studies for the 'Philosophical Investigations', 1980.
[42] S. Kita. Pointing: Where language, culture, and cognition meet, 2003.
[43] mc schraefel, et al. A Taxonomy of Gestures in Human Computer Interactions, 2005.
[44] Yukie Nagai, et al. Learning to comprehend deictic gestures in robots and human infants, 2005, ROMAN 2005, IEEE International Workshop on Robot and Human Interactive Communication.
[45] Hui Wang, et al. Audio-Visual Tibetan Speech Recognition Based on a Deep Dynamic Bayesian Network for Natural Human Robot Interaction, 2012.
[46] Sungyoung Lee, et al. Two-stage Hidden Markov Model in Gesture Recognition for Human Robot Interaction, 2012.
[47] Mao-Jiun J. Wang, et al. A decision support system for robot selection, 1991, Decis. Support Syst.
[48] Sergio Escalera, et al. Probability-based Dynamic Time Warping and Bag-of-Visual-and-Depth-Words for Human Gesture Recognition in RGB-D, 2014, Pattern Recognit. Lett.
[49] Alexei Makarenko, et al. Human-robot communication for collaborative decision making - A probabilistic approach, 2010, Robotics Auton. Syst.
[50] Cristina Colonnesi, et al. The relation between pointing and language development: A meta-analysis, 2010.
[51] Yusuf Tansel İç, et al. Development of a decision support system for robot selection, 2013.
[52] Jörg Stückler, et al. Learning to interpret pointing gestures with a time-of-flight camera, 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[53] Magnus Egerstedt, et al. Executive decision support, 2009, IEEE Robotics & Automation Magazine.
[54] M. Matarić, et al. Human-Robot Interaction, 2009.
[55] Cengiz Kahraman, et al. Developing a group decision support system based on fuzzy information axiom, 2010, Knowl. Based Syst.