Human-Computer Interaction: Overview on State of the Art
Fakhreddine Karray | Milad Alemzadeh | Jamil Abou Saleh | Mo Nours Arab
[1] Jake K. Aggarwal, et al. Human Motion Analysis: A Review, 1999, Comput. Vis. Image Underst.
[2] John Vince, et al. Introduction to Virtual Reality, 2004, Springer London.
[3] Alex Pentland, et al. Human computing and machine understanding of human behavior: a survey, 2006, ICMI '06.
[4] Oliver Brock, et al. Human-Centered Robotics and Interactive Haptic Simulation, 2001, ISRR.
[5] Andrew T. Duchowski, et al. A breadth-first survey of eye-tracking applications, 2002, Behavior Research Methods, Instruments, & Computers.
[6] Rajeev Sharma, et al. Understanding Gestures in Multimodal Human Computer Interaction, 2000, Int. J. Artif. Intell. Tools.
[7] Keith Duncan, et al. Cognitive Engineering, 2017, Encyclopedia of GIS.
[8] Ying Wu, et al. Vision-Based Gesture Recognition: A Review, 1999, Gesture Workshop.
[9] Zhigang Deng, et al. Analysis of emotion recognition using facial expressions, speech and multimodal information, 2004, ICMI '04.
[10] Clare-Marie Karat, et al. Conversational interface technologies, 2002.
[11] Brad A. Myers, et al. A brief history of human-computer interaction technology, 1998, INTR.
[12] Michael J. Lyons, et al. Designing, Playing, and Performing with a Vision-based Mouth Interface, 2003, NIME.
[13] Sharon L. Oviatt, et al. Unification-based Multimodal Integration, 1997, ACL.
[14] Yasmine Arafa, et al. Building Multi-modal Personal Sales Agents as Interfaces to E-commerce Applications, 2001, Active Media Technology.
[15] Ben Shneiderman, et al. Designing the User Interface: Strategies for Effective Human-Computer Interaction, 1998.
[16] Gregory D. Abowd, et al. Perceptual user interfaces using vision-based eye tracking, 2003, ICMI '03.
[17] Ben Shneiderman, et al. Designing the User Interface: Strategies for Effective Human-Computer Interaction (4th Edition), 2004.
[18] Kosuke Sato, et al. Real-time gesture recognition by learning and selective control of visual interest points, 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[19] Jakob Nielsen, et al. Usability engineering, 1997, The Computer Science and Engineering Handbook.
[20] Dariu Gavrila, et al. The Visual Analysis of Human Movement: A Survey, 1999, Comput. Vis. Image Underst.
[21] Stuart Ferguson, et al. A hitchhiker's guide to virtual reality, 2007.
[22] Alisa Rudnitskaya, et al. Electronic tongue for quality assessment of ethanol, vodka and eau-de-vie, 2005.
[23] J. P. Campbell, Jr. Speaker recognition: a tutorial, 1997, Proc. IEEE.
[24] Andry Rakotonirainy, et al. A Survey of Research on Context-Aware Homes, 2003, ACSW.
[25] Richard A. Bolt, et al. "Put-that-there": Voice and gesture at the graphics interface, 1980, SIGGRAPH '80.
[26] Rainer Stiefelhagen, et al. Implementation and evaluation of a constraint-based multimodal fusion system for speech and 3D pointing gestures, 2004, ICMI '04.
[27] Biing-Hwang Juang, et al. Fundamentals of speech recognition, 1993, Prentice Hall Signal Processing Series.
[28] Katie Salen, et al. Rules of Play: Game Design Fundamentals, 2003.
[29] Nicu Sebe, et al. Multimodal Human Computer Interaction: A Survey, 2005, ICCV-HCI.
[30] Robert J. K. Jacob, et al. Evaluation of eye gaze interaction, 2000, CHI.
[31] Rosalind W. Picard. Affective Computing, 1997.
[32] Lawrence S. Chen, et al. Joint processing of audio-visual information for the recognition of emotional expressions in human-computer interaction, 2000.
[33] Charles S. Wasson. System Analysis, Design, and Development: Concepts, Principles, and Practices (Wiley Series in Systems Engineering and Management), 2005.
[34] Wolfgang Wahlster, et al. Readings in Intelligent User Interfaces, 1998.
[35] Woodrow Barfield, et al. Fundamentals of Wearable Computers and Augmented Reality, 2000.
[36] Sharon L. Oviatt, et al. Designing the User Interface for Multimodal Speech and Pen-Based Gesture Applications: State-of-the-Art Systems and Future Research Directions, 2000, Hum. Comput. Interact.
[37] Michel Daoud Yacoub. Wireless Technology: Protocols, Standards, and Techniques, 2001.
[38] Sylvie Gibet, et al. Gesture-Based Communication in Human-Computer Interaction, 2001, Lecture Notes in Computer Science.
[39] Alex Kirlik, et al. Adaptive Perspectives on Human-Technology Interaction: Methods and Models for Cognitive Engineering and Human-Computer Interaction (Human-Technology Interaction), 2006.
[40] Pierre-Yves Oudeyer, et al. The production and recognition of emotions in speech: features and algorithms, 2003.
[41] T. L. Williams, et al. Applications of Thermal Imaging, 1988.
[42] Mubarak Shah, et al. Determining driver visual attention with one camera, 2003, IEEE Trans. Intell. Transp. Syst.
[43] Andrey Ronzhin, et al. Assistive multimodal system based on speech recognition and head tracking, 2005, 13th European Signal Processing Conference.
[44] Samy Bengio, et al. Automatic analysis of multimodal group actions in meetings, 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[45] Nicu Sebe, et al. Facial expression recognition from video sequences: temporal and static modeling, 2003, Comput. Vis. Image Underst.
[46] Pierre-Yves Oudeyer, et al. The production and recognition of emotions in speech: features and algorithms, 2003, Int. J. Hum. Comput. Stud.
[47] I. Poggi, et al. Perception of non-verbal emotional listener feedback, 2006.
[48] Michael Johnston, et al. MATCHkiosk: A Multimodal Interactive City Guide, 2004, ACL.
[49] Beat Fasel, et al. Automatic Facial Expression Analysis: A Survey, 1999.
[50] Ben Shneiderman, et al. Designing the User Interface: Strategies for Effective Human-Computer Interaction, 3rd Edition, 1997.
[51] Stephen Brewster, et al. Nonspeech auditory output, 2002.
[52] Sharon Oviatt, et al. Multimodal Interfaces, 2008, Encyclopedia of Multimedia.
[53] Alphonse Chapanis, et al. Man-machine engineering, 1965.
[54] Ashish Kapoor, et al. Automatic prediction of frustration, 2007, Int. J. Hum. Comput. Stud.
[55] Michelle X. Zhou, et al. A probabilistic approach to reference resolution in multimodal user interfaces, 2004, IUI '04.
[56] Hiroo Iwata, et al. Haptic interfaces, 2002.
[57] Hatice Gunes, et al. Bi-modal emotion recognition from expressive face and body gestures, 2007, J. Netw. Comput. Appl.
[58] Maja Pantic, et al. Automatic Analysis of Facial Expressions: The State of the Art, 2000, IEEE Trans. Pattern Anal. Mach. Intell.
[59] E. A. Bretz. When work is fun and games, 2002, IEEE Spectrum.
[60] Dov Te'eni, et al. Human-Computer Interaction: Developing Effective Organizational Information Systems, 2006.
[61] Munindar P. Singh, et al. Readings in agents, 1997.
[62] Gabriel Robles-De-La-Torre, et al. The importance of the sense of touch in virtual and real environments, 2006, IEEE MultiMedia.
[63] Magdalena D. Bugajska, et al. Building a Multimodal Human-Robot Interface, 2001, IEEE Intell. Syst.
[64] Vincent Hayward, et al. Haptic interfaces and devices, 2004.