Senses, Perception, and Natural Human‐Interfaces for Interactive Displays

[1] Takatoshi Tsujimura, et al. OLED displays: fundamentals and applications, 2012.

[2] Jie Zhang, et al. Three-dimensional interaction and autostereoscopic display system using gesture recognition, 2013.

[3] Narendra Ahuja, et al. Detecting Faces in Images: A Survey, 2002, IEEE Trans. Pattern Anal. Mach. Intell.

[4] Desney S. Tan, et al. Brain-Computer Interfaces and Human-Computer Interaction, 2010, Brain-Computer Interfaces.

[5] David M. Hoffman, et al. Vergence-accommodation conflicts hinder visual performance and cause visual fatigue, 2008, Journal of Vision.

[6] Alan Wexelblat, et al. An approach to natural gesture in virtual environments, 1995, TCHI.

[7] Lucas Spierer, et al. Multisensory Integration: What You See Is Where You Hear, 2011, Current Biology.

[8] Marja Salmimaa, et al. Interaction with an autostereoscopic touch screen: effect of occlusion on subjective experiences when pointing to targets in planes of different depths, 2012.

[9] Douglas C. Engelbart, et al. Display-Selection Techniques for Text Manipulation, 1967.

[10] H. McGurk, et al. Hearing lips and seeing voices, 1976, Nature.

[11] Harvey Fletcher, et al. The nature of speech and its interpretation, 1922.

[12] John R. Paterson, et al. Modern optics in exceptionally preserved eyes of Early Cambrian arthropods from Australia, 2011, Nature.

[13] Achintya K. Bhowmik. Natural and Intuitive User Interfaces with Perceptual Computing Technologies, 2013.

[14] A. Parker. On the origin of optics, 2011.

[15] Anand Vardhan Bhalla, et al. Comparative Study of Various Touchscreen Technologies, 2010.

[16] Achintya K. Bhowmik, et al. Liquid-Crystal Technology Advances toward Future “True” 3-D Flat-Panel Displays, 2011.

[17] Rashid Ansari, et al. Multimodal human discourse: gesture and speech, 2002, TCHI.

[18] T. Freeman, et al. Motion perception during sinusoidal smooth pursuit eye movements: signal latencies and non-linearities, 2008, Journal of Vision.

[19] E. A. Johnson, et al. Touch display--a novel input/output device for computers, 1965.

[20] Beat Fasel, et al. Automatic Facial Expression Analysis: A Survey, 1999.

[21] Hsi-Jian Lee, et al. Determination of 3D human body postures from a single view, 1985, Comput. Vis. Graph. Image Process.

[22] Ling Shao, et al. Enhanced Computer Vision With Microsoft Kinect Sensor: A Review, 2013, IEEE Transactions on Cybernetics.

[23] E. A. Johnson. Touch Displays: A Programmed Man-Machine Interface, 1967.

[24] G. McCarthy, et al. Neural basis of eye gaze processing deficits in autism, 2005, Brain: A Journal of Neurology.

[25] Vladimir Pavlovic, et al. Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review, 1997, IEEE Trans. Pattern Anal. Mach. Intell.

[26] F. Volkmar, et al. The enactive mind, or from actions to cognition: lessons from autism, 2003, Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences.

[27] S. Mitra, et al. Gesture Recognition: A Survey, 2007, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).

[28] James A. Russell, et al. Do facial expressions signal specific emotions? Judging emotion from the face in context, 1996.