Elvis: situated speech and gesture understanding for a robotic chandelier
[1] D. McNeill. Hand and Mind , 1995 .
[2] Antonella De Angeli,et al. Integration and Synchronization of Input Modes during Multimodal Human-Computer Interaction , 1997, CHI.
[3] Paul Lamere,et al. Design of the CMU Sphinx-4 Decoder , 2003, EUROSPEECH.
[4] Josef Kittler,et al. Histogram-based segmentation in a perceptually uniform color space , 1998, IEEE Trans. Image Process..
[5] Rajeev Sharma,et al. Understanding Gestures in Multimodal Human Computer Interaction , 2000, Int. J. Artif. Intell. Tools.
[6] Rajeev Sharma,et al. Exploiting speech/gesture co-occurrence for improving continuous gesture recognition in weather narration , 2000, Proceedings Fourth IEEE International Conference on Automatic Face and Gesture Recognition (Cat. No. PR00580).
[7] Philip R. Cohen,et al. Synergistic use of direct manipulation and natural language , 1989, CHI '89.
[8] Sotaro Kita,et al. Movement Phase in Signs and Co-Speech Gestures, and Their Transcriptions by Human Coders , 1997, Gesture Workshop.
[9] M Kuperstein,et al. Neural model of adaptive hand-eye coordination for single postures. , 1988, Science.