Multimodal Interaction in Distributed and Ubiquitous Computing