Acoustic and visual signal based context awareness system for mobile application

In this paper, an acoustic and visual signal based context awareness system is proposed for mobile applications. The proposed system senses and determines, in real time, the user's contextual information, such as where the user is or what the user is doing, by processing signals from the microphone and camera embedded in the mobile device. An initial implementation of the algorithms on a smartphone demonstrated the effectiveness of the proposed system.
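The paper does not specify its audio features, but acoustic context classification of this kind is commonly built on simple frame-level features such as short-time energy and zero-crossing rate. The sketch below is a hypothetical illustration, not the paper's actual method: it extracts those two features and applies a toy energy threshold to label an audio clip as a "quiet" or "noisy" environment. The function names and the threshold value are assumptions chosen for the example.

```python
import numpy as np

def frame_features(signal, frame_len=400, hop=200):
    """Per-frame short-time energy and zero-crossing rate (illustrative features)."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = float(np.mean(frame ** 2))
        # ZCR: fraction of adjacent samples whose signs differ.
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2)
        feats.append((energy, zcr))
    return np.array(feats)

def classify_context(signal, energy_thresh=0.05):
    """Toy rule (assumption, not the paper's classifier): high mean
    short-time energy -> 'noisy' environment, else 'quiet'."""
    feats = frame_features(signal)
    return "noisy" if feats[:, 0].mean() > energy_thresh else "quiet"

# Synthetic stand-ins for microphone input: low- vs high-amplitude noise.
rng = np.random.default_rng(0)
quiet = 0.01 * rng.standard_normal(8000)   # ~quiet indoor scene
noisy = 0.5 * rng.standard_normal(8000)    # ~busy street scene
print(classify_context(quiet))  # quiet
print(classify_context(noisy))  # noisy
```

A real system would replace the threshold rule with a trained classifier (e.g., over MFCC features) and fuse the result with visual cues from the camera, as the paper proposes.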
