Multimodal Diaries

Time management is an important aspect of a successful professional life. To better understand where our time goes, we propose a system that summarizes the user's daily activity (e.g., sleeping, walking, working on the PC, talking) from all-day multimodal data recordings. Two main novelties are proposed: (i) a system that combines physical- and contextual-awareness hardware and software, recording synchronized audio, video, body-sensor, GPS, and computer-monitoring data; and (ii) a semi-supervised temporal clustering (SSTC) algorithm that accurately and efficiently groups large amounts of multimodal data into different activities. The effectiveness and accuracy of SSTC are demonstrated on synthetic and real examples of activity segmentation from multimodal data gathered over long periods of time.
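To make the idea of semi-supervised temporal clustering concrete, the sketch below shows a minimal stand-in (not the paper's actual SSTC algorithm): a few labeled "seed" frames fix one centroid per activity, every frame is assigned to its nearest centroid, and a sliding-window majority vote smooths the labels into temporally contiguous activity segments. The function name `sstc_sketch`, the seed format, and the toy feature stream are all illustrative assumptions.

```python
import numpy as np

def sstc_sketch(features, seeds, smooth=5):
    """Semi-supervised temporal segmentation sketch (assumed, simplified).

    features : (T, d) array of per-frame multimodal features.
    seeds    : dict mapping activity name -> list of labeled frame indices.
    smooth   : window size for the temporal majority vote.
    """
    # Seed one centroid per activity from the few labeled frames.
    centroids = {a: features[idx].mean(axis=0) for a, idx in seeds.items()}
    acts = sorted(centroids)
    C = np.stack([centroids[a] for a in acts])            # (k, d)

    # Nearest-centroid assignment for every frame.
    dist = ((features[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    labels = dist.argmin(axis=1)

    # Temporal smoothing: sliding-window majority vote favors
    # contiguous segments over frame-level label flicker.
    half = smooth // 2
    out = labels.copy()
    for t in range(len(labels)):
        window = labels[max(0, t - half): t + half + 1]
        out[t] = np.bincount(window).argmax()
    return [acts[i] for i in out]

# Toy stream: 20 'sleep' frames near 0, then 20 'walk' frames near 1.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 0.1, (20, 3)),
                    rng.normal(1.0, 0.1, (20, 3))])
seeds = {"sleep": [0, 1], "walk": [25, 26]}
segments = sstc_sketch(X, seeds)
```

A handful of labeled frames per activity is often enough to anchor the clusters, which is the practical appeal of the semi-supervised setting for long recordings where exhaustive annotation is infeasible.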