Recognizing context for annotating a live life recording

In the near future, it will be possible to continuously record and store the entire audio–visual lifetime of a person, together with all digital information that the person perceives or creates. While storing this data will soon be feasible, retrieving from and indexing into such large data sets remain unsolved challenges. Since today's retrieval cues seem insufficient, we argue that additional cues obtained from body-worn sensors make associative retrieval by humans possible. We present three approaches to creating such cues, each with an experimental evaluation: the user's physical activity from acceleration sensors, the user's social environment from audio sensors, and the user's interruptibility from multiple sensors.
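To make the first cue type concrete, the sketch below illustrates one plausible way to derive coarse physical-activity labels from a body-worn 3-axis accelerometer: sliding-window statistics (mean and variance per axis) fed to a simple nearest-centroid classifier, with each predicted label stored as a time-stamped annotation of the recording. The window length, feature set, class names, and classifier are illustrative assumptions for this sketch, not the paper's actual pipeline.

```python
# Hypothetical sketch: activity cues from 3-axis accelerometer data.
# Assumptions (not from the paper): 64-sample windows with 50% overlap,
# mean/variance features, and a nearest-centroid classifier.

import numpy as np

WINDOW = 64   # samples per analysis window (assumed)
STEP = 32     # hop size, i.e. 50% overlap (assumed)

def window_features(acc):
    """Per-window mean and variance features from an (N, 3) accelerometer array."""
    feats = []
    for start in range(0, len(acc) - WINDOW + 1, STEP):
        w = acc[start:start + WINDOW]
        feats.append(np.concatenate([w.mean(axis=0), w.var(axis=0)]))
    return np.array(feats)

class NearestCentroid:
    """Tiny stand-in classifier: assign each window to the closest class centroid."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in data: low-variance "sitting" vs. high-variance "walking".
    sitting = rng.normal(0.0, 0.05, size=(1024, 3))
    walking = rng.normal(0.0, 0.8, size=(1024, 3))

    Xs, Xw = window_features(sitting), window_features(walking)
    X_train = np.vstack([Xs, Xw])
    y_train = ["sitting"] * len(Xs) + ["walking"] * len(Xw)

    clf = NearestCentroid().fit(X_train, y_train)
    # Each predicted label would be stored as a time-stamped retrieval cue
    # alongside the audio-visual life recording.
    print(clf.predict(window_features(rng.normal(0.0, 0.7, size=(256, 3))))[:4])
```

Analogous sketches would apply to the other two cue types, with audio features (e.g. spectral statistics) standing in for the acceleration features when estimating the social environment or interruptibility.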
