This chapter introduces video data management techniques for the computational augmentation of human memory, i.e., augmented memory, on wearable and ubiquitous computers used in everyday life. The ultimate goal of augmented memory is to enable users to conduct themselves using human memories and multimedia data seamlessly, anywhere and anytime. In particular, video taken from the user's viewpoint is one of the most important triggers for recalling past events that the user has experienced. We believe that designing an augmented memory system is a practical issue for real-world-oriented video data management. This chapter also describes a framework for an augmented memory albuming system named the Sceneful Augmented Remembrance Album (SARA). Within the SARA framework, we have developed three modules for retrieving, editing, transporting, and exchanging augmented memory. The Residual Memory module and the I'm Here! module enable a wearer to retrieve video data that he/she wants to recall in the real world. The Ubiquitous Memories module is proposed for editing, transporting, and exchanging video data via real-world objects. Lastly, we discuss future work on the proposed framework and modules.

INTRODUCTION

In this chapter, we introduce our study of wearable and ubiquitous video data management for the computational augmentation of human memory in everyday life. Scientific psychological analysis and engineering technologies for memory aid have been studied extensively in recent years. Psychological results show that the human brain can make mistakes in the encoding, storage, or retrieval process (Brewer & Treyens, 1981; Nickerson & Adams, 1979). The technology of computational augmentation of human memory aims to integrate computationally recorded multimedia data, named "augmented memory" (Rhodes, 1995, 1997), into human memory. The ultimate goal of augmented memory is to enable users to conduct themselves using these memories seamlessly, anywhere and anytime in their everyday life. In particular, video data provides the user with strong stimuli for recalling past events that he or she has experienced. Our research consists of several studies toward developing a video-based augmented memory system.

In the field of computational memory aid, several representative works on wearable computers exist. Mann (1997), for example, described a user who wears a CCD camera and sensors to record his/her everyday life. This system allows the user to obtain information anytime and anywhere the user wants. This type of human-centered computer technology is called "wearable computing." Wearable computing technology must be aware of the user's internal state (desire, emotion, health, action, etc.) and external state (goings-on, temperature, other people, etc.) at any time. Jimminy (a wearable Remembrance Agent) also supports human activities using just-in-time information retrieval (Rhodes, 2003).
Kawashima, Nagasaki, and Toda (2002) and Toda, Nagasaki, Iijima, and Kawashima (2003) have developed an automatic video summarization system that uses visual pattern recognition methods for location recognition and a view tracking device for recognizing the user's actions. The MIThril platform has also advanced over the years (DeVaul, Pentland, & Corey, 2003). Kidode (2002) has developed an advanced media technology project named the Wearable Information Playing Station (WIPS). Our study of augmented memory in this chapter is part of the WIPS project.

Lamming and Flynn (1994) have developed Forget-me-not, a prototype system for a portable episodic memory aid. This system records a user's action history using sensors installed in a laboratory and active badges worn by users. The user can refer to his/her own history and easily replay a past event on a PDA. The Forget-me-not study is based on the concept of "ubiquitous computing" proposed by Weiser (1991). The Aware Home Research Initiative is also directly inspired by the same concept (Abowd & Mynatt, 2002; Kidd et al., 1999; Tran & Mynatt, 2003).

Augmented memory technologies using video data can be divided into three topics: location-based memory aids, ubiquitous video data management, and augmented memory based on a person's information. As an example of a location-based memory supporting system, Hoisko (2000) developed a visual episodic memory prosthesis that retrieves video data recorded at places to which IR tags are attached.
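To make this style of tag-triggered recall concrete, the following Python sketch shows one possible way to associate recorded clips with a physical tag (an IR tag on a place, or a marker on an object) and to retrieve them when the tag is observed again. It is a minimal sketch under our own assumptions: the names TagVideoIndex, VideoClip, register, and recall are illustrative, not the actual implementation of Hoisko's prosthesis or of the SARA modules described later.

from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class VideoClip:
    """One recorded viewpoint-video segment and its capture time."""
    uri: str               # where the stored video data lives (hypothetical path)
    recorded_at: datetime  # when the wearer captured the clip

class TagVideoIndex:
    """Maps a physical tag ID (an IR tag on a place, or a marker on an object)
    to the video clips recorded while that tag was observed."""

    def __init__(self) -> None:
        self._index: Dict[str, List[VideoClip]] = {}

    def register(self, tag_id: str, clip: VideoClip) -> None:
        # Associate a newly recorded clip with the tag seen during capture.
        self._index.setdefault(tag_id, []).append(clip)

    def recall(self, tag_id: str, limit: int = 5) -> List[VideoClip]:
        # Return the most recent clips linked to this tag, newest first.
        clips = sorted(self._index.get(tag_id, []),
                       key=lambda c: c.recorded_at, reverse=True)
        return clips[:limit]

# Usage: the wearer records a clip while tag "room-101" is detected, then later
# revisits the room and recalls what happened there.
index = TagVideoIndex()
index.register("room-101",
               VideoClip("file:///memories/0001.mpg", datetime(2004, 5, 10, 14, 30)))
for clip in index.recall("room-101"):
    print(clip.uri, clip.recorded_at)

A deployed system would, of course, persist such an index and populate it automatically from the wearer's continuous video stream and tag detections; the in-memory dictionary here only illustrates the association between real-world tags and retrievable video data.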
REFERENCES

[1] Hideaki Takeda, et al. Ubiquitous memories: wearable interface for computational augmentation of human memory based on real world objects. 2002.
[2] Masatsugu Kidode, et al. HySIM: A Hybrid-space Image Matching Method for a High Speed Location-Based Video Retrieval on a Wearable Computer. 2002, MVA.
[3] Gregory D. Abowd, et al. The Human Experience. 2002, IEEE Pervasive Computing.
[4] Masatsugu Kidode, et al. Nice2CU: managing a person's augmented memory. 2003, Seventh IEEE International Symposium on Wearable Computers.
[5] Woontack Woo, et al. Ubi-UCAM: A Unified Context-Aware Application Model. 2003, CONTEXT.
[6] Bradley J. Rhodes, et al. The wearable remembrance agent: A system for augmented memory. 1997, Digest of Papers, First International Symposium on Wearable Computers.
[7] Kiyoharu Aizawa, et al. Automatic Summarization of Wearable Video: Indexing Subjective Interest. 2001, IEEE Pacific Rim Conference on Multimedia.
[8] W. Brewer, et al. Role of schemata in memory for places. 1981, Cognitive Psychology.
[9] Toshiyuki Amagasa, et al. A System for Retrieval and Digest Creation of Video Data Based on Geographic Objects. 2002, DEXA.
[10] Jennifer Healey, et al. StartleCam: a cybernetic wearable camera. 1998, Digest of Papers, Second International Symposium on Wearable Computers.
[11] Alex Pentland, et al. Framing through peripheral perception. 2000, Proceedings of the 2000 International Conference on Image Processing.
[12] Jun Rekimoto, et al. Augment-able reality: situated communication through physical and digital spaces. 1998, Digest of Papers, Second International Symposium on Wearable Computers.
[13] M. Lamming, et al. "Forget-me-not": Intimate Computing in Support of Human Memory. 1994.
[14] Jun Rekimoto, et al. CyberCode: designing augmented reality environments with visual tags. 2000, DARE '00.
[15] Alex Pentland, et al. DyPERS: Dynamic Personal Enhanced Reality System. 1998.
[16] Masatsugu Kidode, et al. A Novel Video Retrieval Method to Support a User's Recollection of Past Events Aiming for Wearable Information Playing. 2001, IEEE Pacific Rim Conference on Multimedia.