There is great demand for reliving past experiences, both for enjoyment in itself and for learning from history. Tourist guides often show old photographs at the locations where they were taken. This technique not only lets visitors compare the old landscape with the current one, but also lets them relive the camera operator's experience by standing at his or her point of view, which is effective in conveying the past situation. Research projects and applications focusing on the effect of this vicarious experience have recently emerged using Augmented Reality (AR) technology 1)-5) or Mixed Reality (MR) 6)-8). AR is a technique for presenting virtual information in semantic context with the real environment, which makes it well suited to superimposing a photograph on the actual place where it was taken. For example, Nakano et al. developed the "On-site virtual time machine" 2), a system that guides users to the position where a photograph was taken by analyzing the relative displacement of feature points between the photograph and the current scenery. They showed that their system promotes deeper understanding of the photographic subject and its background information.

Now consider adapting this vicarious-experience effect to video content. Video contains richer information than photographs for learning about the mood of a past period: for example, the behavior of people, creatures, machines, and natural phenomena, the most basic elements for understanding their nature, can be found only in video materials. A vicarious-experience system using video materials would let us enjoy the videos themselves more and improve our understanding of the subjects and the environment surrounding them. Unlike photographs, which capture only one direction, videos sometimes capture a wider space along with the movement of the camera operator.
That space is what the camera operator was able to overlook at that time. Reconstructing the past space captured in a video and enabling users to overlook it as freely as the camera operator could helps them grasp a big-picture view of the past. In addition, when applying video materials to vicarious experience based on augmented reality, not only the location and orientation but also the camera operator's movement and the spatial extent of the scene must be considered.

Abstract

This paper proposes and evaluates a Reliving Past Scene Experience system based on augmented reality technology. The system overlays past scenes from video materials onto the real environment and lets users experience how the camera operator captured the scene by inducing them to move in the same way as the operator. By inducing this movement unconsciously, the system makes users feel as if they are looking around the past scene of their own will. For this purpose, we propose three induction techniques: induction for preventing unintentional movements, induction for starting to move, and induction for appropriate rotation. We implemented these techniques in a system for reliving videos of old railways and exhibited it at THE RAILWAY MUSEUM for two weeks. Analysis of users' operation logs and questionnaires suggests that the proposed system and interaction techniques are effective for reliving past scene experiences in a real environment.
References

[1] Laurent Étienne et al. Feeling bumps and holes without a haptic interface: the perception of pseudo-haptic textures. CHI, 2004.
[2] Didier Stricker et al. Reality Filtering: A Visual Time Machine in Augmented Reality. VAST, 2008.
[3] Robert C. Bolles et al. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. CACM, 1981.
[4] Maureen C. Stone et al. Snap-dragging. SIGGRAPH, 1986.
[5] Luc Van Gool et al. SURF: Speeded Up Robust Features. ECCV, 2006.
[6] Takuji Narumi et al. Digital Diorama: AR Exhibition System to Convey Background Information for Museums. HCI, 2011.
[7] Steven K. Feiner et al. Interaction Techniques for Exploring Historic Sites through Situated Media. IEEE Symposium on 3D User Interfaces (3DUI '06), 2006.
[8] Eero Eloranta. User Interface. Computer-Aided Production Management, 1988.
[9] Yun-Ta Tsai et al. The Westwood Experience: Connecting story to locations via Mixed Reality. IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities, 2010.
[10] Tom Drummond et al. Going out: robust model-based tracking for outdoor augmented reality. IEEE/ACM International Symposium on Mixed and Augmented Reality, 2006.
[11] Tetsuya Kakuta et al. Outdoor gallery and its photometric issues. VRCAI '10, 2010.
[12] Andrew W. Fitzgibbon et al. Markerless tracking using planar structures in the scene. IEEE and ACM International Symposium on Augmented Reality (ISAR 2000), 2000.
[13] Timo Engelke et al. The House of Olbrich — An Augmented Reality tour through architectural history. IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities, 2011.
[14] Takeo Kanade et al. Shape and motion from image streams under orthography: a factorization method. International Journal of Computer Vision, 1992.
[15] Abderrahmane Kheddar et al. Pseudo-haptic feedback: can isometric input devices simulate force feedback? IEEE Virtual Reality 2000, 2000.