Real-Time Human Tracker Based on Location and Motion Recognition of User for Smart Home

The ubiquitous smart home provides automatic home services by analyzing human and home contexts. It receives many contexts from the human and the home environment, but the most important context is information about the human's location and motion. In this paper, we present a real-time human tracker that detects human location and motion using cameras and predicts services in the ubiquitous smart home. We propose an algorithm that captures the effective area from four network cameras, detects the human within it, and estimates the human's position. To detect human motion, three kinds of images are used: IMAGE1, an image of the empty room; IMAGE2, an image of the furniture and home appliances in the home; and IMAGE3, an image of IMAGE2 together with the human. By analyzing these three images, the system decides whether a specific piece of furniture or home appliance is associated with the human, and it estimates human motion using a support vector machine (SVM). Four types of human motion are recognized: "lie down", "sit", "stand up", and "walk". The motion type is classified from the number of foreground pixels on each array line of the moving object using the SVM. We evaluated each motion 1000 times; the average accuracy over all motions was 85.3%.
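The motion-recognition pipeline described above can be sketched as follows. This is a minimal illustration only, assuming OpenCV and scikit-learn as stand-ins (the paper does not specify libraries); all function names, the foreground threshold, and the RBF kernel choice are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch of the abstract's pipeline: isolate the moving person by
# differencing IMAGE2 (furnished room) against IMAGE3 (room plus person), build
# a feature vector from the per-row (array-line) foreground pixel counts, and
# classify the motion type with an SVM. Library and parameter choices are
# assumptions, not the authors' implementation.
import cv2
import numpy as np
from sklearn import svm

def moving_object_mask(image2, image3, threshold=30):
    """Binary mask of the person, obtained by differencing IMAGE3 against IMAGE2."""
    g2 = cv2.cvtColor(image2, cv2.COLOR_BGR2GRAY)
    g3 = cv2.cvtColor(image3, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(g3, g2)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return mask

def row_pixel_feature(mask):
    """Count foreground pixels on each image row; this per-line count vector
    is the feature representation described in the abstract."""
    return (mask > 0).sum(axis=1).astype(np.float32)

def train_motion_classifier(feature_vectors, labels):
    """Fit a multi-class SVM on labeled frames:
    0 = lie down, 1 = sit, 2 = stand up, 3 = walk."""
    clf = svm.SVC(kernel="rbf")  # kernel choice is an assumption
    clf.fit(feature_vectors, labels)
    return clf

def classify_motion(clf, image2, image3):
    """Predict the motion label for one frame pair."""
    mask = moving_object_mask(image2, image3)
    return clf.predict([row_pixel_feature(mask)])[0]
```

A fixed camera resolution is assumed so that all feature vectors share the same length; IMAGE1 (the empty room) would be used analogously, by comparing it with IMAGE2 and IMAGE3, to decide which piece of furniture or home appliance the person is associated with.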
