Multi Activity Recognition Based on Bodymodel-Derived Primitives

We propose a novel model-based approach to activity recognition using high-level primitives that are derived from a human body model estimated from sensor data. Using brief fixed positions of the hands and turning points of hand movements, a continuous data stream is segmented into short segments of interest. Within these segments, joint boosting enables the automatic discovery of important and distinctive features ranging from motion and posture to location. To demonstrate the feasibility of our approach, we present the user-dependent and across-user results of a study with 8 participants. The specific scenario we study comprises 20 activities in the quality inspection step of a car production process.

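The segmentation step lends itself to a small illustration. The sketch below is not the authors' implementation; the function name, the thresholds, and the synthetic 1-D hand trajectory are assumptions. It cuts a continuous hand-position stream at brief rest positions (near-zero hand speed) and at turning points (reversals of movement direction), producing the short segments of interest on which features such as motion, posture, and location could then be computed.

```python
# Minimal sketch (not the paper's implementation): segment a hand-position
# stream at brief rest positions and turning points of the movement.
# All thresholds and the 1-D trajectory are illustrative assumptions.
import numpy as np

def segment_hand_trajectory(pos, dt=0.01, rest_speed=0.05, min_len=10):
    """Return (start, end) index pairs between consecutive cut points.

    pos        : array of shape (T,) with a 1-D hand coordinate (e.g. height)
    dt         : sampling interval in seconds (assumed)
    rest_speed : speed below which the hand counts as briefly fixed (assumed)
    min_len    : minimum segment length in samples (assumed)
    """
    vel = np.gradient(pos, dt)

    # Cut points where the hand is (almost) at rest ...
    rest = np.flatnonzero(np.abs(vel) < rest_speed)
    # ... or where the movement direction reverses (turning points).
    turns = np.flatnonzero(np.diff(np.sign(vel)) != 0)

    cuts = np.unique(np.concatenate(([0], rest, turns, [len(pos) - 1])))

    # Keep only segments long enough to compute features on.
    return [(a, b) for a, b in zip(cuts[:-1], cuts[1:]) if b - a >= min_len]

if __name__ == "__main__":
    t = np.arange(0, 5, 0.01)
    pos = np.sin(2 * np.pi * 0.5 * t)        # synthetic hand height
    print(segment_hand_trajectory(pos)[:5])  # a few (start, end) segments
```

In a full pipeline, each returned segment would be described by motion, posture, and location features and passed to the joint-boosting classifier; this sketch only covers the cut-point detection.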