Useful Research Tools for Human Behaviour Understanding in the Context of Ambient Assisted Living

When novice researchers in Computer Vision and Human Behaviour Analysis/Understanding (HBA/HBU) start new projects applied to Ambient Assisted Living (AAL) scenarios, they face a lack of specific, publicly available frameworks, tools and datasets. This work attempts to fill that gap by presenting field-related datasets, or benchmarks, organised according to a taxonomy (which is also presented) and assessed in terms of their availability and relevance. Furthermore, it reviews and brings together a series of tools, both frameworks and pieces of software, that are available (although dispersed) and can ease the task. Finally, some conclusions are drawn about the reviewed tools, with special emphasis on their generality and reliability.
