harAGE: A Novel Multimodal Smartwatch-based Dataset for Human Activity Recognition

This work introduces the harAGE dataset: a novel multimodal smartwatch-based dataset for Human Activity Recognition (HAR) with more than 17 hours of data collected from 19 participants using a Garmin Vivoactive 3 device. The dataset contains samples from resting, lying, sitting, standing, washing hands, walking, running, stair climbing, strength workout, flexibility workout, and cycling activities. The resting activity, excluded from the set of activities to recognise, was explicitly conducted while avoiding stressors and external stimuli, so the collected data can be used to compute each participant's baseline heart rate at rest. We also present HAR models trained on the accelerometer data to recognise different sets of activities. Specifically, we focus on different strategies to combine, fuse, and enrich the accelerometer measurements so that they can be exploited in end-to-end models. Model performances are assessed following a Leave-One-Subject-Out Cross-Validation (LOSO-CV) approach, and we use the Unweighted Average Recall (UAR) as the evaluation metric to compare the inferred labels with the ground truth. The best UAR score of 98.1 % is obtained when recognising the static and the dynamic activities, excluding the samples corresponding to the washing hands, strength workout, and flexibility workout activities. When recognising the specific activities included in these two sets, the best-performing model achieves a UAR of 70.1 %. Finally, when recognising all the activities considered in the harAGE dataset, the highest UAR achieved is 64.3 %.
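As a minimal sketch of the evaluation protocol only (not the authors' actual pipeline), the snippet below shows how LOSO-CV can be combined with the UAR metric, i.e. the unweighted mean of the per-class recalls; the classifier and the feature, label, and subject arrays are hypothetical placeholders, assuming scikit-learn is available.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import LeaveOneGroupOut

# Placeholder data: X would hold the accelerometer-derived features,
# y the activity labels, and subjects the participant ID of each sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(190, 16))
y = rng.integers(0, 4, size=190)
subjects = np.repeat(np.arange(19), 10)  # 19 participants, 10 samples each

logo = LeaveOneGroupOut()
uar_per_fold = []
for train_idx, test_idx in logo.split(X, y, groups=subjects):
    # Train on 18 participants, test on the held-out one (LOSO-CV).
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    y_pred = clf.predict(X[test_idx])
    # UAR = macro-averaged recall over the activity classes.
    uar_per_fold.append(recall_score(y[test_idx], y_pred, average="macro"))

print(f"Mean LOSO-CV UAR: {np.mean(uar_per_fold):.3f}")
```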