A Deep Learning-Assisted Method for Measuring Uncertainty in Activity Recognition with Wearable Sensors

For human activity recognition with wearable sensors, understanding the uncertainty in a classifier's decisions is necessary to predict sensor failures and to design active learning paradigms. Although deep learning models have shown promising results in recognizing human activities from sensor data, estimating the uncertainty of their predictions remains challenging. In this paper, we propose a Bayesian deep convolutional neural network with stochastic latent variables that allows us to estimate both aleatoric (data-dependent) and epistemic (model-dependent) uncertainty in the recognition task. We place a distribution over the latent variables of the model, i.e., the features automatically extracted by the convolutional layers, and show how inference can be approximated by combining a variational autoencoder with a standard deep neural network classifier. We also leverage the dropout-based Bayesian neural network approximation to estimate model uncertainty. Experimental results on a publicly available dataset for human activity recognition with wearable sensors show how each uncertainty measure (aleatoric and epistemic) responds to different sources of uncertainty, namely noisy and novel data. Moreover, the uncertainty of samples misclassified by the model is, on average, significantly higher than that of correctly classified samples.
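To make the construction described above concrete, below is a minimal sketch, assuming PyTorch, of the kind of model the abstract outlines: a 1-D convolutional feature extractor whose features are treated as stochastic latent variables (a Gaussian with learned mean and log-variance, sampled via the reparameterisation trick as in a variational autoencoder encoder), a dropout classifier head kept stochastic at test time (MC Dropout), and a Monte-Carlo decomposition of predictive uncertainty into aleatoric and epistemic parts. The class and function names (StochasticConvClassifier, mc_uncertainty), the layer sizes, the 9-channel/128-step input shape, and the entropy-based decomposition are illustrative assumptions, not the authors' released implementation.

```python
# Hedged sketch: Bayesian-style CNN with stochastic latent features + MC Dropout.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticConvClassifier(nn.Module):
    def __init__(self, in_channels=9, latent_dim=64, n_classes=6, p_drop=0.5):
        super().__init__()
        # Convolutional feature extractor over the sensor time axis.
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Heads producing the mean and log-variance of the latent features.
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)
        # Classifier with dropout, kept active at test time for MC Dropout.
        self.classifier = nn.Sequential(
            nn.Dropout(p_drop), nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Dropout(p_drop), nn.Linear(128, n_classes),
        )

    def forward(self, x):
        h = self.features(x).squeeze(-1)                          # (batch, 64)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterisation trick: sample the stochastic latent features.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.classifier(z)


@torch.no_grad()
def mc_uncertainty(model, x, n_samples=50):
    """Monte-Carlo estimate of total predictive entropy, expected entropy
    (an aleatoric proxy) and mutual information (an epistemic proxy)."""
    model.train()  # keep dropout (and latent sampling) stochastic at test time
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_p = probs.mean(0)
    total = -(mean_p * mean_p.clamp_min(1e-12).log()).sum(-1)
    aleatoric = -(probs * probs.clamp_min(1e-12).log()).sum(-1).mean(0)
    epistemic = total - aleatoric
    return mean_p, aleatoric, epistemic


if __name__ == "__main__":
    # Example: a batch of 8 windows, 9 sensor channels, 128 time steps (assumed shape).
    model = StochasticConvClassifier()
    x = torch.randn(8, 9, 128)
    p, alea, epis = mc_uncertainty(model, x)
    print(p.shape, alea.shape, epis.shape)
```

In this sketch the aleatoric term is the average entropy of individual stochastic forward passes, while the epistemic term is the gap between the entropy of the averaged prediction and that average; this mutual-information-style split is one common way to separate the two sources of uncertainty and is used here only as an illustration of the idea in the abstract.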