Real-time recognition of human activities remains an important challenge for supportive residential well-being assessment. In this paper, we measure and assess patterns associated with dynamic human activities, such as sitting down and standing up, by analyzing Wi-Fi channel state information (CSI). Two Wi-Fi sensors capture information about a person’s activities within a confined space. The effects of sensor location and activity location within the space are assessed to study the independence of the proposed solution from these factors. From the acquired data, 168-variable feature vectors comprising kurtosis, maximum, maximum peak, mean, minimum, skew, standard deviation, and variance values are calculated. Traditional classifiers are evaluated for predicting the dynamic sitting and standing activities. The results demonstrate a classification accuracy of 98.5% using a medium Gaussian support vector machine. Deep learning with a bidirectional long short-term memory (BiLSTM) network is also tested for sequence-to-label and sequence-to-sequence classification from time series of the statistical measures, achieving accuracies of 90.7% and 85.1%, respectively. The proposed feature extraction and classification demonstrate that our solution not only differentiates between static and motion activities but also distinguishes between the similar motions of standing up and sitting down. This work therefore goes beyond the state of the art, which generally focuses on detecting motion rather than distinguishing between similar movements performed by subjects.
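The eight statistical measures named above could be computed per CSI channel and concatenated into the feature vector; the sketch below shows one way to do this in Python with NumPy and SciPy. It is illustrative only: the exact definition of "maximum peak" and the grouping of the 168 variables (here assumed to be 8 statistics over 21 CSI amplitude channels, since 8 × 21 = 168) are assumptions not stated in the abstract.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def window_features(window):
    """Eight statistical measures for one CSI amplitude window (1-D array).
    'Maximum peak' is interpreted here as the largest local maximum --
    an assumption, since its exact definition is not given in the abstract."""
    w = np.asarray(window, dtype=float)
    # Local maxima: interior samples larger than both neighbours.
    interior = w[1:-1]
    peaks = interior[(interior > w[:-2]) & (interior > w[2:])]
    max_peak = peaks.max() if peaks.size else w.max()
    return np.array([
        kurtosis(w),    # kurtosis
        w.max(),        # maximum
        max_peak,       # maximum peak
        w.mean(),       # mean
        w.min(),        # minimum
        skew(w),        # skew
        w.std(ddof=0),  # standard deviation
        w.var(ddof=0),  # variance
    ])

def feature_vector(csi_amplitudes):
    """Concatenate the eight measures over each channel (column) of a
    (samples x channels) CSI amplitude matrix; with 21 channels this
    yields an 8 x 21 = 168-variable vector."""
    return np.concatenate([window_features(col) for col in csi_amplitudes.T])
```

A vector built this way per time window would be the input to the classifiers mentioned above (e.g., the Gaussian SVM), while a time series of such statistics would feed the BiLSTM models.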