A CNN-Based Human Bowel Sound Segment Recognition Algorithm with Reduced Computational Complexity for Wearable Healthcare Systems

Human bowel sounds (BSs) convey useful information about gastrointestinal health status. In recent years, advanced bio-acoustic sensors, such as wired electronic stethoscopes and wireless wearable sound-recording patches, have enabled researchers to record, store, and analyze BSs in digital form, i.e., to perform computerized bowel sound analysis. However, for collected BS recordings, effectively identifying the segments that contain BS events while discarding those that contain only background noise remains difficult. Moreover, BS segment recognition algorithms intended for wearable healthcare scenarios are further required to have low computational complexity. In this work, a lightweight BS recognizer based on convolutional neural networks (CNNs) is proposed for wearable systems. Specifically, the proposed recognizer first converts each one-dimensional segment into a two-dimensional spectrogram by calculating the Mel-frequency cepstrum coefficients (MFCCs) frame by frame, and then passes the spectrogram through a CNN to infer the category of the segment. To validate the CNN-based BS recognizer, a 28-minute BS dataset containing 955 BS-present segments and 725 BS-absent segments is constructed. Experimental results on this dataset show that the recognizer attains mean accuracies of 91.25% and 90.83% under 'not-across-subjects' and 'across-subjects' validation, respectively. Moreover, compared with the state-of-the-art LSTM approach, the CNN-based BS recognizer is lightweight, with only 20.35k parameters, a quarter of the LSTM's. Owing to its lower model complexity, the CNN-based BS recognizer can potentially be integrated into the gateways of a wearable system to ensure that only BS-present segments are relayed to the remote server, thereby better protecting user privacy.
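The front end of the pipeline described above (framing a one-dimensional audio segment, computing MFCCs per frame, and stacking them into a two-dimensional spectrogram for the CNN) can be sketched as follows. This is a minimal illustrative implementation in NumPy/SciPy, not the authors' code; the sampling rate, frame length, hop size, and filter counts are assumed example values, and the paper's actual parameters may differ.

```python
import numpy as np
from scipy.fftpack import dct

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale.
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for k in range(l, c):
            fb[i - 1, k] = (k - l) / max(c - l, 1)
        for k in range(c, r):
            fb[i - 1, k] = (r - k) / max(r - c, 1)
    return fb

def mfcc_spectrogram(signal, sr=16000, frame_len=400, hop=160,
                     n_fft=512, n_filters=26, n_mfcc=13):
    """Convert a 1-D segment into a 2-D MFCC spectrogram (frames x coefficients).

    All hyperparameters here (16 kHz rate, 25 ms frames, 10 ms hop,
    26 mel filters, 13 coefficients) are common defaults, assumed for
    illustration only.
    """
    # Frame the segment and apply a Hamming window.
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len]
                       for i in range(n_frames)]).astype(np.float64)
    frames *= np.hamming(frame_len)
    # Magnitude spectrum per frame.
    spec = np.abs(np.fft.rfft(frames, n_fft))         # (n_frames, n_fft//2 + 1)
    # Mel filterbank energies -> log -> DCT gives the MFCCs per frame.
    fb = mel_filterbank(n_filters, n_fft, sr)
    mel_energy = np.maximum(spec @ fb.T, 1e-10)
    return dct(np.log(mel_energy), type=2, axis=1, norm='ortho')[:, :n_mfcc]
```

The resulting frames-by-coefficients matrix is the two-dimensional "image" that a small CNN classifier would then label as BS-present or BS-absent.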