Action Recognition in Manufacturing Assembly using Multimodal Sensor Fusion

Abstract Production innovations are occurring faster than ever, so manufacturing workers must frequently learn new methods and skills. In fast-changing, largely uncertain production systems, manufacturers that can comprehend workers' behavior and assess their operational performance in near real time will outperform their peers. Action recognition can serve this purpose. Although human action recognition has been an active field of study in machine learning, limited work has addressed recognizing worker actions in manufacturing tasks that involve complex, intricate operations. Recognizing such actions from data captured by one sensor, or by a single type of sensor, lacks reliability. This limitation can be overcome through sensor fusion at the data, feature, and decision levels. This paper presents a study that developed a multimodal sensor system and applied sensor fusion methods to enhance the reliability of action recognition. One step in assembling a Bukito 3D printer, consisting of a sequence of seven actions, was used to illustrate and assess the proposed method. Two wearable sensors, namely Myo armbands, captured both inertial measurement unit (IMU) and electromyography (EMG) signals from assembly workers, while a Microsoft Kinect, a vision-based sensor, simultaneously tracked predefined skeleton joints of the workers. The collected IMU, EMG, and skeleton data were used to train five individual convolutional neural network (CNN) models. Various fusion methods were then implemented to integrate the prediction results of the independent models and yield the final prediction. The study also identified the reasons why sensor fusion achieves better performance.
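The decision-level fusion step, in which the predictions of the five independent CNN models are combined into a single action label, can be sketched briefly. The snippet below is a minimal illustration, not the paper's implementation: it assumes each model emits a softmax probability vector over the seven action classes and fuses them by majority voting and by weighted averaging of the probabilities. The function names, the placeholder predictions, and the per-model weights are hypothetical.

```python
import numpy as np

NUM_CLASSES = 7  # seven assembly actions, as in the study


def majority_vote(probs):
    """Decision-level fusion by majority vote.

    probs: array of shape (n_models, n_classes), one softmax
    vector per trained CNN (IMU, EMG, and skeleton streams).
    Returns the class chosen by most models; ties are broken
    by the summed probability mass of the tied classes.
    """
    votes = np.argmax(probs, axis=1)
    counts = np.bincount(votes, minlength=probs.shape[1])
    winners = np.flatnonzero(counts == counts.max())
    if len(winners) == 1:
        return int(winners[0])
    return int(winners[np.argmax(probs[:, winners].sum(axis=0))])


def weighted_average(probs, weights=None):
    """Decision-level fusion by (weighted) averaging of softmax outputs.

    weights: optional per-model reliabilities (e.g., each CNN's
    validation accuracy); defaults to uniform weighting.
    """
    weights = np.ones(len(probs)) if weights is None else np.asarray(weights, float)
    weights = weights / weights.sum()
    fused = (weights[:, None] * probs).sum(axis=0)
    return int(np.argmax(fused)), fused


# Example: five models (e.g., 2x IMU, 2x EMG, 1x skeleton) scoring one window.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(NUM_CLASSES), size=5)  # placeholder predictions
print("majority vote ->", majority_vote(probs))
print("weighted avg  ->", weighted_average(probs)[0])
```

Weighting each model by a reliability estimate such as its validation accuracy lets the more trustworthy modalities dominate the fused decision, which is one common reason decision-level fusion outperforms any single sensor.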
