Self-contained optical-inertial motion capturing for assembly planning in digital factory

In assembly activities, assembly planning is a crucial issue for human-centered manufacturing. The challenge lies in retrieving and utilizing real-time data about human work activities on the shop floor. Unlike simulation-based assembly planning, a marker-based motion capture system can acquire realistic motion data of workers at assembly sites, but this method is prone to failure under occlusion and is cumbersome to install on the shop floor. Therefore, exploiting the complementary strengths of optical and inertial sensors, this paper presents a self-contained motion capture method for assembly planning on a real shop floor. It provides real-time, portable motion capture of workers, avoiding the failures of traditional outside-in motion capture systems caused by occlusion or incorrect installation. Moreover, the portable motion capture method can run on consumer mobile devices, offering a convenient and low-cost way to perceive workers' motion on the shop floor, which is significant for wide application in assembly verification and planning for digital factories. Finally, experiments are carried out to demonstrate the accuracy and feasibility of the proposed motion capture method for assembly activities.
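The abstract's core idea is that optical and inertial sensing are complementary: inertial measurements are always available but drift over time, while optical pose measurements are drift-free but can drop out under occlusion. The paper does not disclose its fusion algorithm in the abstract; as a minimal illustrative sketch (not the authors' method), a first-order complementary filter shows how an absolute optical measurement can bound the drift of a gyro-integrated joint angle. All names and parameter values here are assumptions for illustration.

```python
def complementary_fuse(angle_prev, gyro_rate, optical_angle, dt, alpha=0.98):
    """Blend an inertial prediction (responsive but drifting) with an
    absolute optical measurement (noisy/intermittent but drift-free)."""
    predicted = angle_prev + gyro_rate * dt      # integrate the gyro rate
    return alpha * predicted + (1.0 - alpha) * optical_angle

# Toy scenario: a joint rotating at a constant 0.5 rad/s, observed by a
# gyro with a constant bias and by an (here, noise-free) optical tracker.
true_rate, gyro_bias, dt = 0.5, 0.05, 0.01
true_angle = angle_fused = angle_gyro_only = 0.0
for _ in range(1000):                            # simulate 10 s
    true_angle += true_rate * dt
    gyro = true_rate + gyro_bias                 # biased inertial measurement
    optical = true_angle                         # absolute optical measurement
    angle_gyro_only += gyro * dt                 # pure integration drifts
    angle_fused = complementary_fuse(angle_fused, gyro, optical, dt)

print(abs(angle_gyro_only - true_angle))         # bias * 10 s = 0.5 rad drift
print(abs(angle_fused - true_angle))             # bounded near alpha*b*dt/(1-alpha)
```

When the optical channel drops out (occlusion), a real system would fall back to pure inertial prediction between optical updates, which is exactly the failure mode the fused estimate bounds; production systems typically use a Kalman or nonlinear optimization back end (as in the visual-inertial literature) rather than a fixed-gain filter.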
