Low-cost multi-view pose tracking using active markers

Vision-based motion tracking systems have important applications in robotics and human motion analysis, as well as in the entertainment industry. In this paper, we present an overview of a high-precision, low-cost motion tracking system that combines active pulsed markers with multi-view pose reconstruction. While many passive-marker tracking systems impose restrictions on marker placement to ensure correct marker identification, active pulsed LEDs permit rigid-body tracking with arbitrary marker placement, as well as marker tracking on unknown articulated and deformable bodies. Individual markers are tracked via a path-search method that is robust against short, temporary occlusions. Once detected, a rigid body can be tracked from a minimal number of marker observations, taken from a single view or distributed across multiple views, even if the position of an individual marker cannot be reconstructed. Additionally, IMU data can be fused with the optical marker tracking to improve precision and robustness against marker occlusions.
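
The paper's implementation details are not reproduced here; as a rough illustration of the multi-view reconstruction step mentioned above, the sketch below triangulates a single marker position from two or more calibrated views using standard linear (DLT) triangulation. It is a minimal sketch of the textbook building block, not the authors' code; the function name triangulate_point and the synthetic camera matrices are illustrative assumptions.

```python
import numpy as np

def triangulate_point(projection_matrices, observations):
    """Linear (DLT) triangulation of one 3D point from n >= 2 views.

    projection_matrices: list of 3x4 camera projection matrices P_i
                         (assumed known from camera calibration).
    observations: list of (u, v) image coordinates of the marker per view.
    Returns the 3D point minimizing the algebraic reprojection error.
    """
    # Each view contributes two linear constraints on the homogeneous
    # point X:
    #   u * (P[2] @ X) - (P[0] @ X) = 0
    #   v * (P[2] @ X) - (P[1] @ X) = 0
    rows = []
    for P, (u, v) in zip(projection_matrices, observations):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)

    # The least-squares solution is the right singular vector of A with
    # the smallest singular value; dehomogenize to get the 3D point.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Example with two synthetic views of the point (1, 2, 10):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])   # shifted 1 unit in x
X_true = np.array([1.0, 2.0, 10.0, 1.0])
obs = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P1, P2)]
print(triangulate_point([P1, P2], obs))  # ~ [ 1.  2. 10.]
```

Note that the system described in the abstract goes beyond this building block: when too few views observe a given marker to triangulate its position, the rigid-body pose can still be constrained by the raw 2D observations directly, which this per-point sketch does not cover.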
