Towards Personalized Biomechanical Model and MIND-Weighted Point Matching for Robust Deformable MR-TRUS Registration

This paper proposes a novel deformable MR-TRUS registration method that combines a personalized statistical motion model with a robust point matching strategy boosted by the modality independent neighborhood descriptor (MIND). Current deformable MR-TRUS registration methods suffer from inaccurate deformation estimation and unstable point correspondence. To estimate tissue deformation precisely, we construct a personalized statistical motion model (PSMM) based on finite element methods and patient-specific biomechanical properties measured by ultrasound elastography. To obtain accurate point correspondences between surface point sets segmented from MR and TRUS images, we first adopt the PSMM to provide realistic boundary conditions for correspondence estimation, and we further introduce MIND to weight the robust point matching procedure. We evaluate our method on five volunteer datasets. The experimental results demonstrate that the proposed approach provides more accurate and robust MR-TRUS registration than state-of-the-art methods: the mean target registration error was significantly improved from 2.6 mm to 1.8 mm, which meets the clinical requirement of less than 2.5 mm.
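As background on the MIND weighting mentioned above, the following is a minimal 2D sketch of the modality independent neighborhood descriptor in its common form (patch distances to neighbouring offsets, normalized by a local variance estimate). The function name, the 4-neighbour offset set, and the box-filter patch distance are illustrative simplifications, not the paper's actual implementation:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mind_descriptor(img, radius=2, offsets=((0, 1), (1, 0), (0, -1), (-1, 0))):
    """Simplified 2D MIND sketch (illustrative, not the paper's method).

    For each offset r, compute a patch-wise squared distance D(x, x+r),
    then MIND(x, r) = exp(-D / V(x)), where V(x) is a local variance
    estimate; finally normalize so the maximum over offsets is 1.
    """
    img = img.astype(np.float64)
    dists = []
    for dy, dx in offsets:
        # shift image by the offset (wrap-around at borders, fine for a sketch)
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        # patch-wise SSD via a box filter over the squared difference
        dists.append(uniform_filter((img - shifted) ** 2, size=2 * radius + 1))
    dists = np.stack(dists, axis=0)
    # local variance estimate V(x): mean patch distance over the offsets
    var = np.mean(dists, axis=0) + 1e-8
    mind = np.exp(-dists / var)
    # normalize per pixel so descriptor values lie in (0, 1]
    mind /= mind.max(axis=0, keepdims=True) + 1e-12
    return mind
```

Because MIND depends only on local self-similarity rather than raw intensities, descriptor distances computed this way can serve as modality-independent weights in a point matching procedure, which is the role it plays in the registration pipeline described above.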