Two-Dimensional DICOM Feature Points and Their Mapping Extraction for Identifying Brain Shifts

To model organ deformation precisely, we extract numerous feature points and their mapping correspondences from pairs of layered two-dimensional Digital Imaging and Communications in Medicine (DICOM) images. In this study, we first selected the same image twice (the 68th) from 124 layered two-dimensional DICOM images, then two consecutive images (the 68th and 69th), and finally two images that were far apart (the 55th and 80th). Next, two-dimensional feature points were extracted from these image pairs, and their mapping correspondences were searched for. We applied the two-dimensional feature point extraction and correspondence algorithms scale-invariant feature transform (SIFT), KAZE, Accelerated KAZE (AKAZE), and oriented FAST and rotated BRIEF (ORB), as implemented in OpenCV, to real DICOM files to confirm that the extraction and mapping described above are possible. Our results show that, although searching for matches only among feature points in the vicinity of a given feature point required slightly more computation time than searching for matches across the entire DICOM image area, it reduced the number of incorrect matching correspondences.
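
A minimal sketch of this pipeline in Python is given below, assuming that pydicom is used to read the slices and that the pixel data is min-max normalized to 8 bits before feature extraction; the file names, the 20-pixel neighborhood radius, and the per-keypoint vicinity search are illustrative assumptions rather than the exact procedure used in the study.

```python
# Minimal sketch (not the study's exact code). Assumptions: pydicom reads the
# slices, pixel data is min-max normalized to 8 bits, and the file names and
# the 20-pixel vicinity radius are illustrative choices.
import cv2
import numpy as np
import pydicom


def load_slice_as_uint8(path):
    """Read a DICOM slice and rescale its pixel data to an 8-bit grayscale image."""
    ds = pydicom.dcmread(path)
    img = ds.pixel_array.astype(np.float32)
    return cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)


def match_within_vicinity(kp_a, des_a, kp_b, des_b, norm, radius):
    """For each keypoint in image A, search for the most similar descriptor
    among only the keypoints of image B that lie within `radius` pixels."""
    pts_b = np.array([kp.pt for kp in kp_b], dtype=np.float32)
    matches = []
    for i, kp in enumerate(kp_a):
        dx = pts_b[:, 0] - kp.pt[0]
        dy = pts_b[:, 1] - kp.pt[1]
        candidates = np.flatnonzero(np.hypot(dx, dy) <= radius)
        if candidates.size == 0:
            continue
        # Descriptor distances to the spatial candidates only.
        d_desc = [cv2.norm(des_a[i], des_b[j], norm) for j in candidates]
        best = int(np.argmin(d_desc))
        matches.append(cv2.DMatch(i, int(candidates[best]), float(d_desc[best])))
    return matches


# Example slice pair (file names are hypothetical).
img_a = load_slice_as_uint8("slice_068.dcm")
img_b = load_slice_as_uint8("slice_069.dcm")

# The four detector/descriptor algorithms compared in the text, paired with the
# distance norm appropriate to each descriptor type.
detectors = {
    "SIFT": (cv2.SIFT_create(), cv2.NORM_L2),
    "KAZE": (cv2.KAZE_create(), cv2.NORM_L2),
    "AKAZE": (cv2.AKAZE_create(), cv2.NORM_HAMMING),
    "ORB": (cv2.ORB_create(), cv2.NORM_HAMMING),
}

for name, (detector, norm) in detectors.items():
    kp_a, des_a = detector.detectAndCompute(img_a, None)
    kp_b, des_b = detector.detectAndCompute(img_b, None)

    # Baseline: brute-force matching over the entire image area.
    full_matches = cv2.BFMatcher(norm, crossCheck=True).match(des_a, des_b)

    # Vicinity-restricted matching, as described in the text.
    local_matches = match_within_vicinity(kp_a, des_a, kp_b, des_b, norm, radius=20.0)

    print(f"{name}: {len(full_matches)} full-area matches, "
          f"{len(local_matches)} vicinity-restricted matches")
```

In this sketch, the vicinity constraint is applied during the search itself (each keypoint is compared only against spatially nearby candidates), which mirrors the restricted search described above; the per-keypoint loop also illustrates why such a search can cost slightly more time than a single brute-force pass over all descriptors.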