Improved quantitative MR thermometry using a 1.5-T scanner to monitor cooled applicator systems during laser-induced interstitial thermotherapy (LITT)

Cooled LITT applicator systems are known to induce complex temperature patterns. Typically for such devices, the temperature maximum is shifted away from the applicator into the tissue, so adequate temperature monitoring is essential. This, however, has not yet been realized for many of the latest MRI systems. We have implemented an improved MR thermometry system using a gradient echo pulse sequence (5 mm slice, FOV 230 mm, TR/TE = 80/26 ms, matrix 192 × 256) on a 1.5 T scanner (Magnetom Vision, Siemens, Erlangen, Germany). The temperature evolution recorded during laser irradiation of bovine liver served as a model setup for LITT. A commercially available water-cooled applicator system (Microdome light guide, Huettinger, Umkirch, Germany, in combination with the Power Catheter, Somatex, Berlin, Germany) was used to deliver the Nd:YAG laser radiation (λ = 1064 nm, cw, 15.5 W; Dornier 4060N, Germering, Germany) and to cool the tissue. MR phase images were recorded every 30 s, alternating between axial and radial orientations. The temperature distributions were calculated using the proton resonance frequency (PRF) method. A sensitivity factor of 0.0097 ppm/°C was determined independently by comparison with fluoroptic temperature measurements. The temperature accuracy of a single pixel (0.9 × 0.9 mm²) during 10 min of laser irradiation of bovine liver tissue was found to be (-1.7 ± 1.4) °C. The final lesion diameters after 6 min of laser irradiation (15 mm × 26 mm) were found to be in good agreement with the dimensions of the 60 °C isotherm of the respective 2D temperature map. This indicates that the implemented MR thermometry may be an essential tool for therapy control of interstitial laser treatment with cooled applicator systems.
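
The abstract does not spell out the phase-to-temperature conversion, so the following is a minimal sketch of the PRF calculation under the stated acquisition parameters (B0 = 1.5 T, TE = 26 ms, sensitivity factor 0.0097 ppm/°C). The function names, the sign convention, the complex-subtraction step in place of explicit phase unwrapping, and the isotherm-extent helper are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Acquisition parameters taken from the abstract
B0 = 1.5           # main magnetic field [T]
TE = 26e-3         # echo time [s]
ALPHA = 0.0097e-6  # PRF sensitivity factor [1/degC] (0.0097 ppm/degC)
GAMMA = 42.576e6   # gyromagnetic ratio of 1H [Hz/T]

def prf_temperature_change(phase, phase_ref):
    """Temperature change map [degC] from two GRE phase images [rad].

    Sign convention is assumed such that heating lowers the water
    proton resonance frequency; flip the sign if the scanner's phase
    convention differs.
    """
    # Complex subtraction keeps the difference in (-pi, pi] and avoids
    # explicit unwrapping as long as interframe changes stay moderate.
    dphi = np.angle(np.exp(1j * (phase - phase_ref)))
    return -dphi / (2 * np.pi * GAMMA * ALPHA * B0 * TE)

def isotherm_extent(temp_map, pixel_mm=0.9, threshold=60.0):
    """Axis-aligned extent [mm] of the region at or above `threshold` degC,
    mirroring the comparison of the 60 degC isotherm with the lesion size."""
    mask = temp_map >= threshold
    if not mask.any():
        return (0.0, 0.0)
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    return ((rows[-1] - rows[0] + 1) * pixel_mm,
            (cols[-1] - cols[0] + 1) * pixel_mm)

# Hypothetical usage: temperature map relative to a baseline phase
# image acquired before irradiation, added to the known start value.
# T_map = T_start + prf_temperature_change(phase_now, phase_baseline)
```

With these parameters the phase sensitivity is about 0.10 rad (roughly 5.8°) per °C, so a temperature rise of about 30 °C already exceeds π; one option consistent with the 30 s acquisition interval reported here would be to accumulate frame-to-frame phase differences, keeping each increment small enough for the complex subtraction above.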