Letter to the Editor: "Evaluation of Bone Mineral Density Using DXA and cQCT in Postmenopausal Patients Under Thyrotropin Suppressive Therapy".

In their recent article, Kim et al. (1) reported substantial discordance in the diagnostic yield of osteoporosis between dual-energy X-ray absorptiometry (DXA) and central quantitative CT (QCT) in postmenopausal women with thyroid cancer receiving TSH suppression therapy, probably owing to the presence of aortic calcifications. Discrepancies were also observed between the diagnosis of osteopenia by DXA or QCT and the finding of partially degraded microarchitecture by the trabecular bone score (TBS). Such discordance concerning trabecular bone has previously been reported in postmenopausal women under TSH suppression using peripheral QCT (2) and TBS (3), and has also been observed in other disease states, such as glucocorticoid-induced osteoporosis, diabetes, and primary hyperparathyroidism (4). However, an important question, left unanswered given the retrospective design and the absence of a control group, is whether this discordance is specific to TSH suppression therapy or simply an inherent limitation of DXA. Given the increasing number of patients receiving TSH suppression therapy and the high prevalence of vertebral fractures in these patients, even in those with normal bone mineral density (5), more data from studies with appropriate control groups are needed. Current recommendations on fracture risk assessment and the initiation of specific osteoporosis treatment may also require reevaluation, incorporating clinical risk factors such as age, the degree and duration of TSH suppression, and measures of bone strength beyond DXA.