How do radiographic techniques affect mass lesion detection performance in digital mammography?

We investigated how the x-ray tube voltage (kV) and tube output (mAs) affect the detection of simulated mass lesions with diameters between 0.24 and 12 mm. Digital mammograms were acquired with and without mass lesions, permitting a difference image to be generated that corresponds to the lesion alone. These isolated digital lesions were added at reduced intensity to lesion-free images and used in Four-Alternative Forced Choice (4-AFC) experiments to determine the lesion intensity corresponding to a detection accuracy of 92% (I92%). Values of I92% were determined at x-ray tube outputs ranging from 40 to 120 mAs and x-ray tube voltages ranging from 24 to 32 kV. For mass lesions larger than ~0.8 mm, there was no significant change in detection performance with changing mAs. Doubling the x-ray tube output from 60 to 120 mAs resulted in an average change in I92% of only +3.8%, whereas the Rose model of lesion detection predicts a change in the experimental value of I92% of -29%. For the 0.24 mm lesion, however, reducing the x-ray tube output from 100 to 40 mAs reduced the average detection performance by ~60%. Contrast-detail curves for lesions with diameter ≥ 0.8 mm had a slope of ~+0.23, whereas the Rose model predicts a slope of -0.5. For lesions smaller than ~0.8 mm, contrast-detail slopes were all negative, with the average gradient becoming steeper with decreasing mAs. Increasing the x-ray tube voltage from 24 to 32 kV at constant display contrast resulted in a modest improvement of ~10% in low-contrast lesion detection performance. Increasing the display window width from 2000 to 2500 reduced the average observer performance by ~6%. Our principal finding is that radiographic technique factors have little effect on detection performance for lesions larger than ~0.8 mm, but that the visibility of smaller lesions is degraded by quantum mottle, in qualitative agreement with the predictions of the Rose model.
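
To illustrate how an I92% threshold can be extracted from 4-AFC responses, here is a minimal sketch that fits a Weibull psychometric function (chance level 25% for four alternatives) to proportion-correct data and inverts it at 92% correct. The data values, the Weibull parameterization, and the names alpha and beta are illustrative assumptions for this sketch, not the authors' actual analysis procedure or measurements.

import numpy as np
from scipy.optimize import curve_fit

def weibull_4afc(intensity, alpha, beta):
    # Weibull psychometric function for a 4-AFC task:
    # proportion correct rises from chance (0.25) toward 1.0.
    return 0.25 + 0.75 * (1.0 - np.exp(-(intensity / alpha) ** beta))

# Hypothetical lesion intensities and observed proportions correct
intensities = np.array([0.02, 0.04, 0.06, 0.08, 0.10, 0.12])
p_correct   = np.array([0.30, 0.48, 0.70, 0.85, 0.93, 0.97])

# Fit the threshold (alpha) and slope (beta) parameters
(alpha, beta), _ = curve_fit(weibull_4afc, intensities, p_correct,
                             p0=[0.06, 2.0])

# Invert the fitted curve to find the intensity giving 92% correct
target = 0.92
i92 = alpha * (-np.log(1.0 - (target - 0.25) / 0.75)) ** (1.0 / beta)
print(f"I92% ~ {i92:.4f} (alpha={alpha:.4f}, beta={beta:.2f})")

In practice, each point on such a curve would come from many 4-AFC trials at a fixed lesion intensity, with the forced-choice design removing observer response bias from the threshold estimate.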
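
The -29% prediction quoted above follows from the usual quantum-noise-limited form of the Rose model, in which the threshold lesion signal scales as the inverse square root of the photon fluence phi, itself proportional to mAs. As a hedged sketch (the exact formulation used in the study may differ), with C the lesion contrast, d its diameter, and k the threshold signal-to-noise ratio (~5 in Rose's original formulation):

\[
\mathrm{SNR} = C\, d\, \sqrt{\phi} = k
\;\Rightarrow\;
I_{92\%} \propto \phi^{-1/2} \propto (\mathrm{mAs})^{-1/2},
\]
\[
\frac{I_{92\%}(120~\mathrm{mAs})}{I_{92\%}(60~\mathrm{mAs})}
= \left(\frac{120}{60}\right)^{-1/2}
= \frac{1}{\sqrt{2}} \approx 0.71,
\]

i.e. a predicted decrease of ~29%, in contrast to the measured average change of +3.8% for lesions larger than ~0.8 mm.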