The purpose of this study was to quantify the effect of breast density on the detection performance of mammography, as measured by the area under the receiver operating characteristic curve (Az) and by sensitivity. Each of 998 digitized normal mammograms was classified into one of five categories on a radiologist-determined breast density scale. Computer-generated spherical soft-tissue cancer lesions (2-40 mm in diameter) were randomly positioned on each image using mathematical models of the physics of mammographic imaging. Computer simulations employing an ideal observer were used to determine Az for each image and lesion size. The results demonstrated that increasing breast density significantly reduced detection performance between each of the five quintiles. Women in the densest quintile showed a 22.2% reduction in absolute detection performance compared with women in the least dense quintile. For a 10-mm soft-tissue lesion, the estimated sensitivity (at 90% specificity) decreased from 74% for an average adipose breast to 47% for a breast in the highest density quintile. It was estimated that detection is delayed by 1.3 years for women in the densest category compared with those in the least dense category. For lesions 6-20 mm in diameter, sensitivity decreased by 8% per quintile as breast density increased. These results were computed for soft-tissue lesions; cancers presenting with microcalcifications would likely be detected with higher sensitivity.
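The relationship between an ideal observer's detectability, the Az figure of merit, and sensitivity at a fixed specificity follows from standard signal-detection theory under an equal-variance Gaussian (binormal) model: Az = Φ(d′/√2), and the sensitivity at a given specificity is Φ(d′ − Φ⁻¹(specificity)). The sketch below illustrates these relationships only; the detectability values `d` are hypothetical and are not taken from the study.

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def az_from_dprime(d):
    """Az for an equal-variance Gaussian observer: Az = Phi(d'/sqrt(2))."""
    return N.cdf(d / 2 ** 0.5)

def sensitivity_at_specificity(d, specificity=0.90):
    """Sensitivity (true-positive fraction) when the decision threshold
    is set so that the false-positive fraction equals 1 - specificity."""
    z = N.inv_cdf(specificity)  # threshold in noise-only standard deviations
    return N.cdf(d - z)

# Hypothetical detectability values, chosen only for illustration
for d in (1.9, 1.2):
    print(f"d'={d}: Az={az_from_dprime(d):.3f}, "
          f"sens@90%spec={sensitivity_at_specificity(d):.3f}")
```

Note that a higher d′ raises both Az and the sensitivity achievable at any fixed specificity, which is why a density-related loss of lesion contrast degrades both measures together.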
Keywords
- Breast density
- Computer simulation
- Ideal observer
- Receiver operating characteristic (ROC)
ASJC Scopus subject areas
- Radiology, Nuclear Medicine and Imaging
- Obstetrics and Gynecology