‘You Only Look Once’ helps detect, classify lesions on deceptively normal screening mammograms
Researchers have combined three emerging technologies to detect and classify breast cancers found in follow-up imaging of women whose recent screening mammography was deemed normal.
The team proposes their innovative use of the technologies—deep learning, digital image processing and image-to-image translation—as a way to alert radiologists when breast tissue appears normal at screening but is, in fact, at heightened risk for soon developing cancer.
The work was carried out at the University of Louisville and is described in the June edition of Computer Methods and Programs in Biomedicine [1].
Computer scientists and engineers, including PhD candidate Asma Baccouche and Adel Elmaghraby, PhD, built and tested their system using several hundred DICOM images from a private digital mammography database.
They paired the You Only Look Once (aka “YOLO”) object-detection framework with two existing image-to-image translation models, CycleGAN and Pix2Pix, which they used to render synthetic mammograms from priors. With these they customized models to detect and classify masses, calcifications and architectural distortions as well as identify normal tissue.
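To make the two-stage idea concrete, below is a minimal, illustrative PyTorch sketch of such a pipeline: a small encoder-decoder stands in for a CycleGAN/Pix2Pix-style generator that synthesizes a “current-like” image from a prior, and a toy single-box head stands in for a YOLO-style detector over the study’s four finding classes. The module names, layer sizes and class list are assumptions made for illustration, not the authors’ implementation.

```python
# Hypothetical two-stage pipeline sketch: an image-to-image translation
# generator (stand-in for CycleGAN/Pix2Pix) synthesizes a "current-like"
# mammogram from a prior, then a YOLO-style detector head classifies findings.
# All module names, shapes and classes here are illustrative, not the authors' code.
import torch
import torch.nn as nn

CLASSES = ["mass", "calcification", "architectural_distortion", "normal"]

class TinyGenerator(nn.Module):
    """Toy encoder-decoder standing in for a Pix2Pix/CycleGAN generator."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, prior):
        return self.net(prior)  # synthetic "current" mammogram

class TinyDetector(nn.Module):
    """Toy single-cell detection head standing in for a YOLO model:
    predicts one box (x, y, w, h), an objectness score, and class scores."""
    def __init__(self, n_classes=len(CLASSES)):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(8, 5 + n_classes)

    def forward(self, img):
        out = self.head(self.backbone(img))
        box, obj, cls = out[:, :4], out[:, 4], out[:, 5:]
        return box, obj.sigmoid(), cls.softmax(dim=1)

prior = torch.rand(1, 1, 256, 256)            # prior screening mammogram (toy data)
synthetic_current = TinyGenerator()(prior)    # image-to-image translation step
box, objectness, class_probs = TinyDetector()(synthetic_current)
print("predicted class:", CLASSES[class_probs.argmax(dim=1).item()])
```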
For the study, all prior mammograms, most of which were acquired one year before the follow-up, had been reported as normal. All the follow-ups, or “current” studies, were either biopsy-confirmed cancerous or healthy.
Baccouche and colleagues found their system detected and classified breast lesions on the current mammograms with as high as 93% accuracy for masses, 88% for calcifications and 95% for architectural distortions. The system correctly classified normal mammograms with 92% accuracy for current exams and 90% for priors.
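As a point of reference for how per-class figures of this kind are typically tallied, the short sketch below computes per-class accuracy from predicted versus ground-truth labels. The labels shown are hypothetical; the study’s reported numbers were obtained on the authors’ private dataset and are not reproduced here.

```python
# Illustrative per-class accuracy computation on hypothetical labels;
# not the authors' evaluation code or data.
from collections import defaultdict

def per_class_accuracy(y_true, y_pred):
    correct, total = defaultdict(int), defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        correct[t] += int(t == p)
    return {c: correct[c] / total[c] for c in total}

# Hypothetical toy labels, just to show the calculation
y_true = ["mass", "mass", "calcification", "normal", "architectural_distortion"]
y_pred = ["mass", "normal", "calcification", "normal", "architectural_distortion"]
print(per_class_accuracy(y_true, y_pred))
```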
Noting the present study builds on the team’s previous research, the authors state they continued the project to “integrate the prior mammograms from all used follow-up screenings and provide an early detection and classification on initial screened mammograms.”
The new work, they add, “emphasize[s] the ability of a possible retrospective prediction on prior mammograms that were diagnosed as normal but at a later stage, they were reported with a clear presence and progress of abnormal findings.”
Baccouche and co-authors cite prior research showing that, with increased follow-up and screening during diagnostic workup, close to 50% of prior mammograms are retrospectively found to contain lesions that were initially missed.
More from their discussion section:
“[T]he contribution of this paper could be utilized to screen prior mammograms and detect those with the highest abnormal risk of breast cancer. Consequently, it will provide a warning signal for radiologists to forecast and anticipate the cancer progress.”
Reference:
- [1] Asma Baccouche, Adel Elmaghraby, et al., “Early detection and classification of abnormality in prior mammograms using image-to-image translation and YOLO techniques.” Computer Methods and Programs in Biomedicine, June 2022. DOI: https://doi.org/10.1016/j.cmpb.2022.106884