An artificial intelligence (AI) tool – trained on roughly a million screening mammography images – identified breast cancer with approximately 90 percent accuracy when combined with analysis by radiologists, a new study finds.
Led by researchers from NYU School of Medicine and the NYU Center for Data Science, the study examined the ability of a type of AI, a machine learning computer program, to add value to the diagnoses reached by a group of 14 radiologists as they reviewed 720 mammogram images.
“Our study found that AI identified cancer-related patterns in the data that radiologists could not, and vice versa,” says senior study author Krzysztof J. Geras, Ph.D., assistant professor in the Department of Radiology at NYU Langone.
“AI detected pixel-level changes in tissue invisible to the human eye, while humans used forms of reasoning not available to AI,” adds Dr. Geras, also an affiliated faculty member at the NYU Center for Data Science.
“The ultimate goal of our work is to augment, not replace, human radiologists.”
In 2014, more than 39 million mammography exams were performed in the United States to screen women (without symptoms) for breast cancer and determine those in need of closer follow-up. Women whose test results yield abnormal mammography findings are referred for biopsy, a procedure that removes a small sample of breast tissue for laboratory testing.
A New Tool to Identify Breast Cancer
In the new study, the research team designed statistical techniques that let their program “learn” how to get better at a task without being told exactly how.
Such programs build mathematical models that enable decision-making based on data examples fed into them, with the program getting “smarter” as it reviews more and more data.
Modern AI approaches, inspired by the human brain, use complex circuits to process information in layers, with each step feeding information into the next, and assigning more or less importance to each piece of information along the way.
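To make the idea of layered processing concrete, the sketch below is a minimal, hypothetical illustration (not the study’s actual model): an input passes through two small layers, and the learned weights determine how much importance each incoming piece of information receives before it feeds into the next step.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Non-linearity applied between layers.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes the final score into a 0-1 "probability of malignancy".
    return 1.0 / (1.0 + np.exp(-x))

# Toy input: 8 image-derived features (purely illustrative values).
x = rng.normal(size=8)

# Layer weights: each entry encodes how much importance the next layer
# assigns to each incoming piece of information.
W1, b1 = rng.normal(size=(16, 8)), np.zeros(16)   # layer 1: 8 -> 16
W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)    # layer 2: 16 -> 1

h = relu(W1 @ x + b1)       # first layer feeds its output into the second
p = sigmoid(W2 @ h + b2)    # final layer produces a single score
print(f"predicted probability of malignancy: {p[0]:.3f}")
```

In training, the weights in W1 and W2 would be adjusted from labeled examples; the untrained random weights here are only meant to show the structure.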
In the current study, published online recently in the journal IEEE Transactions on Medical Imaging, the authors trained their AI tool on many images matched with the results of biopsies performed in the past.
Their goal was to enable the tool to help radiologists reduce the number of biopsies needed moving forward.
This can only be achieved, says Dr. Geras, by increasing the confidence that physicians have in the accuracy of screening-exam assessments (for example, by reducing false-positive and false-negative results).
For the current study, the research team analyzed images that had been collected as part of routine clinical care at NYU Langone Health over seven years, sifting through the collected data and connecting the images with biopsy results.
This effort created an extraordinarily large dataset for their AI tool to train on, say the authors, consisting of 229,426 digital screening mammography exams and 1,001,093 images.
Most databases used in studies to date have been limited to 10,000 images or fewer.
The researchers trained their neural network on images from the database for which cancer diagnoses had already been determined.
This meant that the researchers knew the “truth” for each mammography image (cancer or not) as they tested the tool’s accuracy, while the tool had to guess. Accuracy was measured as the frequency of correct predictions.
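As a simple illustration of that accuracy measure, the snippet below compares a set of invented model guesses against known biopsy outcomes (the values are made up, not the study’s data); accuracy is just the fraction of exams on which the guess matches the truth.

```python
# Hypothetical labels (1 = cancer confirmed by biopsy, 0 = no cancer)
# and model guesses for ten exams; values are invented for illustration.
truth   = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
guesses = [1, 0, 1, 1, 0, 0, 0, 0, 1, 0]

correct = sum(t == g for t, g in zip(truth, guesses))
accuracy = correct / len(truth)
print(f"accuracy = {correct}/{len(truth)} = {accuracy:.0%}")  # 80%
```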
In addition, the researchers designed their AI model to first consider very small patches of the full-resolution image separately, creating a heat map, a statistical picture of disease likelihood.
The program then considers the entire breast for structural features linked to cancer, paying closer attention to the areas flagged in the pixel-level heat map.
Rather than have the researchers specify which image features to search for, the tool discovers on its own which image features increase prediction accuracy.
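The sketch below is a simplified, hypothetical illustration of the patch-then-whole-image idea: each small patch of the image is scored and the scores are assembled into a coarse heat map, which a second, whole-image stage could then use to decide where to focus. The patch size, scoring function, and threshold are placeholders, not the study’s actual architecture.

```python
import numpy as np

def patch_score(patch):
    # Placeholder patch scorer: in a real model, a neural network would
    # estimate the likelihood of disease within this patch.
    return float(patch.mean())

def heat_map(image, patch=32, stride=32):
    # Slide over the full-resolution image, score each small patch, and
    # assemble the scores into a coarse map of disease likelihood.
    h, w = image.shape
    rows, cols = h // stride, w // stride
    heat = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            p = image[i * stride:i * stride + patch,
                      j * stride:j * stride + patch]
            heat[i, j] = patch_score(p)
    return heat

# Toy "mammogram": random pixel values standing in for a real image.
image = np.random.rand(256, 256)
heat = heat_map(image)

# A second, whole-image model would then weigh global structure while
# paying closer attention to the regions the heat map flags.
flagged = np.argwhere(heat > heat.mean() + heat.std())
print(f"heat map shape: {heat.shape}, flagged regions: {len(flagged)}")
```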
Moving forward, the team plans to further increase this accuracy by training the AI program on more data, perhaps even identifying changes in breast tissue that are not yet cancerous but have the potential to be.
“The transition to AI support in diagnostic radiology should proceed like the adoption of self-driving cars—slowly and carefully, building trust, and improving systems along the way with a focus on safety,” says first author Nan Wu, a doctoral candidate at the NYU Center for Data Science.
Artificial intelligence (AI) has gained extensive attention for its excellent performance in image-recognition tasks and is increasingly applied in breast ultrasound. AI can conduct a quantitative assessment by recognizing imaging information automatically and make more accurate and reproducible imaging diagnoses.
Breast cancer is the most commonly diagnosed cancer in women and a serious threat to women’s health, and its early screening is closely related to patient prognosis.
Therefore, the use of AI in breast cancer screening and detection is of great significance: it can not only save radiologists time but also compensate for the limited experience and skill of beginners.
This article outlines the basic technical knowledge regarding AI in breast ultrasound, including early machine learning algorithms and deep learning algorithms, and their application in the differential diagnosis of benign and malignant masses. Finally, we discuss the future perspectives of AI in breast ultrasound.
INTRODUCTION
Breast cancer is the most common malignant tumor and the second leading cause of cancer death among women in the United States[1]. In recent years, the incidence and mortality of breast cancer have increased year by year[2,3]. Mortality can be reduced by early detection and timely therapy. Therefore, its early and correct diagnosis has received significant attention. There are several predominant diagnostic methods for breast cancer, such as X-ray mammography, ultrasound, and magnetic resonance imaging (MRI).
Ultrasound is a first-line imaging tool for breast lesion characterization owing to its high availability, cost-effectiveness, acceptable diagnostic performance, and noninvasive, real-time capabilities. In addition to B-mode ultrasound, newer techniques such as color Doppler, spectral Doppler, contrast-enhanced ultrasound, and elastography can help operators obtain more accurate information. However, ultrasound remains operator dependent[4].
In recent years, artificial intelligence (AI), particularly deep learning (DL) algorithms, has gained extensive attention for its excellent performance in image-recognition tasks. AI can make a quantitative assessment by recognizing imaging information automatically, thereby improving the performance of ultrasound in imaging breast lesions[5].
The use of AI in breast ultrasound has also been combined with other novel technologies, such as ultrasound radiofrequency (RF) time-series analysis[6], multimodality GPU-based computer-assisted diagnosis of breast cancer using ultrasound and digital mammography images[7], optical breast imaging[8,9], QT-based breast tissue volume imaging[10], and automated breast volume scanning (ABVS)[11].
So far, most studies on the use of AI in breast ultrasound have focused on the differentiation of benign and malignant breast masses based on the B-mode ultrasound features of the masses. There is a need for a review to summarize the current status and future perspectives of the use of AI in breast ultrasound. In this paper, we introduce the applications of AI for breast mass detection and diagnosis with ultrasound.
AI EQUIPPED IN ULTRASOUND SYSTEM
Images are usually uploaded from the ultrasound machine to a workstation for image post-processing, whereas a DL technique (S-detect) can directly identify and mark breast masses on the ultrasound system.
S-detect is a tool built into the Samsung RS80A ultrasound system; based on a DL algorithm, it performs lesion segmentation, feature analysis, and description according to the BI-RADS 2003 or BI-RADS 2013 lexicon.
It can give an immediate judgment of benignity or malignancy on frozen images on the ultrasound machine after the region of interest (ROI) is selected automatically or manually (Figure 3).
Kim et al[31] evaluated the diagnostic performance of S-detect for differentiating benign from malignant breast lesions. When the cutoff was set at BI-RADS category 4a, the specificity, PPV, and accuracy were significantly higher for S-detect than for the radiologist (P < 0.05 for all), and the AUC was 0.725 vs 0.653 (P = 0.038).

Figure 3. S-detect technique in the Samsung RS80A ultrasound system. A and B: In a 47-year-old woman with left invasive breast cancer on B-mode ultrasound (A), S-detect correctly concluded that the lesion was “Possibly Malignant” based on the lesion features listed in the right column (B); C and D: In a 55-year-old woman with a fibroadenoma of the left breast on B-mode ultrasound (C), S-detect correctly concluded that the lesion was “Possibly Benign” based on the lesion features listed in the right column (D).
Di Segni et al[32] also evaluated the diagnostic performance of S-detect in the assessment of focal breast lesions. S-detect showed a sensitivity > 90% and a specificity of 70.8%, with inter-rater agreement ranging from moderate to good. S-detect may be a feasible tool for the characterization of breast lesions and may assist physicians in making clinical decisions.
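For readers unfamiliar with these metrics, the sketch below computes sensitivity, specificity, PPV, and accuracy from a confusion matrix once a cutoff (such as BI-RADS category 4a) has been applied; the counts are invented for illustration and are not data from either study. AUC, by contrast, summarizes performance across all possible cutoffs rather than at a single one.

```python
# Hypothetical counts after dichotomizing assessments at a cutoff
# (e.g., BI-RADS >= 4a called "positive"); not data from Kim et al
# or Di Segni et al.
tp, fp, tn, fn = 45, 20, 80, 5

sensitivity = tp / (tp + fn)                  # true positives among all malignant lesions
specificity = tn / (tn + fp)                  # true negatives among all benign lesions
ppv         = tp / (tp + fp)                  # positive predictive value
accuracy    = (tp + tn) / (tp + fp + tn + fn)

print(f"sensitivity = {sensitivity:.3f}")
print(f"specificity = {specificity:.3f}")
print(f"PPV         = {ppv:.3f}")
print(f"accuracy    = {accuracy:.3f}")
```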
CONCLUSION
AI has been increasingly applied in ultrasound and has proved to be a powerful tool that provides reliable diagnoses with higher accuracy and efficiency and reduces the workload of physicians.
AI can be roughly divided into early machine learning, which relies on manually designed inputs and algorithms, and DL, with which the software can learn on its own. There are still no guidelines recommending the application of AI with ultrasound in clinical practice, and more studies are required to explore more advanced methods and to prove their usefulness.
In the near future, we believe that AI in breast ultrasound will not only distinguish between benign and malignant breast masses, but also further classify specific benign diseases, such as inflammatory breast masses and fibroplasia.
In addition, AI in ultrasound may also predict Tumor Node Metastasis classification[33], prognosis, and treatment response for patients with breast cancer. Last but not least, the ability of AI on ultrasound to differentiate benign from malignant breast lesions may be based not only on B-mode ultrasound images but could also draw on images from other advanced techniques, such as ABVS, elastography, and contrast-enhanced ultrasound.
More information: Nan Wu et al, Deep Neural Networks Improve Radiologists’ Performance in Breast Cancer Screening, IEEE Transactions on Medical Imaging (2019). DOI: 10.1109/TMI.2019.2945514
Provided by NYU Langone Health