Feb. 13, 2021
Autoimmune pancreatitis (AIP) is a benign autoinflammatory condition characterized by lymphoplasmacytic infiltration and fibrosis. Diagnosing this condition has proved challenging, and diagnostic delays or errors can result in late or unwarranted therapy. Differentiating AIP from pancreatic ductal adenocarcinoma (PDAC) can be difficult due to similarities in clinical presentation and in sonographic and cross-sectional imaging findings. Algorithms such as the HISORt criteria (which are based on histologic findings, imaging, serology, other organ involvement and response to steroid therapy) can assist in making a diagnosis, but they also have limitations.
To address these challenges, a team of researchers sought to create and validate a convolutional neural network (CNN) model trained to perform real-time analysis of endoscopic ultrasound (EUS) video examinations. The CNN model was designed to differentiate AIP from PDAC, chronic pancreatitis (CP) and normal pancreas (NP). The results from this study were published in Gut in 2020. The article's first author, gastroenterology researcher Neil B. Marya, M.D., and corresponding author, gastroenterologist Michael J. Levy, M.D., are from Mayo Clinic's campus in Rochester, Minnesota.
Deep learning models using multilayered neural networks can efficiently analyze large data sets and provide sophisticated insights across a wide range of applications. Drs. Marya and Levy and co-authors note that CNNs have been used within gastroenterology and in other disciplines, and some have shown some promise in the endosonographic assessment of pancreatic disorders.
The researchers developed a CNN model using an extensive database of still image and video data. A total of 1,174,461 unique images obtained from 643 EUS examinations of 583 patients (146 with AIP, 292 with PDAC, 72 with CP and 73 with NP) yielded 4,945 physician-captured still (PCS) image assets and 1,852 video assets used to generate the data set. To identify and validate both the sonographic features most valued by the CNN model and the ability of the model to differentiate AIP from PDAC, the researchers also generated and analyzed occlusion heatmap visualizations from the EUS image database.
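Occlusion heatmap analysis of the kind described above works by systematically masking regions of an input image and measuring how much the model's predicted probability changes; regions whose occlusion causes a large drop are the ones the model relies on most. The study's actual implementation is not published here, so the following is only a minimal illustrative sketch: `model_fn` is a hypothetical stand-in for any function mapping an image to a class probability, and the patch size, stride and baseline value are arbitrary choices.

```python
import numpy as np

def occlusion_heatmap(model_fn, image, patch=16, stride=8, baseline=0.0):
    """Slide an occluding patch across the image and record the drop in
    the model's predicted probability for the target class. Large values
    in the returned heatmap mark regions the model depends on most.

    model_fn : callable taking a 2-D image array, returning a probability.
    """
    h, w = image.shape[:2]
    base_prob = model_fn(image)  # probability on the unoccluded image
    rows = (h - patch) // stride + 1
    cols = (w - patch) // stride + 1
    heat = np.zeros((rows, cols))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = baseline  # mask one region
            heat[i, j] = base_prob - model_fn(occluded)    # probability drop
    return heat
```

In practice the heatmap is upsampled and overlaid on the original EUS frame, which is how visualizations like those described in the study highlight features such as a dilated pancreatic duct.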
CNN model performance results
Overall, the researchers noted that the CNN model accurately differentiated AIP from PDAC and other benign pancreatic conditions. Additionally, the occlusion heatmap analysis identified discriminating sonographic features of AIP and PDAC, including the presence of a dilated pancreatic duct in PDAC patients.
Probability of autoimmune pancreatitis and pancreatic ductal adenocarcinoma based on image analysis
These images highlight representative endoscopic ultrasound images of true autoimmune pancreatitis and true pancreatic ductal adenocarcinoma cases across a range of probability scores produced by convolutional neural network model analysis. Image reprinted with permission from Gut.
The researchers evaluated the CNN model's performance in classifying pancreatic pathology in two phases. During the first "all-images" phase, they looked at how well the CNN model analyzed the images from both PCS and video assets. During the second "video-only" phase, they looked at how well the CNN model analyzed only continuous images from video assets with at least 100 frames.
During the all-images phase, the CNN was:
- 95% sensitive, 90% specific, with an area under the receiver operating characteristic curve (AUROC) of 0.977; 95% confidence interval (CI), 0.976 to 0.978 when distinguishing AIP from NP
- 90% sensitive, 59% specific, with an AUROC of 0.869; 95% CI, 0.867 to 0.871 when distinguishing AIP from CP
- 90% sensitive, 87% specific, with an AUROC of 0.950; 95% CI, 0.949 to 0.951 when distinguishing AIP from PDAC
- 90% sensitive, 78% specific, with an AUROC of 0.922; 95% CI, 0.921 to 0.923 when distinguishing AIP from all studied conditions (PDAC, CP and NP)
During the video-only phase, the CNN processed 955 EUS frames per second and was:
- 99% sensitive, 98% specific, with an AUROC of 0.992; 95% CI, 0.976 to 1.0 when distinguishing AIP from NP
- 94% sensitive, 71% specific, with an AUROC of 0.892; 95% CI, 0.829 to 0.946 when distinguishing AIP from CP
- 90% sensitive and 93% specific, with an AUROC of 0.963; 95% CI, 0.941 to 0.981 when distinguishing AIP from PDAC
- 90% sensitive and 85% specific, with an AUROC of 0.946; 95% CI, 0.919 to 0.967 when distinguishing AIP from all other studied conditions (PDAC, CP and NP)
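The metrics reported above are standard classification measures: sensitivity is the fraction of true positives detected, specificity the fraction of true negatives correctly ruled out, and the AUROC summarizes performance across all probability thresholds. A minimal sketch of how such figures are computed from model outputs follows; the function names are hypothetical, and the AUROC uses the rank-sum (Mann-Whitney U) formulation, which assumes no tied scores.

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP),
    for binary labels (1 = disease present, 0 = absent)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    return tp / (tp + fn), tn / (tn + fp)

def auroc(y_true, scores):
    """AUROC via the rank-sum identity: the probability that a randomly
    chosen positive case scores higher than a randomly chosen negative."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)  # ranks, ascending score
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

An AUROC of 1.0 indicates perfect separation of the two classes at some threshold, while 0.5 is no better than chance; the video-only AUROCs of 0.892 to 0.992 reported above therefore reflect strong discrimination.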
According to Drs. Marya and Levy, these results demonstrate that the EUS-CNN model they developed has the potential to help clinicians make more accurate and timely diagnoses, facilitating earlier targeted therapeutic intervention and improved patient outcomes.
"Our newly developed and validated artificial intelligence model will provide key additional information and interpretive analysis of pancreatic masses, both benign and malignant, that are often difficult to distinguish," explains Dr. Levy. "This improved diagnostic accuracy is essential for guiding the management of patients with pancreatic cancer, as studies demonstrate that early detection and early treatment can lead to more-timely therapeutic intervention and optimized outcomes. This tool is important not only for our Mayo Clinic practitioners and patients but also for practitioners who see such patients outside of a tertiary care setting."
Study limitations and next steps
The researchers note that additional steps are needed to address study limitations and to optimize the use of this CNN model in a clinical setting.
"Additional work underway now will eventually allow for real-time clinical implementation," explains Dr. Marya. "We are working to further optimize the performance of our AI model by increasing the volume of exams used to train the model. We are also seeking to expand the platform of our training model to include different imaging modalities such as transabdominal ultrasound, cholangiograms and cross-sectional imaging. And we are looking to generate EUS-based AI models that are capable of helping us solve other gastrointestinal or pancreaticobiliary clinical dilemmas."
Drs. Levy and Marya and co-authors published results from another study in Gastrointestinal Endoscopy in 2020 demonstrating the capability of an EUS-based artificial intelligence model to differentiate malignant and benign focal liver lesions.
For more information
Marya NB, et al. Utilisation of artificial intelligence for the development of an EUS-convolutional neural network model trained to enhance the diagnosis of autoimmune pancreatitis. Gut. In press.
Marya NB, et al. Application of artificial intelligence using a novel EUS-based convolutional neural network model to identify and distinguish benign and malignant hepatic masses. Gastrointestinal Endoscopy. In press.