
Automatic stent recognition using perceptual attention U-net for quantitative intrafraction motion monitoring in pancreatic cancer radiotherapy.

Abstract

PURPOSE:
A stent is often used as an internal surrogate to monitor intrafraction tumor motion during pancreatic cancer radiotherapy. Based on the stent contours generated from planning CT images, the current intrafraction motion review (IMR) system on the Varian TrueBeam provides only a tool to verify stent motion visually and lacks quantitative information. The purpose of this study is to develop an automatic stent recognition method for quantitative intrafraction tumor motion monitoring in pancreatic cancer treatment.
METHODS:
A total of 535 IMR images from 14 pancreatic cancer patients were retrospectively selected for this study, with the manual contour of the stent on each image serving as the ground truth. We developed a deep learning-based approach that integrates two mechanisms focusing on the features of the segmentation target. Attention modeling was integrated into the U-net framework to address the optimization difficulties of training a deep network with 2D IMR images and limited training data. A perceptual loss was combined with a binary cross-entropy loss and a Dice loss for supervision. The deep neural network was trained to capture more contextual information and predict binary stent masks. A random-split test was performed, with images of ten patients (71%, 380 images) randomly selected for training, while the remaining four patients (29%, 155 images) were used for testing. Sevenfold cross-validation of the proposed perceptual attention U-net (PAUnet) on the 14 patients was performed for further evaluation.
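As a rough illustration of the supervision described above, the sketch below combines a binary cross-entropy loss, a Dice loss, and a perceptual loss computed on feature maps of a fixed pretrained network. This is not the authors' implementation: the feature extractor (VGG16 here), the layer cut, and the loss weights are assumptions made only for illustration.

```python
# Illustrative sketch of a BCE + Dice + perceptual loss (not the authors' code).
import torch
import torch.nn as nn
import torchvision.models as models

class CombinedSegLoss(nn.Module):
    def __init__(self, w_bce=1.0, w_dice=1.0, w_perc=0.1):  # weights are placeholders
        super().__init__()
        self.bce = nn.BCELoss()
        # Assumed feature extractor for the perceptual term: early VGG16 layers, frozen.
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:9].eval()
        for p in vgg.parameters():
            p.requires_grad = False
        self.feat = vgg
        self.w_bce, self.w_dice, self.w_perc = w_bce, w_dice, w_perc

    def dice_loss(self, pred, target, eps=1e-6):
        inter = (pred * target).sum(dim=(1, 2, 3))
        union = pred.sum(dim=(1, 2, 3)) + target.sum(dim=(1, 2, 3))
        return 1.0 - ((2 * inter + eps) / (union + eps)).mean()

    def perceptual_loss(self, pred, target):
        # Replicate single-channel masks to 3 channels for the RGB feature extractor.
        pred3, tgt3 = pred.repeat(1, 3, 1, 1), target.repeat(1, 3, 1, 1)
        return nn.functional.l1_loss(self.feat(pred3), self.feat(tgt3))

    def forward(self, pred, target):
        # pred: sigmoid probabilities in [0, 1], shape (N, 1, H, W); target: binary stent masks.
        return (self.w_bce * self.bce(pred, target)
                + self.w_dice * self.dice_loss(pred, target)
                + self.w_perc * self.perceptual_loss(pred, target))
```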
RESULTS:
Our stent segmentation results were compared with the manually segmented contours. For the random-split test, the trained model achieved a mean (±standard deviation) stent Dice similarity coefficient (DSC), 95% Hausdorff distance (HD95), center-of-mass distance (CMD), and volume difference ($Vol_{diff}$) of 0.96 (±0.01), 1.01 (±0.55) mm, 0.66 (±0.46) mm, and 3.07% (±2.37%), respectively. The sevenfold cross-validation of the proposed PAUnet yielded means (±standard deviation) of 0.96 (±0.02), 0.72 (±0.49) mm, 0.85 (±0.96) mm, and 3.47% (±3.27%) for the DSC, HD95, CMD, and $Vol_{diff}$, respectively.
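For readers unfamiliar with these metrics, the sketch below shows one common way to compute DSC, HD95, CMD, and the relative volume (area) difference for a predicted versus ground-truth binary stent mask on a 2D image. It is not the authors' evaluation code; the pixel spacing argument and the surface-based HD95 formulation are assumptions.

```python
# Illustrative metric computations for 2D boolean masks (not the authors' code).
import numpy as np
from scipy import ndimage

def dsc(pred, gt):
    # Dice similarity coefficient: 2|A∩B| / (|A| + |B|).
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def _surface(mask):
    # Boundary pixels: foreground pixels with at least one background neighbor.
    return mask & ~ndimage.binary_erosion(mask)

def hd95(pred, gt, spacing=(1.0, 1.0)):
    # Symmetric 95th-percentile surface distance (in mm, given spacing).
    s_pred, s_gt = _surface(pred), _surface(gt)
    dt_gt = ndimage.distance_transform_edt(~s_gt, sampling=spacing)
    dt_pred = ndimage.distance_transform_edt(~s_pred, sampling=spacing)
    d = np.concatenate([dt_gt[s_pred], dt_pred[s_gt]])
    return np.percentile(d, 95)

def cmd(pred, gt, spacing=(1.0, 1.0)):
    # Euclidean distance between the centers of mass of the two masks.
    c_pred = np.array(ndimage.center_of_mass(pred)) * np.array(spacing)
    c_gt = np.array(ndimage.center_of_mass(gt)) * np.array(spacing)
    return np.linalg.norm(c_pred - c_gt)

def vol_diff(pred, gt):
    # Relative difference in segmented area, expressed as a percentage.
    return 100.0 * abs(pred.sum() - gt.sum()) / gt.sum()
```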
CONCLUSION:
We developed a novel deep learning-based approach to automatically segment the stent from IMR images, demonstrated its clinical feasibility, and validated its accuracy compared to manual segmentation. The proposed technique could be a useful tool for quantitative intrafraction motion monitoring in pancreatic cancer radiotherapy.
Authors: Xiuxiu He, Weixing Cai, Feifei Li, Pengpeng Zhang, Marsha Reyngold, John J Cuaron, Laura I Cerviño, Tianfang Li, Xiang Li
Journal: Medical Physics (Med Phys), Vol. 49, Issue 8, pp. 5283-5293 (Aug 2022). ISSN: 2473-4209 [Electronic]. United States.
PMID: 35524706 (Publication Type: Journal Article)
Copyright © 2022 American Association of Physicists in Medicine.
Topics
  • Attention
  • Humans
  • Image Processing, Computer-Assisted (methods)
  • Pancreatic Neoplasms (diagnostic imaging, radiotherapy)
  • Retrospective Studies
  • Stents
