Deep convolutional framelets for dose reconstruction in BNCT with Compton camera detector
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 24.09.2024 |
DOI | 10.48550/arxiv.2409.15916 |
Summary: Boron Neutron Capture Therapy (BNCT) is an innovative binary form of radiation therapy with high selectivity towards cancer tissue, based on the neutron capture reaction $^{10}$B(n,$\alpha$)$^{7}$Li. It consists in exposing patients to neutron beams after administration of a boron compound that accumulates preferentially in cancer cells. The high linear energy transfer products of the ensuing reaction deposit their energy at the cellular level, sparing normal tissue. Although progress in accelerator-based BNCT has led to renewed interest in this cancer treatment modality, in vivo dose monitoring during treatment remains infeasible, and several approaches are under investigation. While Compton imaging presents various advantages over other imaging methods, it typically requires long reconstruction times, comparable with the BNCT treatment duration. This study aims to develop deep neural network models that estimate the dose distribution from a simulated dataset of BNCT Compton camera images. The models aim to avoid the iteration time associated with the maximum-likelihood expectation-maximization (MLEM) algorithm, enabling prompt dose reconstruction during the treatment. The U-Net architecture and two variants based on the deep convolutional framelets framework are used to reduce noise and artifacts in images reconstructed with few iterations, yielding promising results in terms of reconstruction accuracy and processing time.
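
For context, the iteration cost mentioned in the summary comes from the standard MLEM update used in emission-type reconstruction (the paper's exact formulation for Compton camera data may differ); each iteration requires a full forward projection and back projection through the system matrix:

$$\lambda_j^{(k+1)} = \frac{\lambda_j^{(k)}}{\sum_i a_{ij}} \sum_i a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, \lambda_{j'}^{(k)}},$$

where $y_i$ are the measured counts in detector bin $i$, $a_{ij}$ is the system matrix, and $\lambda_j^{(k)}$ is the image estimate at iteration $k$. Running many such iterations is what makes reconstruction times comparable to the treatment duration, and why stopping after a few iterations and denoising the result is attractive.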
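
Below is a minimal sketch of the kind of denoising network the summary describes, assuming PyTorch; the one-level depth, channel widths, and residual output are illustrative choices, not the authors' configuration.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """One-level U-Net: encoder, bottleneck, decoder with a skip connection."""
    def __init__(self, channels=1, width=32):
        super().__init__()
        self.enc = conv_block(channels, width)
        self.down = nn.MaxPool2d(2)
        self.bottleneck = conv_block(width, width * 2)
        self.up = nn.ConvTranspose2d(width * 2, width, kernel_size=2, stride=2)
        self.dec = conv_block(width * 2, width)  # width*2 after skip concat
        self.out = nn.Conv2d(width, channels, kernel_size=1)

    def forward(self, x):
        e = self.enc(x)                    # fine-scale features, kept for skip
        b = self.bottleneck(self.down(e))  # coarse-scale features
        d = self.dec(torch.cat([self.up(b), e], dim=1))
        # Residual output: predict a correction to the noisy input image.
        return x + self.out(d)

# Usage: map a noisy few-iteration MLEM image to a denoised estimate.
net = TinyUNet()
noisy = torch.randn(1, 1, 64, 64)  # placeholder for a reconstructed slice
denoised = net(noisy)
print(denoised.shape)  # torch.Size([1, 1, 64, 64])
```

The residual formulation, where the network learns a correction rather than the image itself, is a common choice for reconstruction post-processing, since a few-iteration MLEM image is already close to the target and only the noise and artifacts need to be removed.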