Optimising a 3D convolutional neural network for head and neck computed tomography segmentation with limited training data

Research output: Contribution to journal › Article › peer-review

Abstract

Background and Purpose: Convolutional neural networks (CNNs) are increasingly used to automate segmentation for radiotherapy planning, where accurate segmentation of organs-at-risk (OARs) is crucial. Training CNNs typically requires large amounts of data, yet large, high-quality datasets are scarce. The aim of this study was to develop a CNN capable of accurate 3D auto-segmentation of head and neck (HN) planning CT scans using a small training dataset (34 CTs).
Materials and Methods: Elements of our custom CNN architecture were varied to optimise segmentation performance. We tested and evaluated the impact of: using multiple contrast channels for the CT scan input at specific soft tissue and bony anatomy windows, resize vs. transpose convolutions, and loss functions based on overlap metrics and cross entropy in different combinations. Model segmentation performance was compared with the inter-observer deviation of two doctors’ gold standard segmentations using the 95th percentile Hausdorff distance and mean distance-to-agreement (mDTA). The best performing configuration was further validated on a popular public dataset to compare with state-of-the-art (SOTA) auto-segmentation methods.
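To illustrate the multi-window input described above, the sketch below (Python/NumPy) stacks a soft tissue window and a bone window of the same CT volume as separate input channels. The window centres and widths shown are common clinical defaults, not values taken from the paper:

```python
import numpy as np

def window_ct(hu, centre, width):
    """Clip a CT volume (in Hounsfield units) to a display window
    and rescale it to [0, 1]."""
    lo, hi = centre - width / 2, centre + width / 2
    return (np.clip(hu, lo, hi) - lo) / (hi - lo)

def multi_window_channels(hu):
    """Stack soft tissue and bone windows of the same volume as
    separate input channels, shape (channels, z, y, x)."""
    soft = window_ct(hu, centre=40, width=400)    # soft tissue window
    bone = window_ct(hu, centre=400, width=1800)  # bony anatomy window
    return np.stack([soft, bone], axis=0)
```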
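The resize vs. transpose convolution comparison concerns how the decoder upsamples feature maps. A minimal PyTorch sketch of the two alternatives (the channel counts are illustrative, not the paper's architecture):

```python
import torch.nn as nn

# Transpose convolution: learned upsampling in a single operation.
up_transpose = nn.ConvTranspose3d(64, 32, kernel_size=2, stride=2)

# Resize convolution: fixed trilinear interpolation followed by a
# standard convolution, often preferred to reduce checkerboard artefacts.
up_resize = nn.Sequential(
    nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False),
    nn.Conv3d(64, 32, kernel_size=3, padding=1),
)
```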
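One common way to combine an overlap-based loss with cross entropy is a weighted sum of soft Dice loss and voxel-wise cross entropy. The sketch below assumes this form; the paper's exact formulation and weighting may differ:

```python
import torch
import torch.nn.functional as F

def dice_ce_loss(logits, target, ce_weight=0.5, eps=1e-6):
    """Weighted sum of soft Dice loss (an overlap metric) and cross entropy.
    logits: (N, C, D, H, W) raw network outputs
    target: (N, D, H, W) integer class labels
    """
    ce = F.cross_entropy(logits, target)
    probs = F.softmax(logits, dim=1)
    onehot = F.one_hot(target, num_classes=logits.shape[1])
    onehot = onehot.permute(0, 4, 1, 2, 3).float()  # to (N, C, D, H, W)
    dims = (0, 2, 3, 4)  # sum over batch and spatial dimensions
    intersection = (probs * onehot).sum(dims)
    cardinality = probs.sum(dims) + onehot.sum(dims)
    dice = (2 * intersection + eps) / (cardinality + eps)
    return ce_weight * ce + (1 - ce_weight) * (1 - dice.mean())
```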
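Both evaluation metrics can be computed from surface-to-surface distances. The sketch below uses one common definition of the 95th percentile Hausdorff distance and of mDTA (as the average symmetric surface distance); the paper's implementation may differ in detail:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def surface_distances(a, b, spacing):
    """Distances (mm) from each surface voxel of boolean mask `a`
    to the surface of boolean mask `b`."""
    a_surf = a ^ binary_erosion(a)
    b_surf = b ^ binary_erosion(b)
    # Euclidean distance from every voxel to the nearest b-surface voxel.
    dist_to_b = distance_transform_edt(~b_surf, sampling=spacing)
    return dist_to_b[a_surf]

def hd95_and_mdta(pred, gt, spacing=(1.0, 1.0, 1.0)):
    """95th percentile Hausdorff distance and mean distance-to-agreement
    between two binary segmentation masks."""
    d_pg = surface_distances(pred, gt, spacing)
    d_gp = surface_distances(gt, pred, spacing)
    hd95 = max(np.percentile(d_pg, 95), np.percentile(d_gp, 95))
    mdta = (d_pg.sum() + d_gp.sum()) / (d_pg.size + d_gp.size)
    return hd95, mdta
```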
Results: Our best performing CNN configuration was competitive with current SOTA methods when evaluated on the public dataset, with mDTA of (0.81 ± 0.31) mm for the brainstem, (0.20 ± 0.08) mm for the mandible, (0.77 ± 0.14) mm for the left parotid and (0.81 ± 0.28) mm for the right parotid.
Conclusions: Through careful tuning and customisation, we trained a 3D CNN on a small dataset to produce segmentations of HN OARs with an accuracy comparable with inter-clinician deviations. Our proposed model performed competitively with current SOTA methods.

Bibliographical metadata

Original language: English
Journal: Physics and Imaging in Radiation Oncology
Publication status: Accepted/In press - 20 Apr 2022