Show simple item record

dc.contributor.author: Sage, J.
dc.contributor.author: Looney, P.
dc.contributor.author: Chuter, R.
dc.contributor.author: Price, G.
dc.contributor.author: Boukerroui, D.
dc.contributor.author: Balfour, D.
dc.contributor.author: Whitehurst, P.
dc.contributor.author: Gooding, M. J.
dc.date.accessioned: 2022-01-11T11:59:39Z
dc.date.available: 2022-01-11T11:59:39Z
dc.date.issued: 2021
dc.identifier.citation: Sage J, Looney P, Chuter R, Price G, Boukerroui D, Balfour D, et al. Validating CBCT to CT registration QC using an AI generated dataset. Radiotherapy and Oncology. 2021;161:S1399-S401.
dc.identifier.uri: http://hdl.handle.net/10541/624837
dc.description.abstract:
Purpose or Objective
Deformable registration of CBCT to CT for adaptive tracking is difficult due to the limited image quality of CBCT. Patient-specific registration QC is possible by measuring the distance between 3D features in the two datasets that are matched within a local neighbourhood. This local similar points (LSP) method was previously compared to an expert benchmark dataset for CT registration [Sage, AAPM 2020]. No benchmark studies exist for CBCT to CT registration. CycleGAN [Zhu, ICCV 2017] is a generative adversarial network that can be trained to transform images between modalities using unpaired training data. We use CycleGAN to create aligned pseudo-CBCT:CT pairs, to which known deformations can be applied to create benchmark data for validating LSP on CBCT to CT registration.

Material and Methods
From a dataset of 168 thorax patients, 149 cases were randomly selected for model training and 19 for generating benchmark data. Each case consisted of one CT image (Philips) and 1-2 CBCT images (Elekta). A 2D CycleGAN was implemented with U-Net generators and a 70x70-pixel patch discriminator. Training used unpaired 2D slices extracted from the CT and CBCT images and cropped to a central 256x256 square. Following training, the CT images from the benchmark cases were converted to pseudo-CBCT using the network. Deformation vector fields (DVFs) were created by summing randomly generated Gaussian peaks on a 3D grid. Multiple DVFs were generated for each case, varying the peak magnitude (2, 5, 10 mm), width (5, 10 voxels) and the proportion of the image within 2 SD of a peak (20%, 50% and 70%). The DVFs were used to deform the pseudo-CBCT images, resulting in 19 CBCT:CT test pairs per case (18 deformed CBCT images plus one undeformed). The mean displacement error was measured using LSP in each test pair and compared to the mean magnitude of the applied displacement, calculated directly from the DVF.
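The DVF construction described in the Methods (summing randomly placed Gaussian peaks on a 3D grid) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the number of peaks, how peak locations or directions are drawn, or the grid resolution, so `make_dvf` and all its parameters are assumptions.

```python
import numpy as np

def make_dvf(shape, n_peaks=8, magnitude=5.0, width=10.0, seed=0):
    """Sketch of a 3D deformation vector field built by summing randomly
    placed Gaussian peaks (magnitude in mm, width = Gaussian SD in voxels).
    Returns an array of shape (*shape, 3): one displacement vector per voxel.
    All parameter choices here are illustrative, not from the study."""
    rng = np.random.default_rng(seed)
    zz, yy, xx = np.meshgrid(*(np.arange(s) for s in shape), indexing="ij")
    dvf = np.zeros(shape + (3,))
    for _ in range(n_peaks):
        centre = rng.uniform(0, shape, size=3)       # random peak location
        direction = rng.normal(size=3)
        direction /= np.linalg.norm(direction)       # random unit direction
        r2 = (zz - centre[0])**2 + (yy - centre[1])**2 + (xx - centre[2])**2
        bump = magnitude * np.exp(-r2 / (2 * width**2))  # Gaussian peak
        dvf += bump[..., None] * direction           # sum peaks into the field
    return dvf

dvf = make_dvf((32, 32, 32))
# Mean magnitude of the applied displacement, as compared against LSP (mm).
mean_disp = np.linalg.norm(dvf, axis=-1).mean()
```

The comparison in the abstract then reduces to evaluating this mean applied magnitude against the mean displacement LSP reports on the corresponding deformed pair.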
Results
The pseudo-CBCT images visually resembled real CBCT images with regard to noise and shading, as seen in Figure 1. Some features were rendered inaccurately, such as the shading of the vertebral bodies. The mean applied displacement in the generated DVFs ranged from 0.7 to 9.2 mm. Where no deformation was applied to the pseudo-CBCT, LSP measured a mean displacement of 1.1 ± 0.2 mm (SD). There was a strong correlation between inserted and measured displacement, as seen in Figure 2, with a correlation coefficient of 0.98. The average correlation coefficient within patient cases was 0.995 ± 0.002 (SD).

Conclusion
Displacement measured using LSP agreed well with artificially applied displacement for CBCT:CT image pairs. A CycleGAN was successfully used to create cross-domain test pairs, demonstrating that LSP is robust to the types of shading and noise variation observed in CBCT data.
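The deformation step in the Methods (applying a DVF to a pseudo-CBCT volume) can be sketched with backward warping and trilinear interpolation. This is one plausible implementation, assuming the DVF is stored as a dense per-voxel vector array in voxel units; the abstract does not state the representation or interpolation used, and `warp` is an illustrative name.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp(image, dvf):
    """Deform a 3D image with a dense DVF of shape (*image.shape, 3),
    expressed in voxels, via backward mapping: each output voxel samples
    the input at its own position plus the local displacement."""
    grid = np.meshgrid(*(np.arange(s) for s in image.shape), indexing="ij")
    coords = [g + dvf[..., i] for i, g in enumerate(grid)]  # sample positions
    return map_coordinates(image, coords, order=1, mode="nearest")

# A zero DVF reproduces the input, i.e. the "undeformed" test pair.
img = np.arange(27.0).reshape(3, 3, 3)
undeformed = warp(img, np.zeros((3, 3, 3, 3)))
```

Applying the 18 generated DVFs plus the zero field in this way yields the 19 test pairs per case described above.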
dc.language.iso: en
dc.title: Validating CBCT to CT registration QC using an AI generated dataset
dc.type: Meetings and Proceedings
dc.contributor.department: Mirada Medical Ltd, Science, Oxford, United Kingdom
dc.identifier.journal: Radiotherapy and Oncology

