Interactive generation of digital anthropomorphic phantoms from XCAT shape priors


Journal Article

In SPECT imaging, patient respiratory and body motion can cause artifacts that degrade image quality. Developing and evaluating motion correction algorithms is facilitated by simulation studies in which a numerical phantom and its motion are precisely known, and from which image data can be produced. Previous techniques for testing motion correction methods generated XCAT phantoms modeled from MRI studies and motion tracking, but required manually segmenting the major structures within the whole upper torso, which can take 8 hours to perform. Additionally, segmenting two-dimensional MRI slices and interpolating them into three-dimensional shapes can introduce appreciable interpolation artifacts, and identifying the regions to be segmented within each slice requires expert knowledge of human anatomy. We propose a new method that greatly reduces the manual segmentation time for the upper torso. Our interactive method requires that a user provide only an approximate alignment of the base anatomical shapes from the XCAT model with the MRI data. Organ boundaries from the aligned XCAT models are warped with displacement fields generated by registering a baseline MR image to MR images acquired during pre-determined motions, which amounts to automated segmentation of each organ of interest. We show that with our method the quality of segmentation is equal to that of expert manual segmentation, does not require a user who is an expert in anatomy, and can be completed in minutes rather than hours. In some instances, due to interpolation artifacts, our method can generate higher quality models than manual segmentation. © 2012 SPIE.
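The core operation the abstract describes, moving XCAT organ-boundary points with a displacement field obtained from MR-to-MR registration, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names (`trilinear_sample`, `warp_points`) are hypothetical, and it assumes the displacement field is a dense voxel grid of 3-vectors expressed in the same voxel coordinates as the boundary points.

```python
import numpy as np

def trilinear_sample(field, pts):
    """Sample a (X, Y, Z, 3) vector field at continuous voxel coords pts (N, 3)."""
    shape = np.array(field.shape[:3])
    p = np.clip(pts, 0, shape - 1 - 1e-9)   # keep samples inside the grid
    i0 = np.floor(p).astype(int)            # lower corner of the enclosing cell
    i1 = np.minimum(i0 + 1, shape - 1)      # upper corner
    f = p - i0                              # fractional position within the cell
    out = np.zeros((len(pts), field.shape[-1]))
    # Blend the 8 surrounding corner vectors with trilinear weights.
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                ix = i1[:, 0] if dx else i0[:, 0]
                iy = i1[:, 1] if dy else i0[:, 1]
                iz = i1[:, 2] if dz else i0[:, 2]
                w = ((f[:, 0] if dx else 1 - f[:, 0]) *
                     (f[:, 1] if dy else 1 - f[:, 1]) *
                     (f[:, 2] if dz else 1 - f[:, 2]))
                out += w[:, None] * field[ix, iy, iz]
    return out

def warp_points(points, displacement):
    """Move boundary points by the displacement sampled at their locations."""
    return points + trilinear_sample(displacement, points)

# Toy example with a hypothetical field: a uniform +2-voxel shift along x.
disp = np.zeros((4, 4, 4, 3))
disp[..., 0] = 2.0
pts = np.array([[1.0, 1.0, 1.0]])
print(warp_points(pts, disp))  # -> [[3. 1. 1.]]
```

In the method described above, this warping would be applied to every organ boundary of the aligned XCAT model, with a separate displacement field per pre-determined motion state.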

Cited Authors

  • Lindsay, C; Gennert, MA; Connolly, CM; Konik, A; Dasari, PK; Segars, WP; King, MA

Published Date

  • May 14, 2012

Volume / Issue

  • 8317 /

International Standard Serial Number (ISSN)

  • 1605-7422

Digital Object Identifier (DOI)

  • 10.1117/12.911275

Citation Source

  • Scopus