Abstract

The use of additional spectral filtration for dual-energy (DE) imaging with a dual-source CT (DSCT) system was investigated, and its effect on the material-specific DE ratio was evaluated for several clinically relevant materials. The x-ray spectra, data acquisition, and reconstruction processes of a DSCT system (Siemens Definition) were simulated using information provided by the system manufacturer, resulting in virtual DE images. The factory-installed filtration for the 80 kV spectrum was left unchanged to avoid any further reduction in tube output, and only the filtration for the high-energy spectrum was modified. Only practical single-element filter materials within the atomic number range 40 ≤ Z ≤ 83 were evaluated, with the aim of maximizing the separation between the two spectra while maintaining similar noise levels for high- and low-energy images acquired at the same tube current. Two measures of spectral separation were evaluated: the difference between the mean energies of the two spectra, and the ratio of the 140 kV to 80 kV detector signals, each integrated below 80 keV. The simulations were performed for three attenuation scenarios: head, body, and large body. The large body scenario was also evaluated for the DE acquisition mode using the 100 and 140 kV spectra. The DE ratios for calcium hydroxyapatite (simulating bone or calcifications), iodine, and iron were determined from CT images simulated using the modified and the factory-installed filtration. Several filter materials were found to perform well at appropriate thicknesses, with tin being a good practical choice. When image noise was matched between the low- and high-energy images, the difference in mean absorbed energy between the two spectra with tin filtration increased from 25.7 to 42.7 keV (head), from 28.6 to 44.1 keV (body), and from 20.2 to 30.2 keV (large body). The overlap of the signal spectra for energies below 80 keV was reduced from 78% to 31% (head), from 93% to 27% (body), and from 106% to 79% (large body). For the body attenuation scenario, the DE ratio increased from 1.45 to 1.91 (calcium), from 1.84 to 3.39 (iodine), and from 1.73 to 2.93 (iron) with the additional tin filtration compared to the factory filtration. The addition of filtration to one of the x-ray tubes in dual-source DECT thus dramatically increased the difference between material-specific DE ratios, e.g., from 0.39 to 1.48 for calcium versus iodine and from 0.28 to 1.02 for calcium versus iron. Because the ability to discriminate between different materials in DE imaging depends primarily on the difference in their DE ratios, this increase is expected to improve the performance of any material-specific DECT imaging task. Furthermore, for the large patient size and in conjunction with a 100/140 kV acquisition, the additional filtration decreased noise in the low-energy images and increased contrast in the DE image relative to that obtained with an 80/140 kV acquisition and no additional filtration.
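
The separation metrics and the DE ratio referenced above can be expressed compactly. The following Python sketch uses hypothetical placeholder spectra and ROI values (not the manufacturer-supplied simulation framework used in the study) to illustrate how the mean-energy difference, the sub-80 keV signal overlap, and a material's DE ratio, taken here as the ratio of its low- to high-energy CT numbers, might be computed.

import numpy as np

# Hypothetical detector-signal spectra (illustrative placeholders only).
# energy is the photon-energy grid in keV; S_low stands in for the 80 kV
# signal spectrum and S_high for the additionally filtered 140 kV spectrum.
energy = np.arange(10.0, 141.0, 1.0)
S_low = np.exp(-0.5 * ((energy - 50.0) / 12.0) ** 2)
S_low[energy > 80.0] = 0.0                    # no photons above the 80 kV endpoint
S_high = np.exp(-0.5 * ((energy - 90.0) / 20.0) ** 2)

def mean_energy(e, s):
    # Signal-weighted mean energy of a spectrum.
    return np.sum(e * s) / np.sum(s)

# Spectral separation: difference between the mean energies of the two spectra.
delta_E = mean_energy(energy, S_high) - mean_energy(energy, S_low)

# Overlap metric: ratio of the 140 kV to 80 kV detector signals,
# each integrated below 80 keV, expressed as a percentage.
below_80 = energy < 80.0
overlap_percent = 100.0 * np.sum(S_high[below_80]) / np.sum(S_low[below_80])

# Material-specific dual-energy ratio, taken here as the ratio of a material's
# CT numbers in the low- and high-energy images (hypothetical ROI values).
hu_low, hu_high = 310.0, 170.0
de_ratio = hu_low / hu_high

print(f"Mean-energy difference: {delta_E:.1f} keV")
print(f"Signal overlap below 80 keV: {overlap_percent:.0f}%")
print(f"DE ratio: {de_ratio:.2f}")

In practice the two spectra would come from the simulated DSCT acquisition chain (tube spectrum, added filter, and patient attenuation), and the DE ratio would be measured from regions of interest in the reconstructed low- and high-energy images.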

Department(s)

Nuclear Engineering and Radiation Science

International Standard Serial Number (ISSN)

0094-2405

Document Type

Article - Journal

Document Version

Final Version

File Type

text

Language(s)

English

Rights

© 2009 American Association of Physicists in Medicine, All rights reserved.

Publication Date

20 Feb 2009
