Publications

by Keyword: Monte Carlo



Bueno, M., Paganetti, H., Duch, M. A., Schuemann, J., (2013). An algorithm to assess the need for clinical Monte Carlo dose calculation for small proton therapy fields based on quantification of tissue heterogeneity. Medical Physics, 40(8), 081704

Purpose: In proton therapy, complex density heterogeneities within the beam path pose a challenge to dose calculation algorithms. This calls into question the reliability of dose distributions predicted by treatment planning systems based on analytical dose calculation. For cases in which substantial dose errors are expected, resorting to Monte Carlo dose calculations might be essential to ensure a successful treatment outcome, so that the benefit justifies the presumably long computation time. The aim of this study was to define an indicator for the accuracy of dose delivery based on analytical dose calculations in treatment planning systems for small proton therapy fields, in order to identify those patients for whom Monte Carlo dose calculation is warranted. Methods: Fourteen patients treated at our facility with small passively scattered proton beams (aperture diameters below 7 cm) were selected. Plans were generated in the XiO treatment planning system in combination with a pencil beam algorithm developed at the Massachusetts General Hospital and compared to Monte Carlo dose calculations. Differences in the dose to 50% of the gross tumor volume (D50, GTV) were assessed on a field-by-field basis. A simple and fast methodology was developed to quantify the inhomogeneity of the tissue traversed by a single small proton beam using a heterogeneity index (HI) - a concept presented by Pflugfelder et al. [Med. Phys. 34, 1506-1513 (2007); doi: 10.1118/1.2710329] for scanned proton beams. Finally, the potential correlation between the error made by the pencil beam based treatment planning algorithm for each field and the level of tissue heterogeneity traversed by the proton beam, as given by the HI, was evaluated. Results: Discrepancies of up to 5.4% were found in D50 for single fields, although dose differences were within clinical tolerance levels (<3%) when combining all of the fields involved in the treatment. The discrepancies found for each field exhibited a strong correlation with their associated HI values (Spearman's ρ = 0.8, p < 0.0001): the higher the level of tissue inhomogeneity for a particular field, the larger the error made by the analytical algorithm. With this correlation established, a threshold on HI can be set by choosing a tolerance level of 2-3%, as commonly accepted in radiotherapy. Conclusions: The HI is a good indicator of the accuracy of proton field delivery in terms of GTV prescription dose coverage when small fields are delivered. Each HI value was obtained from the CT image in less than 3 min on a computer with a 2 GHz CPU, allowing implementation of this methodology in routine clinical practice. For HI values exceeding the threshold, either a change in beam direction (if feasible) or a recalculation of the dose with Monte Carlo would be highly recommended.
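The general idea behind a heterogeneity index can be illustrated with a toy sketch. This is a hypothetical simplification, not the authors' (or Pflugfelder's) actual metric: for each depth step along the beam axis, quantify the lateral spread of water-equivalent density across the field, then aggregate over depth, so that a homogeneous phantom yields HI = 0.

```python
import numpy as np

def heterogeneity_index(densities):
    """Toy heterogeneity index for a single proton field.

    densities: 2D array (depth_steps x lateral_positions) of relative
    (water-equivalent) densities sampled along the beam path from CT.
    Returns the mean lateral standard deviation over depth, so a
    homogeneous medium gives exactly 0 and larger values indicate
    stronger lateral tissue heterogeneity.
    """
    lateral_spread = np.std(densities, axis=1)  # spread at each depth step
    return float(np.mean(lateral_spread))

# Homogeneous water phantom: no lateral heterogeneity
water = np.ones((50, 20))

# Half the field traverses a bone-like insert (relative density ~1.6)
mixed = np.ones((50, 20))
mixed[:, 10:] = 1.6
```

In this sketch a field whose lateral profile crosses a bone/soft-tissue interface scores higher than one passing through uniform tissue, which is the behavior the HI-vs-dose-error correlation in the abstract relies on.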

Keywords: Heterogeneities, Heterogeneity index, Monte Carlo, Proton therapy, Small fields


Crespo, C., Gallego, J., Cot, A., Falcón, C., Bullich, S., Pareto, D., Aguiar, P., Sempau, J., Lomeña, F., Calviño, F., Pavía, J., Ros, D., (2008). Quantification of dopaminergic neurotransmission SPECT studies with 123I-labelled radioligands. A comparison between different imaging systems and data acquisition protocols using Monte Carlo simulation. European Journal of Nuclear Medicine and Molecular Imaging, 35(7), 1334-1342

Purpose: 123I-labelled radioligands are commonly used for single-photon emission computed tomography (SPECT) imaging of the dopaminergic system to study dopamine transporter binding. The aim of this work was to compare the quantitative capabilities of two different SPECT systems through Monte Carlo (MC) simulation. Methods: The SimSET MC code was employed to generate simulated projections of a numerical phantom for two gamma cameras equipped with a parallel and a fan-beam collimator, respectively. A fully 3D iterative reconstruction algorithm was used to compensate for attenuation, the spatially variant point spread function (PSF) and scatter. A post-reconstruction partial volume effect (PVE) compensation was also developed. Results: For both systems, correcting for all degradations together with PVE compensation resulted in recovery factors of the theoretical specific uptake ratio (SUR) close to 100%. For a SUR value of 4, the recovered SUR for the parallel imaging system was 33% for a reconstruction without corrections (OSEM), 45% for a reconstruction with attenuation correction (OSEM-A), 56% for a 3D reconstruction with attenuation and PSF corrections (OSEM-AP), 68% for OSEM-AP with scatter correction (OSEM-APS) and 97% for OSEM-APS plus PVE compensation (OSEM-APSV). For the fan-beam imaging system, the recovered SUR was 41% without corrections, 55% for OSEM-A, 65% for OSEM-AP, 75% for OSEM-APS and 102% for OSEM-APSV. Conclusion: Our findings indicate that correcting for degradations increases quantification accuracy, with PVE compensation playing a major role in SUR quantification. The proposed methodology allows us to reach similar SUR values for different SPECT systems, thereby allowing reliable standardisation in multicentre studies.
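The SUR bookkeeping in the abstract can be made concrete with a short sketch. The formula below is the commonly used definition of the specific uptake ratio from region-of-interest (ROI) mean counts, and the recovery factor is the ratio of measured to theoretical SUR; neither is spelled out in the abstract, so treat both as illustrative assumptions rather than the authors' exact implementation:

```python
def specific_uptake_ratio(specific_counts, reference_counts):
    """SUR = (C_specific - C_reference) / C_reference.

    specific_counts:  mean counts per voxel in the specific-binding ROI
                      (e.g. striatum for dopamine transporter imaging).
    reference_counts: mean counts per voxel in a non-specific reference
                      region (e.g. occipital cortex).
    """
    return (specific_counts - reference_counts) / reference_counts

def recovery_percent(measured_sur, true_sur):
    """Fraction of the theoretical SUR recovered after reconstruction, in %."""
    return 100.0 * measured_sur / true_sur
```

With a theoretical SUR of 4, the abstract's 33% recovery for uncorrected OSEM on the parallel system corresponds to a measured SUR of about 1.3, rising to roughly 3.9 (97%) once attenuation, PSF, scatter and PVE are all compensated.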

Keywords: Brain SPECT, Monte Carlo methods, Receptor imaging, Reconstruction quantification, SPECT instrumentation and algorithms