1
Radiotherapy treatment planning with Monte Carlo on a distributed system
Stéphane Chauvie
IRCC & Mauriziano Hospital & INFN & S. Croce e Carle Hospital, Turin, Italy
Genova, 8 March 2004
Ladies and gentlemen, good afternoon.
2
Contents (Grant 2002-03/645)
- Radiotherapy treatment planning
- Analytical algorithms for dose calculation
- Monte Carlo methods
- Cluster set-up
- Monte Carlo parallelisation
- Data analysis and comparison with experimental measurements: open field and IM field
- Head and neck tumor with IMRT
In this presentation I will first go through some fundamentals of radiotherapy treatment planning, and then introduce the dose calculation algorithms, both analytical and Monte Carlo. I will then explain how a cluster was set up to speed up calculation time and how the Monte Carlo code was parallelised. Next I will show the comparison between simulation and experimental measurements for open and IM fields. Finally, I will show the dose distribution obtained in a complex case: a head and neck tumor treated with IMRT.
3
Radiotherapy Oncology
CTV, PTV, spinal cord, spinal cord PRV, parotid. Deliver a high dose to the target volume; spare the surrounding healthy tissues; allow local control of the tumor; avoid side-effects. The goal of modern radiotherapy is to deliver as high a dose as possible to the target volume while sparing the surrounding healthy tissues. This is often not trivial: consider, for example, this chordoma case in which the tumor is wrapped around the spinal cord and the parotid is embedded in the target volume.
4
3D-CRT vs IMRT. Critical points: high dose gradients, strongly inhomogeneous areas. 3D-CRT and IMRT are used in complex anatomical regions. This tumor was treated with an evolved 3D-CRT technique called IMRT. With this dynamic technique it was possible to deliver high dose gradients and paint the dose around the target volume. The critical question is: how accurate is the dose calculation? IM field.
5
Dose calculation algorithms
Pencil beam and convolution/superposition: quick but inaccurate, expensive. Monte Carlo: accurate but very slow, cheap (free). Dose determination accuracy (Ahnesjö 1999): total without dose calculation 4.1%; total with dose calculation from 4.2% (1% calculation error) to 6.5% (5% calculation error).
The overall uncertainty in dose estimation inside the patient depends on: experimental measurements at the reference point, beam stability and flatness, patient data set uncertainties and set-up errors. If we add dose calculation errors, it goes from 4 to 7%. The most widely used analytical methods are pencil beam and convolution/superposition. Treatment planning systems that implement such algorithms are expensive and not very accurate, but quick, and can therefore be used for on-line dose evaluation. Monte Carlo methods have been shown in recent years to be much more accurate and cheap (sometimes free). They have not been used in clinical practice since they require hours or even days to provide statistically significant results. Since Monte Carlo calculations are intrinsically parallel, their natural implementation is on parallel machines, which are generally very efficient but expensive. (Measurements at the reference point, beam stability and flatness, CT data, set-up.)
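As a check on the Ahnesjö figures, here is a minimal worked combination; it assumes the dose calculation error adds in quadrature to the other contributions (about 4.1%), an assumption not stated explicitly on the slide but one that reproduces the quoted totals:

```latex
% Assumption: contributions combine in quadrature.
\sigma_{\mathrm{tot}} = \sqrt{\sigma_{\mathrm{other}}^{2} + \sigma_{\mathrm{calc}}^{2}}
\qquad
\sqrt{4.1^{2} + 1^{2}} \approx 4.2\% ,
\qquad
\sqrt{4.1^{2} + 5^{2}} \approx 6.5\%
```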
6
Beowulf cluster parallelisation
PC + Ethernet + Linux = Beowulf. Thesis: high-performance networks of PCs are now a realistic alternative, since they offer parallel processing of Monte Carlo at a lower cost with competitive performance. Since we are convinced by the parallelisation of Monte Carlo, we want to go ahead in this direction and attempt the use of a Beowulf cluster. The idea of using an array of ordinary PCs as an alternative to supercomputers was born in the 90s at NASA's GSFC. The idea is simple and, with modern tools, affordable. First you take a series of N PCs, like the one on your desktop, and connect them with fast and cheap Ethernet cards and switches. Second, you install the Linux OS on them. The alchemy of these two components is the so-called Beowulf cluster. This is the thesis we want to demonstrate, using an inductive approach.
7
Cluster set-up: hardware installation, software configuration, benchmarking; Monte Carlo simulation; Monte Carlo parallelisation; run.
We built a prototype Beowulf cluster consisting of a few nodes. In this talk I will go briefly through the hardware installation, software configuration and benchmarking of the cluster parallelisation. At the same time the Monte Carlo simulation was set up. When the two parts were ready we parallelised the Monte Carlo code and then ran it on the cluster.
8
Installation, configuration & benchmarking
BIOS, OS, disk configuration, partitions, RAID, memory, CPU, compilers, linking models. Master and Node02-Node08 connected through a switch; the master is also connected to the hospital LAN (H-LAN). Parallelisation: LAM-MPI. Security: SSH.
Each off-the-shelf PC is a good compromise between cost and computing power. The requisites for a slave are a CPU, memory and a fast Ethernet card. The master is more expensive since it is equipped to be the interface of the cluster with the outside world, and hence is the only one with a monitor, CD-ROM and so on. The PCs have been networked on a private LAN through a switching hub, while the master, which has two Ethernet cards, is connected to the hospital LAN in order to exchange data and images with the TPS and CTs. We installed Red Hat Linux, which provides a collection of libraries for parallelisation in a multiprocessor environment. We use the LAM/MPI parallel libraries, which run a parallelisation daemon on the cluster over secure shell (SSH) connections.
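To illustrate how such a LAM/MPI set-up distributes work across the nodes, here is a minimal sketch, not the authors' actual code; the history count and the scored quantity are placeholders assumed for the example.

```cpp
// Minimal sketch (not the authors' code) of distributing Monte Carlo histories
// across cluster nodes; LAM-MPI exposes the standard MPI API used below.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);

    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   // this node's index
    MPI_Comm_size(MPI_COMM_WORLD, &size);   // number of nodes in the run

    const long totalHistories = 2000000L;             // hypothetical total
    const long myHistories = totalHistories / size;   // share for this node

    // ... each node would run its share of histories here ...
    double localDose = 0.0;   // placeholder for this node's scored dose

    // Combine the partial results on the master (rank 0).
    double totalDose = 0.0;
    MPI_Reduce(&localDose, &totalDose, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        std::printf("nodes=%d  histories/node=%ld  dose=%g\n", size, myHistories, totalDose);

    MPI_Finalize();
    return 0;
}
```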
9
Installation, configuration & benchmarking
Speed-up = Tser/Tpar = 3.99; efficiency = speed-up / Nprocessors = 0.997.
We were then ready to run simple examples on the cluster, both to verify its correct behaviour and to prove its scalability. In the graph you can see the execution time versus the number of nodes. The efficiency is very close to 1, which proves the scalability. We are now ready; let's have a look at what is happening with the Monte Carlo simulation.
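Written out, the two benchmark figures quoted on the slide are related as follows; the values imply four processors for this particular test (3.99 / 0.997 = 4):

```latex
S_{\mathrm{up}} = \frac{T_{\mathrm{serial}}}{T_{\mathrm{parallel}}} = 3.99,
\qquad
E = \frac{S_{\mathrm{up}}}{N_{\mathrm{processors}}} = \frac{3.99}{4} \approx 0.997
```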
10
Simulation: geometry. 6 MV Varian 600C/D, Millennium 120-leaf MLC.
As regards geometry, we carried out the simulation of a Varian 600C/D linear accelerator equipped with a 120-leaf MLC. The electrons impinge on the target, where bremsstrahlung photons arise; these interact with the flattening filter, surrounded by the primary collimator. Below the monitor chambers are mounted the jaws, which can move to shape fields during the run. Then comes the MLC, in which every single leaf can be moved and positioned according to the MLC files of the machine.
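For concreteness, a hedged sketch of how a single MLC leaf might be placed in a Geant4 geometry; the dimensions, names and placement scheme are illustrative assumptions, not the treatment-head model actually used in the talk.

```cpp
// Illustrative Geant4 fragment: one tungsten MLC leaf placed in a mother volume.
// Half-sizes and naming are assumptions, not the real Millennium leaf profile.
#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4NistManager.hh"
#include "G4ThreeVector.hh"
#include "G4SystemOfUnits.hh"

G4VPhysicalVolume* PlaceLeaf(G4LogicalVolume* motherLV, G4double leafOpening, G4int copyNo)
{
    G4Material* tungsten = G4NistManager::Instance()->FindOrBuildMaterial("G4_W");
    G4Box* leafSolid = new G4Box("Leaf", 3.0*cm, 0.25*cm, 3.0*cm);
    G4LogicalVolume* leafLV = new G4LogicalVolume(leafSolid, tungsten, "LeafLV");

    // The leaf position along x is driven by the machine's MLC file for each
    // segment, so the placement can be updated between runs (or per segment).
    return new G4PVPlacement(nullptr,
                             G4ThreeVector(leafOpening, copyNo * 0.5*cm, 0.),
                             leafLV, "LeafPV", motherLV, false, copyNo);
}
```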
11
Simulation: physics. Processes per particle (no tuning, no cuts):
- photons: photoelectric effect, Compton scattering, Rayleigh scattering, gamma conversion (e+e- pair production)
- e-: multiple scattering, bremsstrahlung, ionisation
- e+: multiple scattering, bremsstrahlung, ionisation, annihilation
As regards physics, Geant4 can model all PDG particles and ions for both electromagnetic and hadronic interactions. For our simulation we use e-, e+ and photons with the processes listed above. Geant4 has only production thresholds, no tracking cuts: all particles are tracked down to zero range; energy, time-of-flight and other cuts can be defined by the user. No tuning, no cuts.
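A hedged sketch of how these processes would be registered in a Geant4 physics list; the class names are those of a recent Geant4 release and the structure is illustrative, not the original 2004 code used for the talk.

```cpp
// Illustrative physics list registering the processes named on the slide
// (assumption: modern Geant4 class names, not necessarily the 2004 ones).
#include "G4VUserPhysicsList.hh"
#include "G4PhysicsListHelper.hh"
#include "G4Gamma.hh"
#include "G4Electron.hh"
#include "G4Positron.hh"
#include "G4PhotoElectricEffect.hh"
#include "G4ComptonScattering.hh"
#include "G4RayleighScattering.hh"
#include "G4GammaConversion.hh"
#include "G4eMultipleScattering.hh"
#include "G4eIonisation.hh"
#include "G4eBremsstrahlung.hh"
#include "G4eplusAnnihilation.hh"

class MyPhysicsList : public G4VUserPhysicsList {
  public:
    void ConstructParticle() override {
        G4Gamma::GammaDefinition();
        G4Electron::ElectronDefinition();
        G4Positron::PositronDefinition();
    }
    void ConstructProcess() override;
};

void MyPhysicsList::ConstructProcess()
{
    AddTransportation();
    auto* ph = G4PhysicsListHelper::GetPhysicsListHelper();

    // Photons: photoelectric, Compton, Rayleigh, gamma conversion (e+e- pairs).
    ph->RegisterProcess(new G4PhotoElectricEffect(), G4Gamma::Gamma());
    ph->RegisterProcess(new G4ComptonScattering(),   G4Gamma::Gamma());
    ph->RegisterProcess(new G4RayleighScattering(),  G4Gamma::Gamma());
    ph->RegisterProcess(new G4GammaConversion(),     G4Gamma::Gamma());

    // Electrons: multiple scattering, ionisation, bremsstrahlung.
    ph->RegisterProcess(new G4eMultipleScattering(), G4Electron::Electron());
    ph->RegisterProcess(new G4eIonisation(),         G4Electron::Electron());
    ph->RegisterProcess(new G4eBremsstrahlung(),     G4Electron::Electron());

    // Positrons: as electrons, plus annihilation.
    ph->RegisterProcess(new G4eMultipleScattering(), G4Positron::Positron());
    ph->RegisterProcess(new G4eIonisation(),         G4Positron::Positron());
    ph->RegisterProcess(new G4eBremsstrahlung(),     G4Positron::Positron());
    ph->RegisterProcess(new G4eplusAnnihilation(),   G4Positron::Positron());
}
```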
12
Patient model: DICOM interface. Soft tissue: CT number-tissue relationship (ICRU). Bone: CT-electron density linearity, dilution of cortical bone in bone marrow. Lung: CT-density linearity.
To model the patient we use the CT data sets as input. The tissue modelling is a parameterised, CT-based method. For soft tissue we use the CT number-tissue relationship used in ICRU. For bone, an experimental relationship between CT number and electron density has been shown if we describe the bone as a dilution of cortical bone into bone marrow. As regards lung, we observed a linear relationship between CT number and lung density during breathing.
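A hedged sketch of such a parameterised CT-based tissue assignment; the thresholds, material labels and density formulas below are illustrative assumptions, not the parameterisation actually used in the talk.

```cpp
// Illustrative CT-number-to-tissue mapping: thresholds and densities are
// assumptions chosen only to show the structure of the parameterisation.
#include <string>

struct Tissue {
    std::string material;   // label of the simulation material to use
    double density;         // mass density in g/cm^3
};

Tissue TissueFromHU(double hu)
{
    if (hu < -400.0) {
        // Lung: density assumed linear in the CT number ("CT-density linearity").
        double rho = 1.0 + hu / 1000.0;
        return { "Lung", rho > 0.05 ? rho : 0.05 };
    }
    if (hu < 100.0) {
        // Soft tissue: ICRU-style CT-number/tissue relationship.
        return { "SoftTissue", 1.0 + hu / 1000.0 };
    }
    // Bone: modelled as a dilution of cortical bone in bone marrow,
    // with density rising linearly with the CT number.
    double rho = 1.1 + (hu - 100.0) * 0.0007;
    return { "Bone", rho };
}
```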
13
Monte Carlo Parallelization
Take care of the PRNG. IM simple field in homogeneous phantom. Phase space data. Water measurements. IM patient field in homogeneous phantom. Anthropomorphic phantom measurements. Simulation inside the patient. IMRT treatment.
Now we parallelised the Monte Carlo code. An important point is that we must ensure that the random numbers are different on every node of the cluster. To verify the simulation results we analysed phase space data below the jaws and then simulated dose deposition inside water phantoms and an anthropomorphic phantom. We then simulated an open field inside the patient. At the same time we verified our ability to run a dynamic simulation of simple and patient IM fields on phantoms. We were then ready to simulate an entire IMRT treatment on the patient.
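One way to give each node its own random-number stream is sketched below; the seeding scheme is an assumption for illustration, not the one actually used in the work.

```cpp
// Illustrative per-node seeding: each MPI rank gets a distinct seed so the
// cluster nodes do not reproduce the same random-number sequence.
#include <mpi.h>
#include "CLHEP/Random/Randomize.h"

void SeedThisNode(long baseSeed)
{
    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    // Widely spaced seeds per node; for stronger guarantees one would use
    // truly independent streams or a skip-ahead scheme.
    long nodeSeed = baseSeed + 100003L * static_cast<long>(rank);
    CLHEP::HepRandom::setTheSeed(nodeSeed);
}
```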
14
PSD: Phase Space Data, (x,y,z) and (px,py,pz).
To speed up calculation time we divided the simulation into two phases. In the first we simulated open square fields of 2, 10, 20 and 40 cm side and stored the kinematics of the particles below the jaws into a PSD file. This is the input for the second, patient-specific part of the simulation. Calculation times for the PSD fields are quite long: as an example, we used 2 billion primary electrons to obtain 15 million points in the phase space for a 10x10 cm2 field, with a total simulation time of 46 hours. We can note that the efficiency is close to 1, which again proves the scalability even for the complete simulation.
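A hedged sketch of what one phase-space record might look like; the field layout and binary format are assumptions made for illustration, not the authors' actual PSD file format.

```cpp
// Illustrative phase-space record: one entry per particle crossing the scoring
// plane below the jaws, written to a simple binary file.
#include <cstdint>
#include <cstdio>
#include <vector>

struct PhaseSpaceRecord {
    std::int8_t particleType;   // e.g. 0 = photon, 1 = e-, 2 = e+
    float x, y, z;              // position on the scoring plane (cm)
    float px, py, pz;           // momentum components (MeV/c)
    float weight;               // statistical weight of the history
};

// Append a batch of records to a PSD file (format is an assumption).
void WritePSD(const char* path, const std::vector<PhaseSpaceRecord>& records)
{
    if (std::FILE* f = std::fopen(path, "ab")) {
        std::fwrite(records.data(), sizeof(PhaseSpaceRecord), records.size(), f);
        std::fclose(f);
    }
}
```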
15
Water measurements: PDD and dose profiles in water, 10x10 and 20x20 fields. Scanner with IC15 ionisation chamber, SSD = SAD.
The simulation was then verified by comparing all the simulated 2, 10, 20 and 40 cm fields with percent depth dose curves and dose profiles in water. Here you can see an example of how both the dose maximum and the exponential tail are well reproduced.
16
Anthropomorphic phantom measurements
[Table: measured dose at several points compared with Monte Carlo, broad beam, pencil beam and convolution/superposition calculations, each with its percent difference.]
We then compared the experimental measurements with Monte Carlo and different analytical algorithms for an open field in the phantom. We can see how Monte Carlo and the measurements agree, while the analytical algorithms have errors of around 2-3%. Two larger deviations are registered in the two most demanding conditions: below the lung for the pencil beam and in the bone for the broad beam. The convolution/superposition model is otherwise more accurate. A14SL microchamber, SSD = SAD.
17
Patient simulation: CT, X = 10, Y = 10, SSD = SAD, gantry 0°.
After verifying the simulation where experimental measurements could be made, we applied the Monte Carlo to a real patient case. Here we simulated an open 10x10 cm2 field; these are the isodose curves obtained with the TPS, with the Monte Carlo ones superimposed. We can see how they differ: note the dose rise in front of the mandibular bone and the isodose deformation inside the trachea. If we track a PDD we can see clearly how the TPS does not see the inhomogeneities, leading to local errors of about 20%.
18
IMRT treatment simulation
10x10, isocentric technique, 7 fields. Per field: segments 165.4 ± 15.3; events (15.5 ± 0.5)×10^7; hits (4.02 ± 0.39)×10^5; time 0.51 ± 0.03 hours. IMRT plan evaluation in 3.5 hours on 3 nodes.
We finally simulated a complex treatment of the chordoma we have already seen, using the sliding-window technique. In this case we realised that we had created PSD files that were too small, because the MLC is often closed in front of the target, leading to a large number of unused photons (and a threefold number of MU). We are therefore now creating new ones.
19
Current "Geant4" activities in Cuneo...
10 MeV cyclotron; CT-PET.