1 A portal-based system for quality assurance of radiotherapy treatment plans using Grid-enabled High Performance Computing clusters
Ian C. Smith 1, CR Baker 2, V Panettieri 3, C Addison 1, AE Nahum 3
1 Computing Services Dept, University of Liverpool; 2 Directorate of Medical Imaging and Radiotherapy, University of Liverpool; 3 Physics Department, Clatterbridge Centre for Oncology

2 Outline
- Introduction to radiotherapy treatment planning
- University of Liverpool Grid Computing Server (GCS)
- GCS tools
- Command line job submission using the GCS
- UL-GRID Portal
- Results
- Future directions

3 Rationale
- Routine radiotherapy treatment planning is constrained by a lack of sufficiently powerful computing resources
- Monte Carlo (MC) based codes can provide accurate absorbed dose calculations but are computationally demanding (a single simulation can take 3 weeks on a desktop machine)
- Fortunately, MC methods are inherently parallel: they can run on HPC resources and (for some codes) HTC resources (a minimal sketch follows this list)
- So far we have looked at running simulations on local and centrally funded HPC clusters in a user-friendly manner
- Starting to look at using Condor pools
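The parallelism here is of the simplest kind: each particle history is independent, so one long run can be split into N shorter runs with different random seeds and the partial results combined afterwards. Below is an illustrative bash sketch of the idea only; run_sim and its flags are hypothetical stand-ins, not GCS or PENELOPE commands.

    # Illustrative only: split one long MC run into N independent jobs.
    # "run_sim" and its flags are hypothetical, not real tools.
    N=16
    for i in $(seq 1 $N); do
        # a distinct seed per job keeps the runs statistically independent
        run_sim --seed $((1000 + i)) --histories 1000000 --out part_$i.dat &
    done
    wait   # partial results in part_*.dat are then combined in post-processing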

4 Radiotherapy codes
- Two MC codes have been investigated to date:
- MCNPX (beta v2.7a)
  - general purpose transport code, tracks nearly all particles at nearly all energies (https://mcnpx.lanl.gov/)
  - parallel (MPI-based) code, only runs on clusters (see the launch sketch after this list)
  - self-contained: no need for pre- and post-processing steps
- PENELOPE
  - general purpose MC code implemented as a set of FORTRAN routines
  - coupled electron-photon transport from 50 eV to 1 GeV in arbitrary materials and complex geometries [1]
  - serial implementation, will run on clusters and Condor pools
  - needs pre- and post-processing to set up input files and combine partial results
- Starting to look at EGSnrc / BEAMnrc / DOSXYZnrc

[1] Salvat F, Fernández-Varea JM, Sempau J. PENELOPE, a code system for Monte Carlo simulation of electron and photon transport. Issy-les-Moulineaux, France: OECD Nuclear Energy Agency; 2008. ISBN 9264023011. Available in PDF format at: http://www.nea.fr
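For illustration, an MPI-based code such as MCNPX would typically be launched under mpirun across a cluster allocation. This is a hedged sketch only: the executable name (mcnpx.mpi) and the i=/o= input keywords are assumptions about a local installation, not taken from the slides.

    # Hedged sketch: run an MPI build of MCNPX on 64 cores.
    # Executable name and input keywords are assumed; check the local install.
    mpirun -np 64 mcnpx.mpi i=treatment.inp o=treatment.out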

5 Simulation of an electron treatment: from the treatment head to the patient (taken from Cygler et al.) Courtesy of Prof. A. Nahum (CCO)

6 Grid Computing Server / UL-GRID Portal

7 Grid Computing Server / UL-GRID software stack

8 Grid Computing Server tools
- single sign-on to resources via MyProxy: use ulg-get-proxy (proxies are automatically renewed)
- job management is very similar to local batch systems such as SGE: ulg-qsub, ulg-qstat, ulg-qdel etc. (an illustrative session follows this list)
- support for submitting large numbers of jobs, file staging and pre- and post-processing
- the job submission process is the same for all compute clusters (local or external)
- utility tools 1 provide simple Grid-based extensions to standard UNIX commands: ulg-cp, ulg-ls, ulg-rm, ulg-mkdir etc.
- status commands available, e.g. ulg-status, ulg-rqstat

1 based on GROWL scripts from STFC Daresbury
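An illustrative session using the tools listed above. The commands are those named on the slide, but the file and host names are hypothetical and the exact argument forms of ulg-cp and ulg-qdel are assumptions:

    ulg-get-proxy                     # single sign-on: obtain a renewable proxy
    ulg-cp penmain.in ulgbc2:run1/    # stage a file to a cluster (syntax assumed)
    ulg-qsub my_job                   # submit the job described in file "my_job"
    ulg-qstat                         # check job states across clusters
    ulg-qdel 125042                   # cancel a job by ID (usage assumed)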

9 PENELOPE (serial code) workflows  Rereasdasdas create random seeds for N input files using clonEasy[1] combine N individual phase-space files compute individual phase-space file create random seeds for N input files using clonEasy[1] stage-in phase-space file (only if necessary) compute partial treatment simulation results combine partial treatment simulation results using clonEasy[1] repeat for other patients phase-space file calculation patient treatment simulation Portal HPC cluster [1] Badal A and Sempau J 2006 A package of Linux scripts for the parallelization of Monte Carlo simulations Comput.Phys. Commun. 175 440–50

10 GCS job description files for PENELOPE (1)

#
# create phase space file (PSF)
#
job_type = remote_script
host = ulgbc2
total_jobs = 16
name = penelopeLPO
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain_acc6_LPO35_.in 16
indexed_input_files = penmain_acc6_LPO35_.in
input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_INDEX.in penmain_LPO35_INDEX.out
log = mylogfile

11 GCS job description files for PENELOPE (2)

#
# perform patient simulation using previously calculated phase space file (PSF)
#
job_type = remote_script
host = ulgbc2
name = penelope
total_jobs = 10
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain.in 10
staged_input_files = PSF_test.psf
input_remote_stage_dir = staging
input_files = water_phantom.geo, water.mat
indexed_input_files = penmain.in
executable = /usr/local/bin/run_penmain
arguments = penmainINDEX.in penmainINDEX.out ics
log = penelope.log

12 Condor job files for PENELOPE

# job description file
grid_resource = gt2 ulgbc2.liv.ac.uk/jobmanager-condorg
universe = grid
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_$(PROCESS).in penmain_LPO35_$(PROCESS).out ics_test
+ulg_job_name = penelopeLPO
log = log
transfer_input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat, penmain_acc6_LPO35_$(PROCESS).in
transfer_files = always
transfer_executable = FALSE
GlobusRSL = (count=1)(job_type=remote_script) \
    (input_working_directory=/condor_data/smithic/penelope/big_test/create_psf) \
    (job_name=penelopeLPO)
notification = never
queue 16

# DAG file
JOB pre_process dummy1.sub
JOB staging penelopeLPO35.sub
SCRIPT PRE pre_process /opt1/ulgrid/apps/penelope/seed_input_files
PARENT pre_process CHILD staging
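If the DAG file above were saved as, say, penelope.dag (the file name is assumed for illustration), it would be submitted with the standard HTCondor DAGMan command, which runs the PRE script before releasing the staging job:

    # condor_submit_dag is the standard DAGMan entry point in HTCondor
    condor_submit_dag penelope.dag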

13 GCS job submission and monitoring

smithic(ulgp4)create_psf$ ulg-qsub penelopeLPO35
Grid job submitted successfully, Job ID is 125042
smithic(ulgp4)create_psf$ ulg-qstat
Job ID    Job Name     Owner     State  Cores  Host
------    --------     -----     -----  -----  ----
125015.0  penelopeLPO  smithic   pr     1      ulgbc2.liv.ac.uk
125034.0  penelope     vpanetti  r      1      ulgbc2.liv.ac.uk
125035.0  penelope     vpanetti  w      1      ulgbc2.liv.ac.uk
125038.0  penelope     smithic   si     1      ulgbc2.liv.ac.uk
125042.0  penelopeLPO  smithic   qw     1      ulgbc2.liv.ac.uk
125043.0  mcnpx3       colinb    r      64     ulgbc4.liv.ac.uk
125044.0  mcnpx3       colinb    r      64     lancs2.nw-grid.ac.uk
125044.0  gamess_test  bonarlaw  r      32     ngs.rl.ac.uk


19 Lung treatment simulated with PENELOPE and penVOX
- 7 fields
- PSF calculation: 1.5 days (14 cores), approximately 1.5 million particles
- Patient calculation: 1.5 days for all 7 fields (single core)
- Statistical uncertainty: 1% (1 sigma)

20 Proton absorbed dose in water using MCNPX
- 2.5 cm diameter beam, full energy (~60 MeV at patient, ~3.2 cm range in water)
- 500 million histories
- 0.5 x 0.5 x 5 mm voxels
- 50 keV proton cut-off
- <1% statistical uncertainty in absorbed dose in the high dose region (1 sigma)
(Plot labels: Bragg peak; half-modulation)

21 Future Directions
- Provide support for BEAM [1] and DOSXYZ [3] (based on the EGSnrc MC code [2])
- Utilise the Liverpool Windows Condor Pool for running PENELOPE jobs
- Compare with other implementations, e.g. RT-Grid

References:
[1] Rogers D W, Faddegon B, Ding G X, Ma C M, Wei J and Mackie T, "BEAM: A Monte Carlo code to simulate radiotherapy treatment units," Med. Phys. 22, 503-524 (1995).
[2] Kawrakow I and Rogers D W O, The EGSnrc Code System: Monte Carlo simulation of electron and photon transport, Technical Report PIRS-701 (4th printing), National Research Council of Canada, Ottawa, Canada, 2003.
[3] Walters B, Kawrakow I and Rogers D W O, DOSXYZnrc Users Manual, Report PIRS-794, National Research Council of Canada, Ottawa, Canada, 2007.

