A portal-based system for quality assurance of radiotherapy treatment plans using Grid-enabled High Performance Computing clusters
Ian C. Smith 1, CR Baker 2, V Panettieri 3, C Addison 1, AE Nahum 3
1 Computing Services Dept, University of Liverpool; 2 Directorate of Medical Imaging and Radiotherapy, University of Liverpool; 3 Physics Department, Clatterbridge Centre for Oncology

Outline
 Introduction to radiotherapy treatment planning
 University of Liverpool Grid Computing Server (GCS)
 GCS tools
 Command line job submission using the GCS
 UL-GRID Portal
 Results
 Future directions

Rationale
 Routine radiotherapy treatment planning is constrained by a lack of sufficiently powerful computing resources
 Monte Carlo (MC) based codes can provide accurate absorbed dose calculations but are computationally demanding (a single simulation can take 3 weeks on a desktop machine)
 Fortunately, MC methods are inherently parallel – they can run on HPC resources and (for some codes) HTC resources
 So far we have looked at running simulations on local and centrally funded HPC clusters in a user-friendly manner
 Starting to look at using Condor pools
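The inherent parallelism of MC methods can be sketched in a few lines: each job runs an independent chunk of histories with its own random seed, and the partial results are averaged afterwards. This is an illustrative toy (estimating pi by Monte Carlo), not any of the radiotherapy codes discussed here.

```python
import random

def run_chunk(seed, n_histories):
    """Run one independent chunk of a toy MC simulation (estimating pi)."""
    rng = random.Random(seed)  # each job gets its own seeded RNG
    hits = sum(1 for _ in range(n_histories)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_histories

def combine(partials):
    """Post-processing step: average the independent partial results."""
    return sum(partials) / len(partials)

# 16 independent jobs, mirroring total_jobs = 16 in the job files later on
partials = [run_chunk(seed, 50_000) for seed in range(16)]
estimate = combine(partials)
```

Because the chunks share no state, they can be farmed out to separate cluster nodes or Condor slots and combined at the end; this is the pattern the GCS workflows below automate.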

Radiotherapy codes
 Two MC codes have been investigated to date:
 MCNPX (beta v2.7a)
   general purpose transport code, tracks nearly all particles at nearly all energies
   parallel (MPI-based) code, only runs on clusters
   self-contained – no need for pre- and post-processing steps
 PENELOPE
   general purpose MC code implemented as a set of FORTRAN routines
   coupled electron-photon transport from 50 eV to 1 GeV in arbitrary materials and complex geometries [1]
   serial implementation, will run on clusters and Condor pools
   needs pre- and post-processing to set up input files and combine partial results
 Starting to look at EGSnrc / BEAMnrc / DOSXYZnrc

[1] Salvat F, Fernández-Varea JM, Sempau J. PENELOPE, a code system for Monte Carlo simulation of electron and photon transport. France: OECD Nuclear Energy Agency, Issy-les-Moulineaux.
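PENELOPE's pre-processing step amounts to stamping out N copies of an input deck, each with a distinct job index and random-number seeds. A minimal sketch of that idea follows; the deck keywords and seed line here are invented for illustration, and the real seed_input_files script used later may work quite differently.

```python
# Illustrative pre-processor sketch (not the actual seed_input_files tool):
# expand an INDEX placeholder and give each of N jobs unique seeds.
def make_job_inputs(template, n_jobs):
    jobs = []
    for index in range(1, n_jobs + 1):
        text = template.replace("INDEX", str(index))
        # hypothetical seed line; each job gets a distinct seed pair
        text = text.replace("RAND-SEED 1 1",
                            f"RAND-SEED {2 * index - 1} {2 * index}")
        jobs.append((f"penmain{index}.in", text))
    return jobs

template = "TITLE  water phantom job INDEX\nRAND-SEED 1 1\nNSIMSH 1e7\n"
inputs = make_job_inputs(template, 16)
```

Writing each (filename, text) pair to disk would produce the 16 per-job input files referenced by the job description examples below.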

Simulation of an electron treatment: from the treatment head to the patient (taken from Cygler et al) Courtesy of Prof. A. Nahum (CCO)

Grid Computing Server / UL-GRID Portal

Grid Computing Server / UL-GRID software stack

Grid Computing Server tools
 single sign-on to resources via MyProxy; use ulg-get-proxy (proxies automatically renewed)
 job management is very similar to local batch systems such as SGE: ulg-qsub, ulg-qstat, ulg-qdel etc.
 support for submitting large numbers of jobs, file staging and pre- and post-processing
 job submission process is the same for all compute clusters (local or external)
 utility tools 1 provide simple Grid-based extensions to standard UNIX commands: ulg-cp, ulg-ls, ulg-rm, ulg-mkdir etc.
 status commands available e.g. ulg-status, ulg-rqstat

1 based on GROWL scripts from STFC Daresbury

PENELOPE (serial code) workflows
(Two pipelines, submitted from the Portal and run on an HPC cluster.)

Phase-space file calculation:
 create random seeds for N input files using clonEasy [1]
 compute individual phase-space files
 combine N individual phase-space files

Patient treatment simulation:
 create random seeds for N input files using clonEasy [1]
 stage-in phase-space file (only if necessary)
 compute partial treatment simulation results
 combine partial treatment simulation results using clonEasy [1]
 repeat for other patients

[1] Badal A and Sempau J 2006 A package of Linux scripts for the parallelization of Monte Carlo simulations Comput. Phys. Commun. –50
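The "combine partial results" steps above amount to averaging N independent runs and estimating the statistical uncertainty from the spread between them. A minimal sketch of that post-processing (not the actual clonEasy scripts):

```python
import math

def combine_partials(means):
    """Combine independent per-job MC results (e.g. mean dose estimates)
    into an overall mean and a standard error estimated from the
    job-to-job spread."""
    n = len(means)
    mean = sum(means) / n
    # sample variance between the independent job means
    var = sum((m - mean) ** 2 for m in means) / (n - 1)
    stderr = math.sqrt(var / n)  # uncertainty of the combined mean
    return mean, stderr

# hypothetical per-job dose estimates from 8 independent runs
means = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.00]
mean, err = combine_partials(means)
```

Because the standard error falls as 1/sqrt(N), doubling the number of combined jobs reduces the statistical uncertainty by roughly a factor of 1.4.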

GCS job description files for PENELOPE (1)

#
# create phase space file (PSF)
#
job_type = remote_script
host = ulgbc2
total_jobs = 16
name = penelopeLPO
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain_acc6_LPO35_.in 16
indexed_input_files = penmain_acc6_LPO35_.in
input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_INDEX.in penmain_LPO35_INDEX.out
log = mylogfile
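The key = value format above is straightforward to parse. A minimal sketch, assuming '#' comment lines and comma-separated file lists (which matches the examples shown, though the real GCS parser may be more involved):

```python
def parse_job_description(text):
    """Parse a GCS-style job description (key = value lines, '#' comments)
    into a dict; comma-separated values become lists."""
    job = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment lines
        key, _, value = line.partition("=")
        value = value.strip()
        job[key.strip()] = ([v.strip() for v in value.split(",")]
                            if "," in value else value)
    return job

desc = """
# create phase space file (PSF)
job_type = remote_script
total_jobs = 16
input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat
"""
job = parse_job_description(desc)
```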

GCS job description files for PENELOPE (2)

#
# perform patient simulation using previously calculated phase space file (PSF)
#
job_type = remote_script
host = ulgbc2
name = penelope
total_jobs = 10
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain.in 10
staged_input_files = PSF_test.psf
input_remote_stage_dir = staging
input_files = water_phantom.geo, water.mat
indexed_input_files = penmain.in
executable = /usr/local/bin/run_penmain
arguments = penmainINDEX.in penmainINDEX.out ics
log = penelope.log

Condor job files for PENELOPE

# job description file
grid_resource = gt2 ulgbc2.liv.ac.uk/jobmanager-condorg
universe = grid
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_$(PROCESS).in penmain_LPO35_$(PROCESS).out ics_test
+ulg_job_name = penelopeLPO
log = log
transfer_input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat, penmain_acc6_LPO35_$(PROCESS).in
transfer_files = always
transfer_executable = FALSE
GlobusRSL = (count=1)(job_type=remote_script) \
    (input_working_directory=/condor_data/smithic/penelope/big_test/create_psf) \
    (job_name=penelopeLPO)
notification = never
queue 16

# DAG file
JOB pre_process dummy1.sub
JOB staging penelopeLPO35.sub
SCRIPT PRE pre_process /opt1/ulgrid/apps/penelope/seed_input_files
PARENT pre_process CHILD staging
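When Condor processes "queue 16" it expands the $(PROCESS) macro to 0..15, one value per queued job, which is how the 16 per-job input and output filenames above are generated from a single submit file. A small sketch of that expansion (illustrative only, not Condor's implementation):

```python
def expand_process_macro(template, n_jobs):
    """Expand the Condor $(PROCESS) macro the way 'queue N' does:
    one copy of the template per job, numbered 0 .. N-1."""
    return [template.replace("$(PROCESS)", str(p)) for p in range(n_jobs)]

args = expand_process_macro(
    "penmain_acc6_LPO35_$(PROCESS).in penmain_LPO35_$(PROCESS).out", 16)
```

Note that Condor numbers from 0, so the per-job input files must be named accordingly when the pre-processing script creates them.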

GCS job submission and monitoring

smithic(ulgp4)create_psf$ ulg-qsub penelopeLPO35
Grid job submitted successfully, Job ID is
smithic(ulgp4)create_psf$ ulg-qstat
Job ID  Job Name      Owner     State  Cores  Host
        penelopeLPO   smithic   pr     1      ulgbc2.liv.ac.uk
        penelope      vpanetti  r      1      ulgbc2.liv.ac.uk
        penelope      vpanetti  w      1      ulgbc2.liv.ac.uk
        penelope      smithic   si     1      ulgbc2.liv.ac.uk
        penelopeLPO   smithic   qw     1      ulgbc2.liv.ac.uk
        mcnpx3        colinb    r      64     ulgbc4.liv.ac.uk
        mcnpx3        colinb    r      64     lancs2.nw-grid.ac.uk
        gamess_test   bonarlaw  r      32     ngs.rl.ac.uk
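Output like the ulg-qstat listing above is easy to post-process, e.g. to tally how many cores are in each state. A minimal sketch, assuming simplified whitespace-separated columns (job name, owner, state, cores, host) and the state codes seen in the listing (r = running, qw = queued, etc.):

```python
from collections import Counter

def summarise_qstat(output):
    """Tally cores per job state from qstat-style output
    (header line skipped, whitespace-separated columns)."""
    cores_by_state = Counter()
    for line in output.strip().splitlines()[1:]:
        name, owner, state, cores, host = line.split()
        cores_by_state[state] += int(cores)
    return cores_by_state

sample = """Job_Name    Owner    State  Cores  Host
penelopeLPO smithic  pr     1      ulgbc2.liv.ac.uk
mcnpx3      colinb   r      64     ulgbc4.liv.ac.uk
mcnpx3      colinb   r      64     lancs2.nw-grid.ac.uk
gamess_test bonarlaw r      32     ngs.rl.ac.uk"""
summary = summarise_qstat(sample)
```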

Lung treatment simulated with PENELOPE and penVOX
 7 fields
 PSF calculation: 1.5 days (14 cores), approximately 1.5 million particles
 Patient calculation: 1.5 days for all 7 fields (single core)
 Statistical uncertainty 1% (1 sigma)

Proton absorbed dose in water using MCNPX
 2.5 cm diameter beam, full energy (~60 MeV at patient, ~3.2 cm range in water)
 500 million histories
 0.5 x 0.5 x 5 mm voxels
 50 keV proton cut-off
 <1% statistical uncertainty in absorbed dose in high dose region (1 sigma)
(figure: depth-dose curves showing the Bragg peak at full energy and at half-modulation)
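The quoted ~3.2 cm range for ~60 MeV protons can be sanity-checked against the empirical Bragg-Kleeman range-energy rule, R = alpha * E^p. The fit constants used below (alpha ≈ 0.0022 cm, p ≈ 1.77 for water) are standard textbook values, not taken from this work, so treat the result as an order-of-magnitude check only:

```python
def proton_range_water(energy_mev, alpha=0.0022, p=1.77):
    """Approximate range (cm) of a proton in water via the empirical
    Bragg-Kleeman rule R = alpha * E**p (textbook fit constants)."""
    return alpha * energy_mev ** p

r60 = proton_range_water(60.0)  # roughly 3 cm, consistent with the slide
```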

Future Directions
 Provide support for BEAM [1] and DOSXYZnrc [3] (based on the EGSnrc MC code [2])
 Utilise Liverpool Windows Condor Pool for running PENELOPE jobs
 Compare with other implementations e.g. RT-Grid

References:
[1] D. W. Rogers, B. Faddegon, G. X. Ding, C. M. Ma, J. Wei, and T. Mackie, "BEAM: A Monte Carlo code to simulate radiotherapy treatment units," Med. Phys. 22, 503–524 (1995).
[2] I. Kawrakow and D. W. O. Rogers. The EGSnrc Code System: Monte Carlo simulation of electron and photon transport. Technical Report PIRS-701 (4th printing), National Research Council of Canada, Ottawa, Canada.
[3] Walters B, Kawrakow I and Rogers D W O 2007 DOSXYZnrc Users Manual, Report PIRS-794 (Ottawa: National Research Council of Canada).