Code description: pyORBIT

Code description: pyORBIT
H. Bartosik, with input from members of the space charge working group
ABP-CWG meeting, 2.3.2017

Outline
- Introduction
- Code features & description of the physics modeling
- Details about space charge modules
- Code implementation
- Impact on CERN studies
- Example of application (use case)
- Future plans and needs

Introduction to pyORBIT
General description
- Macro-particle simulation code mainly developed by S. Cousineau, J. Holmes and A. Shishlo (SNS, Oak Ridge), with contributions from colleagues at CERN and GSI
- Computational beam dynamics models for accelerators, with the main focus on space charge effects and related issues (H- injection with foil, collimation, ...) for a single bunch
- Modernized version of the ORBIT* code with Python as user interface
- Open source, available on GitHub: https://github.com/PyORBIT-Collaboration
Licensing policy
- MIT License (open source)
Documentation
- Mainly provided in the form of code examples demonstrating/testing individual modules
- Some documentation available on the WIKI page
- PTC interface described in notes available at the space charge working group website
* ORBIT was developed at Oak Ridge National Laboratory (ORNL) for designing the Spallation Neutron Source (SNS) accelerator complex

pyORBIT – overview of relevant features
The lattice is divided into "nodes"; additional features are activated by adding "nodes" to the lattice.
- Tracking: Teapot, PTC, matrices
- Injection: injection bump collapse, stripper foil scattering, painting
- Collimation: interaction of particles with matter, aperture restrictions, losses
- Diagnostics: emittance, bunch Twiss parameters, particle phase advances, ...
- Space charge: direct space charge (analytical or FFT PIC), indirect space charge (FFT PIC with perfectly conducting wall), longitudinal space charge

Tracking
Teapot lattice
- The symplectic Teapot tracker from ORBIT is included in pyORBIT
- Each machine element is represented as a "lattice node"
- Contains all standard elements (standard magnets, RF cavities, kickers, ...)
- Allows for acceleration and time-varying fields, alignment errors
- Lattice definition can be directly imported from MAD8 and MADX ("MAD parsers")
- Space charge, diagnostic and collimation nodes can be inserted as additional nodes
PTC
- An interface to PTC is available (i.e. PTC can be used for tracking)
- PTC lattices can be called in the form of a "PTC flat file" (generated directly with MADX, taking into account multipole and alignment errors)
- Time-varying fields, acceleration and double-harmonic RF systems through "PTC tables"
- Space charge, diagnostic and collimation nodes can be inserted in the form of "child nodes"
Linear tracking with matrices (a minimal sketch follows below)
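As a reminder of what "linear tracking with matrices" means in practice, here is a minimal self-contained sketch of transfer-matrix tracking; it is illustrative only and does not use pyORBIT's own matrix-lattice classes (element names and strengths are invented for the example).

```python
# Illustrative sketch of linear tracking with 2x2 transfer matrices
# (not pyORBIT's MatrixLattice API).
import numpy as np

def drift(L):
    """Horizontal transfer matrix of a drift of length L [m]."""
    return np.array([[1.0, L],
                     [0.0, 1.0]])

def thin_quad(kL):
    """Matrix of a thin quadrupole with integrated strength kL [1/m]."""
    return np.array([[1.0, 0.0],
                     [-kL, 1.0]])

# a toy FODO cell as an ordered list of "nodes" (matrices)
cell = [thin_quad(+0.5), drift(2.0), thin_quad(-0.5), drift(2.0)]

# track a few particles (rows: x [m], x' [rad]) through many cells
coords = np.array([[1e-3, 0.0],
                   [0.0, 1e-4]]).T          # shape (2, n_particles)
for turn in range(1000):
    for M in cell:
        coords = M @ coords                 # apply each node in sequence

print("final x, x':", coords[:, 0])
```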

Analytical ("frozen") space charge solver
- The transverse force due to space charge is derived from the closed expression for the electric field of a two-dimensional Gaussian (Bassetti-Erskine* formula); a sketch of the field evaluation follows below
- Transverse bunch dimensions obtained from local beta functions, emittances, dispersion and momentum spread (optics functions from PTC)
- The space charge force is weighted for each particle with the line density at its longitudinal position: from an analytical line density profile (Gaussian, parabolic, constant, ...) or user defined (array input)
- Beam parameters (intensity, transverse emittance, momentum spread, line density) and lattice parameters (Twiss parameters at the location of the space charge nodes) can be updated by the user whenever needed (otherwise truly frozen)
- Not self-consistent! (always transverse Gaussian)
- No coherent motion!
Interest
- "Noise free": allows using a small number of macro particles to represent the beam
- Studies of single-particle dynamics in the presence of space charge (e.g. frequency maps)
* M. Bassetti and G.A. Erskine, "Closed expression for the electrical field of a two-dimensional Gaussian charge", CERN-ISR-TH/80-06
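A minimal sketch of the Bassetti-Erskine field evaluation using the Faddeeva function, assuming sigma_x > sigma_y and coordinates in the first quadrant; this is an illustration of the formula, not pyORBIT's implementation.

```python
# Bassetti-Erskine field of a 2D Gaussian charge distribution via the
# Faddeeva function w(z) = exp(-z^2) * erfc(-i z).
import numpy as np
from scipy.special import wofz

EPS0 = 8.8541878128e-12   # vacuum permittivity [F/m]

def bassetti_erskine(x, y, sigx, sigy, lam):
    """E_x, E_y [V/m] of a 2D Gaussian with rms sizes sigx > sigy and
    line charge density lam [C/m]; valid for x, y >= 0 (use symmetry
    for the other quadrants)."""
    assert sigx > sigy
    s = np.sqrt(2.0 * (sigx**2 - sigy**2))
    z1 = (x + 1j * y) / s
    z2 = (x * sigy / sigx + 1j * y * sigx / sigy) / s
    expo = np.exp(-x**2 / (2 * sigx**2) - y**2 / (2 * sigy**2))
    f = lam / (2 * EPS0 * np.sqrt(np.pi) * s) * (wofz(z1) - expo * wofz(z2))
    return f.imag, f.real   # f = E_y + i E_x

# example: field a couple of sigma away from the beam axis
Ex, Ey = bassetti_erskine(2e-3, 1e-3, sigx=2e-3, sigy=1e-3, lam=1e-9)
print(Ex, Ey)
```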

2D FFT PIC solver
- The 2D FFT particle-in-cell Poisson solver is a standalone building block used for several space charge modules in pyORBIT
- Solves Poisson's equation in the beam frame (1/γ² factor)
- Convolution (FFT) method*: the charge density and the Green function are Fourier transformed to the frequency domain and multiplied; the potential is obtained by the inverse transform (see the sketch below)
- A perfectly conducting boundary can be added (indirect space charge)
  - Circular, elliptical, rectangular or arbitrary geometry defined by the user
  - The potential on the grid is modified by adding a free-space potential such that the sum of the two potentials becomes zero on the boundary in a least-squares sense**
* R.W. Hockney and J.W. Eastwood, "Computer Simulation Using Particles", IOP Publishing Ltd, Adam Hilger, p. 211 (1988)
** F.W. Jones, "A method for incorporating image forces in multiparticle tracking with space charge", EPAC2000, Vienna
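A minimal sketch of the FFT convolution (Hockney-Eastwood) method for the open-boundary 2D Poisson problem: the Green function is sampled on a doubled grid (zero padding) to suppress the periodic images implied by the FFT. Grid sizes and the point-charge test are illustrative; this is not pyORBIT's solver.

```python
import numpy as np

def solve_poisson_2d(rho, h):
    """Free-space potential phi [V] on an N x N grid from charge density
    rho [C/m^2] with spacing h [m], via FFT convolution with the 2D
    Green function G(r) = -ln(r) / (2*pi*eps0)."""
    eps0 = 8.8541878128e-12
    n = rho.shape[0]
    # Green function on the doubled (2N x 2N) grid with signed distances
    ix = np.arange(2 * n); ix[n:] -= 2 * n
    X, Y = np.meshgrid(ix * h, ix * h, indexing='ij')
    r = np.hypot(X, Y)
    r[0, 0] = 0.5 * h                               # regularize r = 0
    G = -np.log(r) / (2 * np.pi * eps0)
    # zero-pad the charge density and convolve in the frequency domain
    rho_pad = np.zeros((2 * n, 2 * n)); rho_pad[:n, :n] = rho
    phi_pad = np.fft.irfft2(np.fft.rfft2(G) * np.fft.rfft2(rho_pad),
                            s=(2 * n, 2 * n))
    return phi_pad[:n, :n] * h * h                  # area element of the sum

# toy example: a single charged cell in the middle of the grid
rho = np.zeros((64, 64)); rho[32, 32] = 1e-9
phi = solve_poisson_2d(rho, h=1e-3)
```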

PIC models (the most commonly used)
"2.5D" space charge solver
- The transverse distribution is binned onto a single 2D grid
- The potential is obtained with the 2D FFT Poisson solver
- Transverse kicks are weighted by the local line density λ obtained from a longitudinal 1D grid (see the sketch below)
- Longitudinal space charge can be included (line density derivative, transversely uniform beam)
- Perfectly conducting boundary conditions can be used
"Slice-by-slice 2D" space charge solver
- Particles are binned onto a 3D grid
- The potential is solved on each 2D grid slice
- Grid slices "feel" the nearest neighbors only
- Transverse and longitudinal kicks from the gradient of the potential on the 3D grid
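A minimal sketch of the line-density weighting step of a 2.5D-type model: the longitudinal positions are binned on a 1D grid and the transverse kick of each particle is scaled with λ(z). The binning scheme and the placeholder field array are illustrative assumptions, not pyORBIT's module.

```python
import numpy as np

def line_density(z, n_bins, z_min, z_max, q_macro):
    """Charge line density [C/m] on a 1D grid from macro-particle positions."""
    counts, edges = np.histogram(z, bins=n_bins, range=(z_min, z_max))
    dz = edges[1] - edges[0]
    return counts * q_macro / dz, edges

def local_lambda(z, lam, edges):
    """Line density (nearest bin) at each particle position."""
    idx = np.clip(np.digitize(z, edges) - 1, 0, len(lam) - 1)
    return lam[idx]

# toy usage: Ex_norm is the transverse field per unit line density,
# e.g. from the 2D FFT Poisson solver of the projected distribution
rng = np.random.default_rng(0)
z = rng.normal(0.0, 1.0, 10000)                  # longitudinal positions [m]
lam, edges = line_density(z, 64, -4.0, 4.0, q_macro=1e-12)
Ex_norm = np.ones_like(z)                        # placeholder field values
kick_x = Ex_norm * local_lambda(z, lam, edges)   # kick scales with lambda(z_i)
```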

Longitudinal space charge
Longitudinal space charge impedance
- From the line density derivative, evaluated once per turn
- Transversely uniform round beam in a circular pipe (A. Chao)
- Evaluated on axis → no transverse dependence (a sketch of the kick follows below)
Longitudinal kicks from the slice-by-slice 2D module
- Derivative of the potential between 2D slices
- Arbitrary boundary conditions can be used
Longitudinal kicks from the 2.5D module (special version)
- Calculated from the line density derivative assuming a transversely uniform beam
- Transverse dependence: for each particle, integration from its actual radial position r to the beam pipe radius a, after calculating the beam size b of the macro-particle distribution (R. Baartman)
(Figure: longitudinally Gaussian bunch with transverse KV distribution in a circular pipe)
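A minimal sketch of the once-per-turn longitudinal space charge kick from the line density derivative, for a round uniform beam of radius b inside a circular pipe of radius a. It uses the textbook on-axis field E_z = -g/(4πε0γ²) dλ/ds with geometry factor g = 1 + 2 ln(a/b); the exact conventions in the code may differ.

```python
import numpy as np

EPS0 = 8.8541878128e-12

def longitudinal_sc_kick(lam, ds, a, b, gamma, q, circumference):
    """Energy kick per turn [eV] on each longitudinal grid point.
    lam: line density [C/m] on a grid with spacing ds [m];
    q: particle charge in units of e."""
    g = 1.0 + 2.0 * np.log(a / b)                  # on-axis geometry factor
    dlam_ds = np.gradient(lam, ds)                 # d(lambda)/ds on the grid
    E_z = -g / (4.0 * np.pi * EPS0 * gamma**2) * dlam_ds   # [V/m]
    return q * E_z * circumference                 # applied once per turn

# toy usage: Gaussian line density of a bunch of 1e12 elementary charges
sigma_z = 2.0
s = np.linspace(-10.0, 10.0, 256)                  # [m]
lam = 1e12 * 1.602e-19 * np.exp(-s**2 / (2 * sigma_z**2)) \
      / (sigma_z * np.sqrt(2 * np.pi))
dE = longitudinal_sc_kick(lam, s[1] - s[0], a=0.05, b=0.01,
                          gamma=2.5, q=1.0, circumference=628.0)
```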

Diagnostic modules
Bunch TWISS analysis
- Calculates statistical moments and correlations
- Provides statistical beam Twiss parameters, dispersion, beam emittance (with or without the dispersive part, ...); a sketch follows below
Bunch tune analysis
- Transforms the transverse phase space coordinates into normalized phase space at a given location in the lattice
- Calculates the phase advance per turn for each particle (and actually not the tune!)
Losses
- Information on lost particles including the loss location
- Aperture limitations can be installed at the pyORBIT level, and apertures from PTC are also taken into account
(Figure: V. Forte)
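A minimal sketch of computing statistical Twiss parameters and the rms emittance from second-order moments of the macro-particle coordinates (dispersive contribution not subtracted here); illustrative only, not the pyORBIT diagnostics module itself.

```python
import numpy as np

def statistical_twiss(x, xp):
    """Return rms emittance [m rad], beta [m] and alpha from particle
    coordinates x [m] and xp [rad]."""
    dx = x - x.mean()
    dxp = xp - xp.mean()
    sig_xx = np.mean(dx * dx)
    sig_xxp = np.mean(dx * dxp)
    sig_xpxp = np.mean(dxp * dxp)
    eps = np.sqrt(sig_xx * sig_xpxp - sig_xxp**2)   # rms emittance
    beta = sig_xx / eps
    alpha = -sig_xxp / eps
    return eps, beta, alpha

# toy usage with an uncorrelated Gaussian distribution
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1e-3, 100000)
xp = rng.normal(0.0, 1e-4, 100000)
print(statistical_twiss(x, xp))
```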

Code implementation
Programming language(s)
- User interface via Python (simulation runs are actually Python scripts); a sketch of the structure of a run script follows below
- The computationally intensive code of pyORBIT is written in C++, some functions in Python
- PTC is written in Fortran
- Wrapper functions make the C++ routines and PTC tracking available at the Python level
- Parallelization is based on MPI (more details on the next slide)
Prerequisites to run the code
- Python 2.6.6, C++, Fortran (if using PTC)
- Can be compiled on Linux, Mac OS X and Windows
- The source code and the compiled executable are available on AFS: /afs/cern.ch/project/LIUsc/space_charge/Codes/py-orbit
Where is the code run?
- LXPLUS (mostly for quick checks and debugging)
- Space charge cluster (presently only 20 boxes with 16 cores each, networked for MPI)
- Was tested on the Bologna HPC cluster
- HPC cluster at CERN?
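A structural sketch of what a pyORBIT run script looks like. The module, class and method names below are quoted from memory and should be treated as assumptions; the code examples shipped with the repository are the authoritative reference.

```python
# Sketch of a pyORBIT-style run script (names approximate, see repo examples).
from bunch import Bunch                      # assumed module/class names
from orbit.teapot import TEAPOT_Lattice

# build the lattice from a MAD file ("MAD parser"); file/sequence names are placeholders
lattice = TEAPOT_Lattice()
lattice.readMAD("machine.lat", "RING")

# create a bunch and add macro-particles (x, x', y, y', z, dE)
bunch = Bunch()
bunch.getSyncParticle().kinEnergy(0.160)     # kinetic energy [GeV], example value
for i in range(1000):
    bunch.addParticle(0.001, 0.0, 0.001, 0.0, 0.0, 0.0)

# space charge, diagnostics or collimation nodes would be added to the
# lattice here, before tracking turn by turn
for turn in range(100):
    lattice.trackBunch(bunch)
```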

Parallelization strategy
- The particles are distributed equally between all CPUs during the initialization of the simulation
(Figure: example for one lattice node; s = position along the machine circumference)
- Tracking through the lattice node: each CPU tracks its own particles
- Diagnostics node: in some cases requires MPI communication (e.g. emittance calculation)
- Space charge node with PIC solver (see the sketch below):
  - Each CPU bins its particles onto its own grid
  - The grids are synchronized with MPI
  - Each CPU solves the electric potential for a subset of the 2D grids
  - The solution is propagated to the other CPUs with MPI
  - Each CPU applies the kicks to its own particles
- Collimation node: each CPU removes its lost particles from the bunch
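A minimal mpi4py sketch of the grid synchronization step of the PIC node: each rank bins only its own particles, then the local charge grids are summed across all ranks. Grid sizes and the random particle data are illustrative; this is not pyORBIT's implementation.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# each rank holds its own share of the macro-particles
rng = np.random.default_rng(rank)
x = rng.normal(0.0, 1e-3, 10000 // size)
y = rng.normal(0.0, 1e-3, 10000 // size)

# local binning onto the 2D grid
rho, _, _ = np.histogram2d(x, y, bins=64,
                           range=[[-5e-3, 5e-3], [-5e-3, 5e-3]])

# synchronize the grids: after this call every rank holds the full density
comm.Allreduce(MPI.IN_PLACE, rho, op=MPI.SUM)

# ... each rank would now solve (part of) the potential and apply the kicks
# to its own particles only
if rank == 0:
    print("total macro-particles on grid:", rho.sum())
```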

Impact on CERN studies
Understanding and modeling of space charge in the LHC injectors is essential for LIU
- Tight requirements on emittance growth and losses
PSB: optimizing beam brightness (fast beam behavior at low energy)
- Prediction of the PSB brightness curve with Linac4 with optimized working point
- H- charge exchange injection with painting and capture in double RF on the ramp
- Impact of the falling chicane bump including multipoles generated by eddy currents, beta-beat compensation, effect of resonances on the final emittance, emittance blow-up from the stripper foil with the lower current from Linac4, ...
PS & SPS: emittance and loss evolution on the injection plateau (fast & long-term)
- Understand performance limitations due to machine and space charge resonances
- Quantitative simulations of losses and blow-up as a function of working point, for optimization of the working point and machine settings
- Beam halo and tail creation
- Fast blow-up at injection (PS in particular)
- Prediction of achievable brightness at the 2 GeV PS injection energy

Example usage: PS (the most challenging ...)
- Studies of the 8th-order structural space-charge-driven resonance at Qy = 6.25
- Measurements of emittance and losses after 1.2 s storage time as a function of Qy
- PIC model for short-term behavior (runs for different tunes, ...)
  - 1 run: 1700 nodes, grid 64x64x20, 200k macro particles, 13 days for 10k turns (16 CPUs)
(Figures: F. Asvesta, M. Serluca et al.)

Example usage: PS (continued)
- Analytical ("frozen") model for long-term behavior (runs for different tunes, ...)
  - 1 run: 1700 nodes, 5k macro particles, 14 days for 300k turns (16 CPUs)
  - Cross-check with PIC over 300k turns (0.6 s in the PS)
(Figure: F. Asvesta)

Future plans and needs
Maintenance, extension and further development
- pyORBIT is actively maintained and further developed by colleagues from SNS
- Minor development at CERN for special features needed for LIU studies (mainly by fellows and PhD students)
Performance improvement?
- No active work in this direction on the pyORBIT code side
Is the performance in general adequate to the present needs?
- Performance could be faster ... a substantial part of the computation time is spent on PTC tracking
What type of hardware resources would be best suited for the physics case?
- Multi-core boxes properly networked for MPI
- Presently using the CERN "space charge cluster": 20 boxes with 16 cores each → medium-term plan is to use the Bologna HPC cluster
- Long-term hope is to get an HPC cluster at CERN ... presently the computing resources are not adequate