PyECLOUD for PyHEADTAIL: development work
G. Iadarola, A. Axford, H. Bartosik, K. Li, G. Rumolo
Electron cloud meeting – 14 May 2015
Many thanks to: A. Oeftiger, M. Schenk

The PyECLOUD-PyHEADTAIL simulation setup
We dropped the traditional approach of having separate tools for e-cloud buildup and instability simulations: PyECLOUD is now used to simulate the beam/e-cloud interaction directly within PyHEADTAIL. This is possible thanks to the highly modular (object-oriented) structure of the two codes.
The PyHEADTAIL bunch is passed through the PyHEADTAIL slicer and, for each slice, the PyEC4PyHT object performs the following steps (a sketch follows below):
- Evaluate the beam slice electric field (Particle In Cell)
- Generate seed electrons
- Compute the electron motion over t -> t+Δt (possibly with substeps)
- Detect impacts and generate secondaries
- Evaluate the electron electric field (Particle In Cell)
- Apply the kick to the beam particles
The initial electron distribution is taken from a PyECLOUD buildup simulation. The building blocks come partly from PyHEADTAIL, partly from PyECLOUD, and partly were developed ad hoc. The e-cloud interaction points are embedded in the usual PyHEADTAIL one-turn map (transverse tracking with Q', octupoles etc., longitudinal tracking, transverse feedback, impedances, space charge, ...).
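A minimal, self-contained sketch of this per-slice loop is given below. It is not the PyEC4PyHT implementation: the PIC solvers are replaced by an analytical round-Gaussian beam field, wall impacts, secondary emission and the back-kick on the beam are omitted, and all parameter values are illustrative.

```python
import numpy as np

EPS0 = 8.854187817e-12   # vacuum permittivity [F/m]
QE = 1.602176634e-19     # elementary charge [C]
ME = 9.1093837015e-31    # electron mass [kg]
C_LIGHT = 299792458.0    # speed of light [m/s]

def round_gaussian_field(x, y, sigma, lam):
    """Transverse E-field of a round Gaussian line charge of rms size sigma
    and line density lam [C/m], centred at the origin."""
    r2 = x**2 + y**2 + 1e-30
    e_over_r = lam / (2 * np.pi * EPS0) * (1 - np.exp(-r2 / (2 * sigma**2))) / r2
    return e_over_r * x, e_over_r * y

# toy bunch parameters (illustrative values, not a specific machine)
sigma_beam = 2e-3                        # transverse rms size [m]
bunch_length = 0.3                       # length used for slicing [m]
lam_beam = 1.15e11 * QE / bunch_length   # line density, uniform profile assumed
n_slices = 20
dt_slice = bunch_length / n_slices / C_LIGHT   # time the electrons see each slice

# initially uniform electron cloud in a 4 cm x 4 cm region
n_e = 5000
ex_pos = np.random.uniform(-0.02, 0.02, n_e)
ey_pos = np.random.uniform(-0.02, 0.02, n_e)
evx = np.zeros(n_e)
evy = np.zeros(n_e)

for i_slice in range(n_slices):          # slices traversed from head to tail
    # 1) field of the beam slice acting on the electrons (a PIC solver in the real code)
    Ex, Ey = round_gaussian_field(ex_pos, ey_pos, sigma_beam, lam_beam)
    # 2) push the electrons during the slice passage (substeps omitted here)
    evx += -QE / ME * Ex * dt_slice
    evy += -QE / ME * Ey * dt_slice
    ex_pos += evx * dt_slice
    ey_pos += evy * dt_slice
    # 3) the field of the electrons would now be computed (PIC again) and applied
    #    as a kick to the beam slice; wall impacts and secondaries are also skipped
    print(f"slice {i_slice:2d}: electron rms extent = {ex_pos.std() * 1e3:.2f} mm")
```

Even this crude toy shows the electron pinch toward the beam during the bunch passage, which is the effect the real per-slice PIC loop has to resolve.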

The PyECLOUD-PyHEADTAIL simulation setup
We dropped the traditional approach of having separate tools for e-cloud buildup and instability simulations: PyECLOUD is used to simulate the beam/e-cloud interaction directly within PyHEADTAIL, which is possible thanks to the highly modular (object-oriented) structure of the two codes.
Advantages of this approach:
- Profits from years of optimization and testing work on PyECLOUD and PyHEADTAIL
- All advanced e-cloud modeling features implemented in PyECLOUD become naturally available for beam dynamics simulations (arbitrary chamber shape, secondary electron emission, arbitrary magnetic field map, Boris electron tracker, accurate modeling of curved boundaries)
- From now on the two tools can share most of the development and maintenance work
This enables several new simulation scenarios (a sketch of how the e-cloud element fits into the one-turn map follows below):
- Scenarios where the electron-wall interaction cannot be neglected, e.g. long bunches (PS), doublet beams
- Quadrupoles (including the triplets, where for example one beam could be kept rigid...)
- Combined function magnets
- ...
... but simulations can become very heavy (> 1 week), so performance optimization is crucial!
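To illustrate the modularity argument, here is a sketch of how an e-cloud interaction element can be interleaved with transverse tracking segments in a one-turn map. The class names, the track() interface and the working point are illustrative assumptions and do not reproduce the actual PyHEADTAIL API.

```python
import numpy as np

class TransverseSegment:
    """Linear phase advance in normalized coordinates (smooth approximation)."""
    def __init__(self, dmu_x, dmu_y):
        self.cx, self.sx = np.cos(dmu_x), np.sin(dmu_x)
        self.cy, self.sy = np.cos(dmu_y), np.sin(dmu_y)

    def track(self, bunch):
        x, xp = bunch.x.copy(), bunch.xp.copy()
        bunch.x, bunch.xp = self.cx * x + self.sx * xp, -self.sx * x + self.cx * xp
        y, yp = bunch.y.copy(), bunch.yp.copy()
        bunch.y, bunch.yp = self.cy * y + self.sy * yp, -self.sy * y + self.cy * yp

class ECloudInteraction:
    """Stand-in for the PyEC4PyHT element: slice the bunch and apply the
    e-cloud kicks (see the per-slice sketch above)."""
    def track(self, bunch):
        pass

# one-turn map with e-cloud kicks interleaved between transverse segments;
# `bunch` can be any object exposing x, xp, y, yp arrays
Q_X, Q_Y = 26.13, 26.18        # illustrative working point
n_kicks = 35
one_turn_map = []
for _ in range(n_kicks):
    one_turn_map.append(TransverseSegment(2 * np.pi * Q_X / n_kicks,
                                          2 * np.pi * Q_Y / n_kicks))
    one_turn_map.append(ECloudInteraction())

def track_one_turn(bunch):
    for element in one_turn_map:
        element.track(bunch)
```

Because every element exposes the same track(bunch) interface, impedances, feedback, space charge or additional e-cloud interaction points can be added to the map without touching the rest of the code.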

The computational core: PyPIC
A key component of the e-cloud simulator is the Particle In Cell (PIC) Poisson solver, which takes more than 50% of the computational time.
Born as a buildup code, PyECLOUD did not allow simulating the e-cloud dynamics in free space, which had been used extensively in the past for HEADTAIL simulations, so it was impossible to start the development from a well-known model.
We therefore decided to reorganize our PIC Poisson solvers: we wrote a Python library (PyPIC) containing different PIC solvers with a common interface, which can be used as plug-in modules for PyECLOUD but also for other applications (e.g. space charge, beam-beam).
PyPIC is now available on the PyCOMPLETE git repository.
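As an illustration of the "plug-in solver" idea, the sketch below implements a rectangular PEC-boundary Poisson solver behind a single solve() method. It is a generic textbook construction (discrete sine transforms), not the PyPIC code, and the class and method names are assumptions.

```python
import numpy as np
from scipy.fft import dstn, idstn
from scipy.constants import epsilon_0

class RectangularFFTSolver:
    """Poisson solver on a rectangular grid with a perfectly conducting
    (phi = 0 on the boundary) chamber, based on discrete sine transforms.
    Sketch of a PyPIC-like plug-in solver; it does not reproduce the PyPIC API."""

    def __init__(self, nx, ny, dx, dy):
        # eigenvalues of the 5-point Laplacian with Dirichlet boundary conditions
        kx = np.arange(1, nx + 1)
        ky = np.arange(1, ny + 1)
        lam_x = (2.0 - 2.0 * np.cos(np.pi * kx / (nx + 1))) / dx**2
        lam_y = (2.0 - 2.0 * np.cos(np.pi * ky / (ny + 1))) / dy**2
        self.denominator = lam_x[:, None] + lam_y[None, :]

    def solve(self, rho):
        """Potential phi [V] on the grid for a charge density rho [C/m^3]."""
        rho_hat = dstn(rho / epsilon_0, type=1)
        return idstn(rho_hat / self.denominator, type=1)

# any object exposing the same solve() (plus scatter/gather for the particles)
# can be plugged into the simulation in its place, e.g. an open-boundary FFT
# solver or a finite-difference solver for arbitrarily shaped chambers
solver = RectangularFFTSolver(nx=128, ny=128, dx=1e-4, dy=1e-4)
rho = np.zeros((128, 128))
rho[64, 64] = 1e-9              # an arbitrary localized charge (toy example)
phi = solver.solve(rho)
```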

The computational core: PyPIC
PyPIC includes the following solvers:
- Open boundary FFT
- PEC rectangular boundary FFT
- PEC arbitrarily shaped boundary – Finite Differences – staircase approximation of curved boundaries
- PEC arbitrarily shaped boundary – Finite Differences – Shortley-Weller method for curved boundaries
- Bassetti-Erskine (not really a solver, for testing purposes)
plus test scripts that crosscheck the different modules against each other (for the special cases in which this is possible):
- Uniformly charged cylinder inside a circular boundary
- Uniformly charged cylinder inside a square boundary
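The cylinder crosschecks rely on the fact that, for a uniformly charged cylinder centred in a circular grounded chamber, the transverse field is known analytically (the induced charge on the concentric boundary contributes no field inside). A sketch of such a check, assuming a solver with a hypothetical get_field(x, y) method, could look like this:

```python
import numpy as np
from scipy.constants import epsilon_0

def field_uniform_cylinder(x, y, radius, lam):
    """Analytic transverse E-field of an infinitely long cylinder of radius
    `radius`, uniformly charged with line density `lam` [C/m], on axis at (0, 0)."""
    r = np.sqrt(x**2 + y**2)
    r_safe = np.maximum(r, 1e-30)
    e_r = np.where(r < radius,
                   lam * r / (2 * np.pi * epsilon_0 * radius**2),
                   lam / (2 * np.pi * epsilon_0 * r_safe))
    return e_r * x / r_safe, e_r * y / r_safe

# probe points along the horizontal axis
x_probe = np.linspace(-0.02, 0.02, 101)
y_probe = np.zeros_like(x_probe)
ex_ana, ey_ana = field_uniform_cylinder(x_probe, y_probe, radius=5e-3, lam=1e-9)

# a PIC solver exposing e.g. get_field(x, y) (hypothetical method name) would
# then be compared point by point against the analytic values:
# ex_num, ey_num = solver.get_field(x_probe, y_probe)
# assert np.allclose(ex_num, ex_ana, rtol=1e-2, atol=1e-3 * np.abs(ex_ana).max())
```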

The computational core: PyPIC
In PyECLOUD-PyHEADTAIL simulations the PIC solver is called twice per slice at each e-cloud interaction: e.g. 512 turns x 35 kicks x 64 slices gives about 2.3 x 10^6 field recalculations, so speed is crucial.
Performance optimization was carried out for all the modules:
- FFTW routines are used for the FFT-based solvers
- The LU factorization is precomputed and stored for the FD solvers (see the sketch below)
- Special properties of the relevant matrices have been exploited:
  o The FFT is performed block-wise, skipping blocks filled with zeros
  o Trivial equations for points outside the chamber are removed
On a large grid (3.2 x 10^6 nodes) the open-boundary solver is heavier (2x larger matrix), while for PEC boundaries FFT and FD have similar performance. The situation is different for smaller grids (3.2 x 10^5 nodes), but FD still shows the best performance.
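A minimal illustration of the "precompute and store the LU factorization" point for a finite-difference solver (a toy square chamber, not the PyPIC implementation):

```python
import numpy as np
import scipy.sparse as sps
from scipy.sparse.linalg import splu
from scipy.constants import epsilon_0

n = 200       # interior grid points per side (toy square chamber)
h = 1e-4      # grid spacing [m]

# 5-point Laplacian with Dirichlet (perfectly conducting wall) boundary conditions
lap_1d = sps.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2
eye = sps.identity(n)
laplacian = (sps.kron(eye, lap_1d) + sps.kron(lap_1d, eye)).tocsc()

# factorize ONCE (the expensive step)...
lu = splu(laplacian)

# ...and reuse the stored factorization for each of the ~10^6 right-hand sides
def solve_poisson(rho_grid):
    """Potential phi on the grid for a charge density rho [C/m^3]."""
    return lu.solve(-rho_grid.ravel() / epsilon_0).reshape(n, n)

rho = np.zeros((n, n))
rho[n // 2, n // 2] = 1e-9      # an arbitrary localized charge
phi = solve_poisson(rho)
```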

An extra boost to the Finite Difference solver
Even after these optimizations, a simulation for the LHC at 7 TeV would still take ~8 days, and the PIC was still completely dominant in the profiling, so we tried to optimize even further:
- An iterative method turned out to be slower
- A Cython-wrapped, C-implemented SuperLU gave performance similar to scipy...
  o ...but we learnt how to use C libraries in Python
- We found in the literature that a simpler algorithm (KLU) outperforms SuperLU for very sparse matrices
  o Implemented using Cython to wrap the available C implementation -> PyKLU
  o Available on the PyCOMPLETE git repository
It worked! (A sketch of how alternative factorization backends can be benchmarked follows below.)
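A sketch of how alternative factorization backends can be compared on the operation that matters here, the repeated solve with a stored factorization. The PyKLU call shown in the comment is an assumed interface, not a documented API.

```python
import time
import numpy as np
import scipy.sparse as sps
from scipy.sparse.linalg import splu

def benchmark(factorize, A, rhs, n_repeat=200):
    """Time the repeated solves with a precomputed factorization (the operation
    that dominates PyECLOUD-PyHEADTAIL runs once the factorization is stored)."""
    solver = factorize(A)
    t0 = time.perf_counter()
    for _ in range(n_repeat):
        solver.solve(rhs)
    return (time.perf_counter() - t0) / n_repeat

# very sparse test matrix: 2D 5-point Laplacian on a 200 x 200 grid
n = 200
lap_1d = sps.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n))
A = (sps.kron(sps.identity(n), lap_1d) + sps.kron(lap_1d, sps.identity(n))).tocsc()
rhs = np.random.rand(A.shape[0])

print("SuperLU (scipy):", benchmark(splu, A, rhs), "s per solve")
# a KLU-based backend could be plugged into the same harness, e.g. (assumed API):
# import PyKLU
# print("KLU:", benchmark(PyKLU.Klu, A, rhs), "s per solve")
```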

Optimization of other modules of PyECLOUD
Performance optimization through Cython and small C parts was applied also to other parts of the code:
- Polygonal chamber routines
  o Gives more flexibility -> made the implementation of the non-convex case much easier
- Boris tracker (a textbook sketch of the Boris step follows below)
  o Ready for the implementation of the generic multipole
These routines also offered the occasion for first parallelization tests in Cython:
  o Promising (x2 speedup with 4 cores), but for now not in the production version...
Side effect of the optimization work: PyECLOUD buildup simulations also became faster -> x2 gain on HL-LHC triplet simulations between Nov. and Nov. 2014
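For reference, the Boris push is a standard algorithm; a compact, non-relativistic textbook version (not the PyECLOUD implementation) looks like this:

```python
import numpy as np

QE = 1.602176634e-19
ME = 9.1093837015e-31

def boris_push(x, v, E, B, dt, q=-QE, m=ME):
    """One non-relativistic Boris step for particles in electric field E and
    magnetic field B (arrays of shape (n, 3)). Textbook sketch, not PyECLOUD code."""
    qm = q / m
    # half electric kick
    v_minus = v + 0.5 * qm * E * dt
    # magnetic rotation
    t = 0.5 * qm * B * dt
    s = 2.0 * t / (1.0 + np.sum(t * t, axis=1, keepdims=True))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    # second half electric kick
    v_new = v_plus + 0.5 * qm * E * dt
    # drift
    x_new = x + v_new * dt
    return x_new, v_new

# example: electrons gyrating in a dipole field B = (0, By, 0)
n = 1000
x = np.random.randn(n, 3) * 1e-3
v = np.random.randn(n, 3) * 1e5
E = np.zeros((n, 3))
B = np.tile([0.0, 1.0, 0.0], (n, 1))
for _ in range(100):
    x, v = boris_push(x, v, E, B, dt=1e-12)
```

The magnetic rotation step preserves the speed exactly, which is what makes the Boris scheme attractive for tracking electrons over many gyrations in strong magnetic fields.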

Validation procedure
It is very difficult to benchmark directly on instability simulations, so we started from more basic but deterministic checks against HEADTAIL, for a dipole and a drift, with 5000 probe particles starting from the same initial conditions in HEADTAIL and PyHEADTAIL:
1) Check of the single e-cloud kick (no betatron motion, change in angle) -> the e-cloud dynamics is correct, the forces are correctly applied on the beam, crosscheck of the synchrotron motion
2) Check with multiple e-cloud kicks (tune on the integer) -> the phase advance is correctly applied
3) Check with betatron motion enabled
4) Check over many turns in stable conditions (bunch-by-bunch tunes) -> the simulation is reasonably noise free (a sketch of such a tune estimation follows below)
5) Check over many turns in unstable conditions -> the coherent bunch motion is correct
(The original slides show HEADTAIL vs PyHEADTAIL comparison plots for these checks: the single kick at turn 1 and turn 10, turn 1 with 3 kicks, dipole and quadrupole tune spectra, and the unstable coherent motion.)
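For check 4, the tune can be estimated from the turn-by-turn bunch centroid, for example with an FFT and parabolic peak interpolation; this is a generic sketch, not the analysis script actually used for the benchmark.

```python
import numpy as np

def estimate_tune(centroid):
    """Estimate the fractional betatron tune from a turn-by-turn centroid
    signal, using an FFT with parabolic interpolation of the peak."""
    x = centroid - np.mean(centroid)
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
    k = np.argmax(spectrum[1:]) + 1                 # skip the DC bin
    # parabolic interpolation around the peak for sub-bin resolution
    if 1 <= k < len(spectrum) - 1:
        alpha, beta, gamma = spectrum[k - 1], spectrum[k], spectrum[k + 1]
        delta = 0.5 * (alpha - gamma) / (alpha - 2 * beta + gamma)
    else:
        delta = 0.0
    return (k + delta) / n

# toy check: a noisy signal with fractional tune 0.27
turns = np.arange(512)
signal = np.cos(2 * np.pi * 0.27 * turns) + 0.01 * np.random.randn(512)
print(estimate_tune(signal))   # ~0.27
```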

First pilot study
Instability thresholds as a function of intensity for the SPS at 26 GeV – Q26 vs Q20 – dipoles and drifts -> reasonably "lightweight", and the corresponding HEADTAIL simulations are already available.
We started with a simple model and complicated it in steps:
- Smooth approximation of the optics, D_x,y = 0, no boundary, linear longitudinal motion, uniform initial e- distribution
- Nonlinear longitudinal motion (implemented longitudinal losses, see the sketch below)
- Realistic chamber shape (see Kevin's talk) (implemented transverse losses)
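A minimal sketch of what "nonlinear longitudinal motion with longitudinal losses" can look like: a single-harmonic RF kick plus drift, with particles removed once they leave a simple momentum acceptance. The machine parameters are rough SPS-like values used only for illustration, signs and phases are simplified, and this is not the PyHEADTAIL longitudinal tracker.

```python
import numpy as np

C_RING = 6911.0          # circumference [m]
H_RF = 4620              # harmonic number (200 MHz system)
V_RF = 2e6               # RF voltage [V] (illustrative)
GAMMA = 27.7             # approximate Lorentz factor at 26 GeV/c
GAMMA_T = 22.8           # transition gamma (Q26-like, assumption)
E0 = 0.938272e9          # proton rest energy [eV]
ETA = 1.0 / GAMMA_T**2 - 1.0 / GAMMA**2
BETA = np.sqrt(1.0 - 1.0 / GAMMA**2)
P0_EV = BETA * GAMMA * E0      # reference momentum [eV/c]

def track_longitudinal(z, dp, n_turns):
    """z [m], dp = Δp/p; sinusoidal RF kick + drift, with a crude momentum cut
    standing in for the real bucket losses (stationary bucket, simplified signs)."""
    for _ in range(n_turns):
        dp = dp + V_RF * np.sin(2 * np.pi * H_RF * z / C_RING) / (BETA**2 * P0_EV)
        z = z - ETA * dp * C_RING
        alive = np.abs(dp) < 5e-3          # remove particles far outside the bucket
        z, dp = z[alive], dp[alive]
    return z, dp

z0 = np.random.randn(10000) * 0.2
dp0 = np.random.randn(10000) * 1e-3
z1, dp1 = track_longitudinal(z0, dp0, n_turns=200)
print(f"{len(z1)} of 10000 macroparticles survive the momentum cut")
```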

First pilot study
Each simulation (512 turns) is divided into a few jobs (4 x 128 turns) launched one after the other, automatically (see the sketch below):
- Easier to recover crashed simulations
- The single job size can be optimized for the lxplus queues
- A simulation can be "extended" a posteriori
For the same simulation settings we repeat the simulation 5 times with different seeds and take the average emittance evolution.
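A sketch of the job-chaining and seed-averaging machinery described above; the DummySimulation class stands in for the real PyECLOUD-PyHEADTAIL setup, and the file names and structure are illustrative.

```python
import pickle
import numpy as np

N_TURNS_PER_JOB = 128
N_JOBS = 4

class DummySimulation:
    """Placeholder for the real simulation object; it only needs to be picklable
    and to expose one-turn tracking plus an emittance observable."""
    def __init__(self, seed):
        self.rng = np.random.default_rng(seed)
        self.epsn_x = 2.5e-6
    def track_one_turn(self):
        self.epsn_x *= 1.0 + 1e-4 * self.rng.random()   # fake slow growth

def run_job(i_job, seed):
    if i_job == 0:
        sim = DummySimulation(seed)
    else:
        with open(f"state_seed{seed}_job{i_job - 1}.pkl", "rb") as f:
            sim = pickle.load(f)                         # resume the previous job
    emit = []
    for _ in range(N_TURNS_PER_JOB):
        sim.track_one_turn()
        emit.append(sim.epsn_x)
    with open(f"state_seed{seed}_job{i_job}.pkl", "wb") as f:
        pickle.dump(sim, f)                              # checkpoint for the next job
    np.save(f"emit_seed{seed}_job{i_job}.npy", emit)

# run the chained jobs for 5 seeds, then average the emittance evolution
seeds = range(5)
for seed in seeds:
    for i_job in range(N_JOBS):
        run_job(i_job, seed)
curves = [np.concatenate([np.load(f"emit_seed{s}_job{j}.npy") for j in range(N_JOBS)])
          for s in seeds]
print("average final emittance:", np.mean(curves, axis=0)[-1])
```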

PyECLOUD-PyHEADTAIL at work
Already used for studies for the SPS and for the LHC, profiting from features that were not available in HEADTAIL:
- Quadrupolar field
- Realistic chamber shape
- Correct tracking of the electrons in the horizontal plane for dipoles
- Simulations with a recorded pinch field map for tune footprint estimation

Summary and future work
PyECLOUD and PyHEADTAIL can be used together to simulate the effects of the e-cloud on beam dynamics.
The fields from the electrons and from the beam are calculated using PyPIC, a newly developed library including several different Poisson solvers.
Important work on performance optimization was necessary to make the computational time affordable.
The simulation results were validated against HEADTAIL.
The new tool has already been used for SPS and LHC simulations (see the following talks).
What next:
- We know that for quadrupoles we need the electron distribution from the buildup -> a robust way to obtain good resolution in the buildup is still to be developed; some work on the job management "infrastructure" is also to be done
- Investigate the effect of a non-uniform beta function
- Other important players (Q', octupoles, damper) should be included in the simulations
And then, to simulate the triplets:
- Double de-synchronized pinch
- Beta variation along the device
- Position variation along the device

Thanks for your attention!