PyECLOUD G. Iadarola, G. Rumolo ECLOUD meeting 28/11/2011 Thanks to: R. De Maria, K. Li.


Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

Why a new code for electron cloud build-up simulation? At the very beginning our idea was to reorganize ECLOUD in order to make it easier to develop new features, to identify and correct present and future bugs, and to access more information about our simulations.
This task turned out to be quite hard, since:
The code is scarcely modular: the frequent use of global variables makes it very difficult to extract subroutines and build self-consistent modules for development and testing purposes.
In Fortran 77 it is very difficult to write readable and flexible code (even proper indentation is a non-trivial task).
We concluded that the effort of writing a fully reorganized code, in a newer and more powerful language, would be compensated by a significantly increased efficiency in future development and debugging.

Python: a powerful general-purpose interpreted language.
It allows incremental and interactive development of the code, encouraging a highly modular structure.
Reading and understanding the code, writing extensions and exploring different solutions is much faster (orders of magnitude!) than with compiled languages (especially FORTRAN 77).
Open-source libraries (like NumPy and SciPy) make it a very powerful tool for scientific computation (comparable to a specialized commercial tool like MATLAB).
Computationally intensive parts can be written in C/C++ or FORTRAN and easily integrated into the Python code (successfully employed in our distribution computation/interpolation routines – 6x overall speed-up).

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

From ECLOUD to PyECLOUD
Writing PyECLOUD has required a detailed analysis of the ECLOUD algorithm and implementation, drawing also on the long related experience in electron cloud simulations. Particular attention has been devoted to identifying ECLOUD's limitations, especially its convergence properties with respect to the electron distribution in bending magnets (how many stripes…).
As a result, several features have been significantly modified in order to improve the code's performance in terms of: accuracy, efficiency, flexibility.

Ingredients for e-cloud build-up simulation
1. Seed electron generation (gas ionization, photoemission)
2. Force exerted by the beam on the e-
3. Force exerted by the e- on each other (space charge)
4. Equations of motion (also in the presence of an external magnetic field)
5. Secondary emission
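As an illustration of item 4, here is a minimal sketch of one standard way to integrate the electron equations of motion in electric and magnetic fields (the Boris scheme). This is only an example of the kind of update involved; it is not necessarily the integrator used in ECLOUD or PyECLOUD.

```python
import numpy as np

QE = 1.602176634e-19   # elementary charge [C]
ME = 9.1093837015e-31  # electron mass [kg]

def boris_push(x, v, E, B, dt, q=-QE, m=ME):
    """One Boris step: positions x [m] and velocities v [m/s] are (N, 3) arrays,
    E [V/m] and B [T] are the fields evaluated at the particle locations, also (N, 3)."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                              # first half electric kick
    t = qmdt2 * B                                        # rotation vector
    s = 2.0 * t / (1.0 + np.sum(t * t, axis=1, keepdims=True))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)              # magnetic rotation
    v_new = v_plus + qmdt2 * E                           # second half electric kick
    return x + v_new * dt, v_new
```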

From ECLOUD to PyECLOUD
The most relevant improvements introduced in PyECLOUD are:
A different management of macroparticle size and number
A more accurate back-tracking algorithm for the impacting electrons
A more efficient computation of the electric field generated by the travelling proton beam
A more general and accurate method for the evaluation of the electrons' space-charge field

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

Macroparticles
The simulation of the dynamics of the entire number of electrons (≈10^10 per meter) is extremely heavy (practically unaffordable). Since the equation of motion depends only on the charge-to-mass ratio q/m of the electron, a macroparticle (MP) method can be used. The MP size can be seen as the "resolution" of our electron gas simulation.

Macroparticles
In an electron-cloud build-up, due to the multipacting process, the electron number spans several orders of magnitude: it is practically impossible to choose a single MP size that is suitable for the entire simulation (allowing both a satisfactory description of the phenomenon and a computationally affordable number of MPs).

ECLOUD – MP number control: the CLEAN routine
In ECLOUD the clean routine is called once per bunch passage and whenever the number of MPs goes beyond 2×10^5. If the number of MPs N_mp is > 5×10^4, it randomly eliminates MPs in order to obtain N_mp ≈ 5×10^4. The charge of the remaining MPs is then rescaled in order to obtain the same total charge that was present in the chamber before the clean. This approach guarantees charge conservation but does not preserve the total energy or the velocity distribution of the electrons.
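A schematic sketch of such a clean-and-rescale step (purely illustrative, with made-up array names; not the actual ECLOUD routine):

```python
import numpy as np

def clean_and_rescale(nel_mp, x_mp, v_mp, n_target=50_000, rng=None):
    """Randomly keep ~n_target macroparticles and rescale their sizes so that the
    total charge in the chamber is unchanged (total energy is NOT preserved)."""
    rng = rng or np.random.default_rng()
    n_mp = len(nel_mp)
    if n_mp <= n_target:
        return nel_mp, x_mp, v_mp
    keep = rng.choice(n_mp, size=n_target, replace=False)
    nel_kept = nel_mp[keep]
    nel_kept = nel_kept * (nel_mp.sum() / nel_kept.sum())   # restore the total charge
    return nel_kept, x_mp[keep], v_mp[keep]
```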

ECLOUD – MP number control: other observations
In a typical ECLOUD simulation (SPS MBB, SEY = 1.5):
The number and size of the MPs produced by the seed generation mechanism are kept constant during the entire simulation. As a consequence, at saturation, we have a certain number of MPs carrying practically no charge (≈5% of the MPs carry ≈1 ppm of the total charge).
At saturation, there is a considerable number of MPs which are trapped near the chamber wall and undergo several low-energy impacts, becoming smaller and smaller. In ECLOUD there is no mechanism that selectively eliminates them.

PyECLOUD – MP number control: a global approach
In PyECLOUD we try to treat in a unified way all the issues related to MP size and number, namely:
1) Given the number of seed e- to be generated in a certain time step, how many MPs do we generate (i.e. how do we choose their size)?
2) When a MP hits the wall, when is it sufficient to change the size of the current MP according to the Secondary Emission Yield, and when must true secondary MPs be created?
3) When is a MP (after some low-energy impacts) considered so small that it can be eliminated without affecting the simulation?
4) What should be done when, because of the multipacting process, the number of MPs becomes too large for a reasonable computational burden?

PyECLOUD – MP number control: a global approach
We control all these issues through a single parameter N_ref, which is the target average size of the MPs (i.e. our resolution) and is adaptively changed during the simulation:
1) Given the number of seed e- to be generated, the generated MPs have size N_ref and their number is chosen as a consequence.
2) When a MP hits the wall, additional true secondary MPs are emitted if the total emitted charge is > 1.5 N_ref; in that case, their number is chosen so that their size is as close as possible to N_ref.
3) At each bunch passage a clean function is called that eliminates all the MPs with charge < 10^-4 N_ref.
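A sketch of how rules 2) and 3) above could look in code (illustrative only; the 1.5·N_ref and 10^-4·N_ref thresholds are taken from the text, everything else is an assumption):

```python
import numpy as np

def emitted_mp_sizes(nel_impact, sey, n_ref):
    """Rule 2: decide how the charge emitted after a wall impact is represented.
    Returns the list of MP sizes replacing the impacting MP."""
    nel_emit = sey * nel_impact
    if nel_emit <= 1.5 * n_ref:
        return [nel_emit]                          # just rescale the impacting MP
    n_new = max(1, int(round(nel_emit / n_ref)))
    return [nel_emit / n_new] * n_new              # sizes as close as possible to n_ref

def clean_small_mps(nel_mp, n_ref):
    """Rule 3: boolean mask selecting the MPs to keep (charge above 1e-4 * n_ref)."""
    return nel_mp > 1e-4 * n_ref
```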

PyECLOUD – MP regeneration
4) When the number of MPs becomes larger than a certain threshold (≈10^5), meaning that the computational burden is becoming too high, we change the MP target size N_ref and perform a regeneration of the MP system:
a. Each macroparticle is assigned to a cell of a uniform grid in the 5-D space (x, y, vx, vy, vz), obtaining an approximation of the phase-space distribution.
b. The new target MP size is chosen accordingly.
c. A new MP set, having the new reference size, is generated according to the computed distribution.
All moments related to position and velocity (e.g. energy distribution, charge distribution) are preserved. The error on the total charge and total energy does not go beyond 1-2%.
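A sketch of the regeneration step based on a 5-D phase-space histogram (hypothetical grid sizes and array names; the actual PyECLOUD implementation may differ in the details):

```python
import numpy as np

def regenerate(x, y, vx, vy, vz, nel_mp, n_ref_new, bins=(30, 30, 10, 10, 10)):
    """Bin the MPs on a uniform grid in (x, y, vx, vy, vz) and resample a new MP
    set with target size n_ref_new; the charge in each cell is preserved."""
    sample = np.column_stack([x, y, vx, vy, vz])
    hist, edges = np.histogramdd(sample, bins=bins, weights=nel_mp)
    centers = [0.5 * (e[1:] + e[:-1]) for e in edges]
    new_mps = []                                   # rows: (x, y, vx, vy, vz, nel)
    for idx in zip(*np.nonzero(hist)):
        q_cell = hist[idx]
        n_new = max(1, int(round(q_cell / n_ref_new)))
        coords = [centers[d][idx[d]] for d in range(5)]
        new_mps += [coords + [q_cell / n_new]] * n_new
    return np.array(new_mps)
```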

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

Impacting electrons backtracking - ECLOUD
Electrons impacting on the chamber's wall are detected when they are found outside the chamber and must be backtracked to the wall. ECLOUD applies a simple rescaling of the electron's position. This forces the choice of very small time steps in order to avoid perturbations of the electron cloud distribution.

Impacting electrons backtracking - PyECLOUD
Electrons impacting on the chamber's wall are detected when they are found outside the chamber and must be backtracked to the wall. PyECLOUD always stores the MPs' positions at the previous time step in order to compute the crossing point between the chamber's wall and the electron's trajectory. Together with the improved space-charge computation, this simple trick has a great impact on the convergence properties with respect to the time step (as we will see in a few slides).
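A sketch of the underlying geometry for an elliptical chamber: knowing the position at the previous time step (inside) and the current one (outside), the wall-crossing point is found from the intersection of the straight trajectory with the ellipse (illustrative only, not the actual PyECLOUD routine):

```python
import numpy as np

def wall_crossing_ellipse(x_old, y_old, x_new, y_new, a, b):
    """Crossing point of the segment (x_old, y_old) -> (x_new, y_new) with the
    ellipse (x/a)**2 + (y/b)**2 = 1, assuming the old point is inside and the
    new one outside (so exactly one crossing with parameter 0 < t <= 1)."""
    dx, dy = x_new - x_old, y_new - y_old
    A = (dx / a) ** 2 + (dy / b) ** 2
    B = 2.0 * (x_old * dx / a ** 2 + y_old * dy / b ** 2)
    C = (x_old / a) ** 2 + (y_old / b) ** 2 - 1.0
    t = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)   # root in (0, 1]
    return x_old + t * dx, y_old + t * dy
```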

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

ECLOUD - Beam field computation
In ECLOUD the electric field of the beam is computed at each time step at each electron location, as the sum of the beam field calculated in free space (based on the Bassetti-Erskine formula) and the image charge contributions (the effect of the perfectly conducting elliptical chamber, built from the images of a point charge). This task takes a large part of the computation time (70% with a quite coarse time step).

PyECLOUD - Beam field computation
We can exploit the information available about the beam structure to make its field evaluation much more efficient. The beam is modeled as a travelling charge distribution, where λ(s) is the longitudinal charge density of the beam.

PyECLOUD - Beam field computation
If we consider a beam charge that is perfectly concentrated in s and has the same shape as our beam in the transverse plane, we obtain the field in a fixed transverse form. It is easy to prove that, in the case of a generic longitudinal distribution, the field can be written as the product of this transverse field map and the local longitudinal density.
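The formulas referred to on this slide are not reproduced in the transcript. A plausible reconstruction, consistent with the rescaling procedure described on the next slide and assuming an ultrarelativistic beam (field essentially transverse), is:

```latex
% Beam modeled as a travelling charge distribution with longitudinal density \lambda:
\rho_{\mathrm{beam}}(x, y, s; t) = \lambda(s - ct)\, g_\perp(x, y)
% For a slice of unit line density one pre-computes a transverse field map
% \mathbf{E}_\perp(x, y); for a generic longitudinal profile the beam field is then
\mathbf{E}(x, y, s; t) \simeq \lambda(s - ct)\, \mathbf{E}_\perp(x, y)
```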

PyECLOUD - Beam field computation
This suggests the following procedure:
The field map for the relevant chamber geometry and beam shape is pre-computed on a suitable rectangular grid and loaded from file in the initialization stage.
When the field at a certain location is needed, a linear (4-point) interpolation algorithm is employed.
The field is then rescaled by the relevant beam longitudinal density.
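A sketch of the corresponding look-up: bilinear interpolation on the pre-computed map (for unit line density) followed by rescaling with the longitudinal density. All names and the map layout are assumptions for illustration:

```python
import numpy as np

def beam_field(xp, yp, Ex_map, Ey_map, x0, y0, dh, lam_local):
    """Beam field at the electron positions (xp, yp): 4-point (bilinear)
    interpolation on a field map pre-computed for unit line density, then
    rescaled by the local longitudinal density lam_local."""
    fx = (xp - x0) / dh
    fy = (yp - y0) / dh
    ix, iy = fx.astype(int), fy.astype(int)
    wx, wy = fx - ix, fy - iy

    def gather(F):
        return ((1 - wx) * (1 - wy) * F[ix, iy] + wx * (1 - wy) * F[ix + 1, iy]
                + (1 - wx) * wy * F[ix, iy + 1] + wx * wy * F[ix + 1, iy + 1])

    return lam_local * gather(Ex_map), lam_local * gather(Ey_map)
```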

PyECLOUD - Beam field computation
This approach gives several advantages:
Improved computational efficiency.
Improved field accuracy (since the field map is evaluated "offline", there are no time constraints on its computation).
We are no longer restricted to elliptical chambers.

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

Electron space charge - ECLOUD
In ECLOUD, the electric field due to the electrons in the chamber is calculated as follows:
The charge of each electron is attributed to the nearest node of a predefined uniform rectangular grid.
The electric field of this grid of point charges is calculated on the points of the same grid by adding to their free-space field the image terms related to the elliptical perfectly conducting boundary.
The electric field at the location of each electron is evaluated employing a linear (4-point) interpolation formula.

Electron space charge - ECLOUD
This approach presents some drawbacks:
The evaluation of the field at grid points that are very close to the chamber's surface requires a large number of image terms (≈200).
To avoid pathological situations, the grid points closest to the chamber's wall are not used, but this can affect the accuracy of the space-charge field near the wall.
This kind of approach is limited to elliptical geometries.

Electron space charge - PyECLOUD
In PyECLOUD we have implemented a 2D Particle-In-Cell (PIC) algorithm. The electrostatic potential φ(x,y) in the chamber is the solution of the Poisson equation with homogeneous boundary conditions, ∇²φ(x,y) = -ρ(x,y)/ε0 with φ = 0 on the boundary, where ρ(x,y) is the electron charge density. The electric field can then be retrieved from the potential as E = -∇φ. This problem can be handled numerically using the following strategy…

Electron space charge - PyECLOUD In PyECLOUD we have implemented a 2D Particle in Cell algorithm. 1)The electron charge density distribution ρ(x,y) is computed on a uniform square grid by distributing the charge of each electron to the nearest 4 nodes:
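A sketch of this charge deposition step ("cloud-in-cell" area weighting onto the 4 surrounding nodes); grid layout and names are illustrative:

```python
import numpy as np

def deposit_charge(xp, yp, nel_mp, nx, ny, x0, y0, dh):
    """Distribute the charge of each MP (nel_mp electrons) to the 4 nearest nodes
    of a uniform square grid with spacing dh and lower-left corner (x0, y0)."""
    rho = np.zeros((nx, ny))
    fx = (xp - x0) / dh
    fy = (yp - y0) / dh
    ix, iy = fx.astype(int), fy.astype(int)
    wx, wy = fx - ix, fy - iy
    np.add.at(rho, (ix,     iy),     nel_mp * (1 - wx) * (1 - wy))
    np.add.at(rho, (ix + 1, iy),     nel_mp * wx * (1 - wy))
    np.add.at(rho, (ix,     iy + 1), nel_mp * (1 - wx) * wy)
    np.add.at(rho, (ix + 1, iy + 1), nel_mp * wx * wy)
    return rho / dh**2   # electrons per unit area (multiply by -e for charge density)
```

The same four weights can later be reused to interpolate the field from the grid back to the electron positions (step 4 below).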

Electron space charge - PyECLOUD
In PyECLOUD we have implemented a 2D Particle in Cell algorithm.
2) The finite difference method is employed to solve the Poisson equation. The resulting sparse matrix depends only on the geometry, so it can be computed in the initialization stage and reused for all the space-charge evaluations.
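A minimal sketch of how such a pre-computed sparse operator could be built and reused with SciPy, assuming a boolean mask `inside` that flags the grid nodes lying inside the chamber (not the actual PyECLOUD implementation):

```python
import numpy as np
import scipy.sparse as sps
import scipy.sparse.linalg as spl

EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]

def build_poisson_solver(inside, dh):
    """5-point finite-difference Laplacian on the nodes flagged by `inside`
    (phi = 0 on all other nodes), factorized once at initialization."""
    nx, ny = inside.shape
    idx = -np.ones(inside.shape, dtype=int)
    idx[inside] = np.arange(inside.sum())
    rows, cols, vals = [], [], []
    for i in range(1, nx - 1):
        for j in range(1, ny - 1):
            if not inside[i, j]:
                continue
            k = idx[i, j]
            rows.append(k); cols.append(k); vals.append(-4.0)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                if inside[i + di, j + dj]:
                    rows.append(k); cols.append(idx[i + di, j + dj]); vals.append(1.0)
    n = int(inside.sum())
    A = sps.csc_matrix((vals, (rows, cols)), shape=(n, n)) / dh**2
    return spl.factorized(A)          # callable solving A x = b

# Usage sketch (rho from the deposition step):
#   solve = build_poisson_solver(inside, dh)   # done once, at initialization
#   phi = np.zeros_like(rho); phi[inside] = solve(-rho[inside] / EPS0)
```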

Electron space charge - PyECLOUD In PyECLOUD we have implemented a 2D Particle in Cell algorithm. 3)The electric field on the same grid is obtained using central difference equations:
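For completeness, the central-difference step could look like this (interior nodes only, E = -grad φ); a simple illustration rather than the actual PyECLOUD code:

```python
import numpy as np

def field_from_potential(phi, dh):
    """Electric field on the grid from the potential, E = -grad(phi),
    using central differences with grid spacing dh (interior nodes only)."""
    Ex = np.zeros_like(phi)
    Ey = np.zeros_like(phi)
    Ex[1:-1, :] = -(phi[2:, :] - phi[:-2, :]) / (2.0 * dh)
    Ey[:, 1:-1] = -(phi[:, 2:] - phi[:, :-2]) / (2.0 * dh)
    return Ex, Ey
```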

Electron space charge - PyECLOUD
In PyECLOUD we have implemented a 2D Particle in Cell algorithm.
4) The electric field at each electron location is computed using a linear (4-point) interpolation.

Electron space charge - PyECLOUD
In PyECLOUD we have implemented a 2D Particle in Cell algorithm. This eliminates the convergence problems related to the image-charge approach and also removes the restriction to geometries for which an image-charge expansion exists.

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

Convergence study
To get an idea of the impact of the new features introduced in PyECLOUD, we have studied the convergence properties of both codes as a function of the time step. The most relevant parameters of the considered test case are summarized below:
Geometry of an MBB bending magnet with its average beta functions (33.85 m and 71.87 m)
Energy E = 26 GeV, assuming SEYmax = 1.5
r.m.s. bunch length σz = 0.2 m, bunch spacing bs = 25 ns, normalized emittance εn = 3 μm
Filling pattern: 72 bunches, 8 empty buckets (25 ns buckets)

Convergence study – Number of electrons (ECLOUD)

Convergence study – Number of electrons (ECLOUD vs. PyECLOUD)

Convergence study – Heat load (ECLOUD vs. PyECLOUD)

Convergence study – Center electron density, with zoom (ECLOUD vs. PyECLOUD)

Convergence study – Center electron density (ECLOUD vs. PyECLOUD)

Convergence study – Scrubbing electrons (ECLOUD vs. PyECLOUD)

Convergence study – Electron distribution (ECLOUD vs. PyECLOUD)

Processing time

Timestep   | ECLOUD     | PyECLOUD
0.2 ns     | 29 min     | 12 min
0.1 ns     | 1h 27 min  | 13 min
0.05 ns    | 1h 45 min  | 24 min
0.025 ns   | 3h 7 min   | 40 min
0.012 ns   | 4h 15 min  | 1h 6 min

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

PyECLOUD at work
SPS: more than one turn – 5 h computing time. For details see the presentation at the SPSU meeting, 13 October 2011.

PyECLOUD at work
Heat load benchmarking of the last 25 ns scrubbing run (24 October). Simulations with measured bunch intensity (FBCT) and bunch length (BQM) – 9 h computing time.

PyECLOUD at work
Build-up at bunch passage 40. For details see the presentation at the SPSU meeting, 13 October 2011.

PyECLOUD at work
Electron wave: LHC 7 TeV, PS 26 GeV.

Outline Why a new code for electron cloud build-up simulation? From ECLOUD to PyECLOUD o Management of macroparticle size and number o Back-tracking algorithm for the impacting electrons o Beam field calculation o Space charge field Preliminary convergence study PyECLOUD at work Conclusion and future work

Conclusion and future developments
We have implemented an evolution of the ECLOUD code for the simulation of the electron cloud build-up. Several improvements have been introduced, with a significant impact on convergence properties and processing time.
Future developments of PyECLOUD may include:
Further convergence studies (LHC top energy)
Generalization of the particle backtracking algorithm to non-elliptical chambers (e.g. SPS/LHC bending magnets)
Generalization of the dynamics equations to an arbitrary magnetic field map (e.g. for the simulation of the electron cloud in quadrupoles or in PS combined-function magnets)

Thanks for your attention!

Slope in the build-up / Number of electrons in "deep" saturation

Ingredients for e-cloud build-up simulation – simulation loop (t = t + Δt at each step):
Generate seed e-
Evaluate the electric field of beam and space charge at each e- location
Solve the dynamics equations for this time step
Detect impacts and generate secondaries
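A schematic version of this loop in code (all helpers are placeholders standing for the modules described in this talk; this is an outline of the flow, not the actual PyECLOUD main loop):

```python
def buildup_simulation(n_steps, dt, state, generate_seed_electrons, evaluate_fields,
                       push_electrons, handle_wall_impacts, manage_macroparticles):
    """Schematic e-cloud build-up loop following the flow chart above."""
    t = 0.0
    for _ in range(n_steps):
        generate_seed_electrons(state, t, dt)      # gas ionization / photoemission
        E_beam, E_sc = evaluate_fields(state, t)   # beam field map + PIC space charge
        push_electrons(state, E_beam, E_sc, dt)    # equations of motion (incl. B field)
        handle_wall_impacts(state)                 # backtracking + secondary emission
        manage_macroparticles(state)               # clean / regeneration based on N_ref
        t += dt
    return state
```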