3D solvers for impedance simulations

3D solvers for impedance simulations Benoit and Kyrre

Agenda
- Description of the physics and the modeling of the code
  - What effects are included and what are not?
  - What is the impact of the simulation tool for CERN studies?
- Code implementation
  - Programming language(s)
  - Programming style (object oriented, procedural, …)
  - Prerequisites to run the code (OS, compilers, libraries, other codes)
  - Parallelisation technology (if any)
- Example of typical application (use case)
  - How many simulations for one study
  - Where the code is run (lxplus, lxbatch, other clusters)
  - Computing time
- Performance
  - Is the performance in general adequate to the present needs?
- Available documentation
- Licensing policy
  - Open source?
- Future plans and needs
  - Maintenance, extension and further development
  - Include more physics to better model cases of interest for CERN?
  - Performance improvement?
  - Resource estimation for maintenance/development over next years
  - What type of hardware resources would be best suited for the physics case?

Codes
- CST (~10 users in ABP)
- ABCI (1 or 2 users)
- ACE3P (1 superuser and 3 beginners)
- GdfidL (0 users so far)
- [ANSYS HFSS]
- [COMSOL]

Description of the physics and the modeling of the code: CST
- Goal: simulate the impedance and wakefields of accelerator devices
  - for direct observation (comparison with existing models and known limits)
  - for use in beam dynamics tools (Sacherer formula, MOSES, DELPHI, HEADTAIL, ELEGANT, etc.)
  - → critical for CERN studies for many years, to determine intensity limitations for the current machines (PSB, PS, SPS, LHC), projects (LIU, HL-LHC, CLIC) and studies (e.g. FCC-ee and FCC-hh)
- Method: 3D codes cut the simulation domain into mesh cells, in which Maxwell's equations are solved (a 1D sketch of this idea is shown below).
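Not part of the original slides: a minimal Python sketch, under simplifying assumptions, of what "mesh cells in which Maxwell's equations are solved" means in practice, here a 1D Yee-type leapfrog (FDTD) update with normalised fields. The actual solvers do the same on full 3D hexahedral or tetrahedral meshes, with boundary conditions, material models and a relativistic beam as the source; all numbers below are arbitrary examples.

    import numpy as np

    c = 299792458.0
    nz, nt = 400, 800
    dz = 1e-3                    # 1 mm mesh cells (arbitrary)
    dt = 0.99 * dz / c           # time step just below the Courant limit
    S = c * dt / dz              # Courant number
    ex = np.zeros(nz)            # E-field samples on the grid points
    hy = np.zeros(nz - 1)        # normalised H-field samples on the staggered half-cells

    for n in range(nt):
        hy += S * (ex[1:] - ex[:-1])                      # discrete Faraday's law (half step)
        ex[1:-1] += S * (hy[1:] - hy[:-1])                # discrete Ampere's law (other half step)
        ex[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)    # soft Gaussian excitation; ends act as PEC walls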

Code implementation
- Programming language(s): the source code is not easily accessible; some VBA for the post-processing
- Programming style (object oriented, procedural, …): not applicable
- Prerequisites to run the code (OS, compilers, libraries, other codes): runs well on Windows 7; also exists for Linux, but apparently less optimized
- Parallelisation technology (if any): yes, but it requires an additional license

Example of typical application (use case)  A device is being considered for installation  requires validation of its impact on performance (now done by impedance WG)  the model is designed in CATIA/SMARTEAM by e.g EN-MME  ST number is sent to us  we access the ST database, save locally the CATIA file, suppress the many unnecessary items  we import the model into CST, simplify it further, define frequency range, materials, and typically run: the wakefield solver, to get the longitudinal and transverse impedances and/or wakes (dip and quad) as a function of frequency the eigenmode solver to assess which modes can resonate in the structure run some parameter scans to check convergence and consistency (outgoing beam pipe length, mesh size, wake length) computing time can range from minutes to days depending on the mesh size to geometry aspect ratio In case there is a need to mitigate issues, a campaign of simulations is launched to check the sensitivity to geometry and material parameters. This can be very long.

Where is the code run (lxplus, lxbatch, other clusters)
There are currently the following possibilities for our team:
- run on a local PC or server
- run on servers managed by IT and fully or partly paid by ABP
- run on HPC remote servers
- run on CST servers (need to pay)

Performance
Is the performance in general adequate to the present needs? Yes and no.
- Yes: for "simple", reasonably small structures in the LHC complex, since the average beam pipe cut-off for resonant modes is usually well above the frequency range excited by the beam.
- No: it is really (1) a pain and (2) very unreliable for studies that require a short bunch length (e.g. FCC-ee, CLIC damping rings and light sources); see the rough scaling after this slide.
  - Some shapes are difficult to mesh for the wakefield solver (wires, shallow tapers).
  - Some materials (lossy, coatings, non-linear) hit the limits of the code.
  - → The problem is not just the computing power, but also the noise levels.
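A rough back-of-the-envelope illustration of the short-bunch problem (my own example, not from the slides): a Gaussian bunch of rms length sigma_z mainly excites frequencies up to roughly c/(2*pi*sigma_z), and the mesh has to resolve the corresponding wavelength, so shorter bunches mean many more and smaller mesh cells. The bunch lengths and the cells-per-wavelength rule of thumb below are illustrative assumptions.

    import numpy as np

    c = 299792458.0
    for name, sigma_z in [("LHC-type bunch", 0.075), ("few-mm bunch (FCC-ee-like)", 0.003)]:
        f_max = c / (2 * np.pi * sigma_z)     # rough extent of the excited frequency range
        cell = (c / f_max) / 15.0             # ~15 cells per shortest wavelength, as an example
        print(f"{name}: f_max ~ {f_max / 1e9:.1f} GHz, cell size ~ {cell * 1e3:.1f} mm")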

Available documentation
- Very useful embedded help and documentation
- Very responsive helpdesk if triggered adequately

Licensing policy
- The opposite of open source.
- A certain number of expensive licenses are available inside CERN through IT. Like BMW, every option costs more.
- ABP and RF chipped in at some point to get more licenses.
- There used to be problems with licenses, but actions were taken by IT to mitigate this.
- It is forbidden to remotely use CERN CST licenses from other labs.

Future plans and needs
- No cost for maintenance (obviously).
- T. Weiland has left; CST is now part of Dassault Systèmes.
  - → Difficult to predict what will happen in the near future (maintenance? improvements?).
  - → Clearly, accelerators are a very small business compared to their other customers.
- There has been slow but steady improvement in the performance of the code since 2006 (bug corrections, lossy materials in eigenmode, etc.). Next steps include the possibility to use the tetrahedral (TET) mesh in the wakefield solver.
- There are alternatives (GdfidL, ACE3P, ABCI for wakefields, and in particular HFSS for eigenmode), but they are not as convenient as CST for the bulk of the cases.

What type of hardware resources would be best suited for the physics case?
- A combination of HPC and local servers is very efficient for now.
- GPU acceleration is not yet available for the wakefield solver.

Codes: CST, ABCI, ACE3P, GdfidL, [ANSYS HFSS], [COMSOL]

Description of the physics and the modeling of the code: ABCI
- Goal: simulate the impedance and wakefields of accelerator devices
  - for direct observation (comparison with existing models and known limits)
  - for use in beam dynamics tools (Sacherer formula, MOSES, DELPHI, HEADTAIL, ELEGANT, etc.)
  - → critical for CERN studies for many years, to determine intensity limitations for the current machines (PSB, PS, SPS, LHC), projects (LIU, HL-LHC, CLIC) and studies (e.g. FCC-ee and FCC-hh)
  - → can only be used for axisymmetric structures
- Method: 2D codes cut the simulation domain into mesh cells, in which Maxwell's equations are solved in the time domain.

Code implementation
- Code developed by Y.H. Chin (KEK/J-PARC, Japan). CERN-SL helped with the computing optimization (H. Grote).
- Programming language(s): Fortran
- Programming style (object oriented, procedural, …): not applicable, as the code is not open source
- Prerequisites to run the code (OS, compilers, libraries, other codes): runs well on Windows 7; also exists for Linux
- Parallelisation technology (if any): yes, OpenMP for shared-memory computers. "The MPI version of ABCI is also under consideration." (sic)

Example of typical application (use case)  A rotationally symmetric device is being considered for installation We get the coordinates of the vacuum chamber and type them into the input file computing time is typically minutes for our case.

Where is the code run (lxplus, lxbatch, other clusters)
- I run it on my local PC. It crashes on local servers so far.
- It is not clear that it is needed though.

Is the performance in general adequate to the present needs?
- Yes. It is a good complement to CST and other 3D codes, as it is very fast even for small mesh sizes thanks to the 2D symmetry.
- However, only perfect conductors can be used.

Available documentation
- http://abci.kek.jp/abci.htm
- The manual is embedded in the installation package.

Licensing policy
- "the ABCI programs [is] free to use, but it is not Open Source. If you find someone who want to use it, just tell him the above URL to download the package."

Future plans and needs
- No cost for maintenance (obviously).
- Apparently no update since 2009, but some papers suggest that other types of materials can also be used, which is not obvious from the manual. To be followed up.

Codes: CST, ABCI, ACE3P (done by Kyrre), GdfidL, [ANSYS HFSS], [COMSOL]

Description of the physics and the modeling of the code: ACE3P
What effects are included and what are not?
- Eigenmode and S-parameters, including damping (ports, lossy materials [volume or surface]) and periodic solutions (a rough use of such eigenmode output is sketched after this slide).
- Time domain: driven by beam or port; post-processing for the longitudinal and transverse wakefields; moving mesh window for long structures; extraction of signals on ports; damping through ports.
- Tracking/PIC: calculate the field using the eigenmode or S-parameter solver, then solve the particle motion in that field (for multipacting and dark-current simulations). Can also include the fields from the tracked particles and solve the particle motion self-consistently (for gun or klystron design).
- Multiphysics: Lorentz detuning, detuning from heat, mechanical eigenmodes.
What is the impact of the simulation tool for CERN studies?
- Ability to solve large and complex systems, such as the RF fingers. Still requires benchmarking.
- Meshing is always done with Trelis/Cubit, which produces a tetrahedral mesh.
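One common use of the eigenmode output, sketched here with invented numbers (not stated on the slide): a mode's frequency and R/Q can be turned into a rough beam-induced heating estimate in the simple limit where each bunch loses energy incoherently (no resonant build-up between bunches, so the mode's Q does not enter), using the R/Q = V^2/(omega*U) convention.

    import math

    c = 299792458.0
    f_r, RoQ = 1.2e9, 5.0                # mode frequency (Hz) and R/Q (Ohm), invented values
    sigma_z = 0.075                      # rms bunch length (m)
    q_b = 1.15e11 * 1.602e-19            # bunch charge (C)
    M, f_rev = 2748, 11245.0             # number of bunches and revolution frequency (Hz)

    omega = 2 * math.pi * f_r
    k_loss = omega * RoQ / 4 * math.exp(-(omega * sigma_z / c) ** 2)   # loss factor per bunch (V/C)
    P = k_loss * q_b ** 2 * M * f_rev                                  # incoherent power estimate (W)
    print(f"k_loss = {k_loss:.3e} V/C, P ~ {P:.2f} W")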

Code implementation
- Code developed and maintained by SLAC.
- Programming language(s): C++ (uses libraries written in various languages)
- Programming style (object oriented, procedural, …): object oriented
- Prerequisites to run the code (OS, compilers, libraries, other codes): Linux, MPI, various open source libraries
- Parallelisation technology (if any): MPI

Example of typical application (use case)
- How many simulations: depends on the study; for geometry optimisation it can be ~100s-1000s (depends on the number of parameters); for wakefields, a few simulations × the number of geometry variations.
- Computing time: (a few minutes to 1 day) × (1 to 10) nodes.
- Where: nersc.gov (2 large servers available), but no control over priority.

Is the performance in general adequate to the present needs?
- Inconvenient setup of simulations (several different tools needed successively), long learning curve.
- Generally fast enough; access to a large cluster → possible to run many jobs in parallel. Throughput can sometimes be lowered due to reduced capacity (Cori upgrade this autumn).
- Benchmark ongoing for transverse impedance.

Available documentation
- https://confluence.slac.stanford.edu/display/AdvComp/Materials+for+CW16

Licensing policy
- Need for a license for the mesher (Trelis) → 1 license bought by ABP.
- Need for a collaboration agreement with SLAC → 25 kCHF per year (partly paid by ABP). This includes support and access to the NERSC servers. This contribution should be reviewed depending on the outcome of the benchmarks and the future needs.

Future plans and needs
- Need to get confident with this code, as it seems to be the only available tool that can treat complicated structures and very short bunch lengths (e.g. FCC, RF fingers).
- Cannot be used for collimators (only for devices that allow the "indirect testbeam integration method").
- Cannot be used for dispersive materials.

Codes: CST, ABCI, ACE3P (done by Kyrre), GdfidL, [ANSYS HFSS], [COMSOL]

Description of the physics and the modeling of the code: GdfidL
- What effects are included and what are not?
  - Dispersive materials are not included; lossy metals benchmarked by Oscar; no tetrahedral mesh, but a moving hexahedral mesh.
- What is the impact of the simulation tool for CERN studies?
  - Not much used in ABP (only Eleonora is trying it); mostly used in RF and in other labs.

Code implementation
- Code developed and maintained by W. Bruns.
- Programming language(s): ?
- Programming style (object oriented, procedural, …): Green-orange
- Prerequisites to run the code (OS, compilers, libraries, other codes): at CERN, only on lxplus and only on lxclic
- Parallelisation technology (if any): yes, it is efficiently parallelized

Example of typical application (use case)
- How many simulations: same as CST

Is the performance in general adequate to the present needs?
- It was useful for light sources (Simon White), as brute force can be used with a large number of mesh cells.
- However, many issues are still pending, and a benchmark is needed.

Available documentation
- http://www.gdfidl.de/

Licensing policy
- Need to pay for a license, but CERN has a site license (not paid by ABP), ~10 kCHF per year.

Future plans and needs
- Need to get the code working and see if it can be used for FCC-ee.