Lessons Learned from Single-Pass Electron Cooling Simulations
MEIC Collaboration Meeting, 31 March 2015 – Jefferson Lab
David Bruhwiler


Lessons Learned from Single-Pass Electron Cooling Simulations

MEIC Collaboration Meeting
31 March 2015 – Jefferson Lab
David Bruhwiler

Preparation of this presentation was supported by RadiaSoft LLC. Previous work on electron cooling physics was supported by the US Department of Energy, Office of Science, Office of Nuclear Physics, through the SBIR program.

Redwood Ave., Boulder, Colorado USA

MEIC collaboration meeting – 31 March 2015 – Newport News # 2

RadiaSoft Vision

world class computational scientists and engineers
– near-term: beams, plasmas and radiation
– work with and contribute to community codes:
  – WARP (LBL, beams and plasmas, Linux)
  – Synergia (Fermilab, beams, Linux)
  – Elegant (ANL, beams, cross platform)
  – SRW (BNL, synchrotron radiation & X-ray optics, cross platform)
  – Genesis (PSI, free electron lasers, cross platform)

commitment to open source software
– scientific software should be open for inspection
– enables collaborative development with other scientists
  – eliminates expensive, time-consuming IP discussions

scientific cloud computing services
– the market is large & independent of any particular field
– accelerator technology can provide initial users
– success will bring significant value to our community

The RadiaSoft Team

RadiaSoft Projects

Nonlinear integrable optics (collaboration w/ Fermilab) – DOE/HEP
– Phase I highlights (Phase II begins April 6, 2015):
  – chromaticity doesn't break integrability, if the horizontal & vertical chromaticities are equal
  – space charge does break the integrability, unless it's perfectly linear
  – interesting dynamics in the statistical distribution of the two invariants
    – the distribution quickly evolves to a Vlasov quasi-equilibrium
    – these equilibria appear to evolve over longer time scales
  – added magnetic fields from Danilov and Nagaitsev to Synergia

Software for X-ray optics (collaboration w/ BNL) – DOE/BES
– Phase I highlights (Phase II begins April 6, 2015):
  – improvements to the SRW code by O. Chubar
  – IPython and JavaScript interfaces to SRW
  – containerization of SRW; security for IPython as a web service

AFOSR Young Investigator Award (Stephen Webb)
– development of novel, symplectic EM-PIC algorithms

Our Vision for Scientific Cloud Computing

Containerized computing – put your simulation in a box
– remove the pain of installing scientific codes
  – we install codes in a Linux container with all dependencies & tools
  – compilation and development can be restricted to one flavor of Linux
  – you can run the code directly on your PC or Mac, or in the cloud
– open source technologies: Docker, Vagrant, virtual machines (VMs)
  – Docker and Vagrant enable rapid distribution of apps to the cloud
  – headless VMs enable execution on Mac and Windows with low overhead
– archive your simulation in the cloud, then grab it weeks later
  – user input files, output files, etc. are saved in the container
  – version control can be used to recover a previous state of the files
  – share the container with a collaborator, students, etc.

The browser is the UI – never install software again
– near term: implement X in the browser
  – many codes run on Linux, with X-based post-processing and graphics
  – we will enable immediate capture of existing code capabilities in the cloud
  – existing open source projects are close to having this working
– in the longer term, anything is possible with JavaScript

Seamless legacy – you won't realize you're in the cloud
– option to run codes locally (e.g. laptop on an airplane)
– the switch from local to cloud computing will be seamless
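As a concrete, hypothetical illustration of the container workflow sketched above — the image name, paths, and input-deck name below are invented for the example, not taken from the talk — a simulation run can be wrapped so the host's input and output files are mounted into the container and the exact code environment travels with the run:

```python
import shlex

def containerized_run(image, workdir, command):
    """Compose a `docker run` invocation that mounts a host working
    directory into the container, so input/output files persist on the
    host and the run can be repeated or shared later. Sketch only:
    the image and paths passed in are hypothetical."""
    argv = [
        "docker", "run", "--rm",     # remove the container when the run ends
        "-v", f"{workdir}:/sim",     # host files visible inside the container
        "-w", "/sim",                # execute from the mounted directory
        image,
    ] + shlex.split(command)
    return argv

# Hypothetical example: run an input deck inside a containerized code.
cmd = containerized_run("radiasoft/warp:latest", "/home/user/run42",
                        "python input_deck.py")
print(" ".join(cmd))
```

The same composed command works on a laptop or a cloud node, which is the point of the "seamless legacy" goal: only the host name changes, not the run recipe.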

Dynamical friction in relativistic electron coolers
Lessons learned over 8 years of work

Diffusive kicks must be suppressed for single-pass studies
– friction wins over millions of turns, but not in a single pass
– quiet start with correlated electron-positron pairs
– use more macroparticles than there are physical electrons

Semi-analytic models are required for design studies
– using codes like BETACOOL
– single-pass results help to improve these models

Relativistic cooling is very different from the standard regime
– friction force scales like 1/γ² (Lorentz contraction, time dilation)
– time dilation (& other considerations) → short interaction times
  – violates the assumptions of introductory beam & plasma texts
  – often, our physical intuition can be wrong

D.L. Bruhwiler, "Simulating single-pass dynamics for relativistic electron cooling," in ICFA Beam Dynamics Newsletter 65, Beam Cooling II.
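The quiet-start idea above can be illustrated with a toy calculation (dimensionless units; all numbers are arbitrary illustrative choices, not parameters from the talk). For an electron-positron pair launched from the same phase-space point, the first-order Coulomb kicks on the ion cancel exactly at t = 0, so the diffusive contribution is suppressed; the friction-like effect is second order and survives, because the attracted and repelled trajectories diverge:

```python
import numpy as np

EPS = 0.05  # softening length, keeps the integrator well-behaved near r = 0

def net_impulse_on_ion(signs, r0, v0, dt=1e-3, steps=6000):
    """Integrate test particles (unit mass, charge sign s) past a fixed
    unit-charge ion at the origin and return the net impulse delivered to
    the ion (= minus the total momentum change of the test particles)."""
    impulse = np.zeros(2)
    for s in signs:                          # s = -1: electron, s = +1: positron
        r = np.array(r0, dtype=float)
        v = np.array(v0, dtype=float)
        v_initial = v.copy()
        for _ in range(steps):
            # softened Coulomb acceleration; attractive for s = -1
            a = s * r / (r @ r + EPS**2) ** 1.5
            v += a * dt                      # semi-implicit (Euler-Cromer) step
            r += v * dt
        impulse -= v - v_initial             # Newton's third law, summed over pair
    return impulse

r0, v0 = [-3.0, 1.0], [1.0, 0.0]             # impact parameter 1, speed 1

# First-order kicks cancel exactly: net force on the ion at t = 0 is zero.
f0 = sum(s * np.array(r0) / np.linalg.norm(r0) ** 3 for s in (-1, +1))

# The second-order (friction-like) effect survives: the pair still
# delivers a finite net impulse over the fly-by.
J = net_impulse_on_ion((-1, +1), r0, v0)
print(np.linalg.norm(f0), np.linalg.norm(J))
```

This is the mechanism behind the bullet above: pairing macroparticles removes the first-order stochastic kicks that would otherwise swamp the much smaller friction signal in a single pass.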

MEIC collaboration meeting – 31 March 2015 – Newport News # 7 Ya. S. Derbenev and A.N. Skrinsky, “The Effect of an Accompanying Magnetic Field on Electron Cooling,” Part. Accel. 8 (1978), 235. Ya. S. Derbenev and A.N. Skrinskii, “Magnetization effects in electron cooling,” Fiz. Plazmy 4 (1978), p. 492; Sov. J. Plasma Phys. 4 (1978), 273. I. Meshkov, “Electron Cooling; Status and Perspectives,” Phys. Part. Nucl. 25 (1994), 631. V.V. Parkhomchuk, “New insights in the theory of electron cooling,” Nucl. Instr. Meth. in Phys. Res. A 441 (2000), p. 9. As of 2000, there were competing models

VORPAL modeling of binary collisions clarified differences in formulae for magnetized friction

A.V. Fedotov, D.L. Bruhwiler, A.O. Sidorin, D.T. Abell, I. Ben-Zvi, R. Busby, J.R. Cary and V.N. Litvinenko, "Numerical study of the magnetized friction force," Phys. Rev. ST Accel. Beams 9 (2006).

[Figure legend: green – Parkhomchuk; blue – D&S integral; gray – D&S asymptotics; pink – VORPAL, cold e-; blue – VORPAL, warm e-]

– The D&S algorithm is accurate for cold electrons – not for warm.
– The Parkhomchuk formula is approximately correct for typical parameters, but not always.
– 3D quadrature of the standard formula with modified ρ_min is better for an idealized solenoidal field.
– In general, direct simulation is required.
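For design-level estimates of the kind BETACOOL performs, a Parkhomchuk-style parametric force is often evaluated directly. A minimal sketch in SI units follows; the parameter values (delta_eff, B, v_perp, rho_max, n_e) are illustrative assumptions, and the overall numeric prefactor uses the Chandrasekhar-style 4π convention — conventions differ between references, so Parkhomchuk's paper should be consulted for quantitative work:

```python
import numpy as np

# Physical constants (SI)
E_CHARGE = 1.602176634e-19    # C
M_E = 9.1093837015e-31        # kg
EPS0 = 8.8541878128e-12       # F/m

def parkhomchuk_force(v, n_e, Z=79, delta_eff=1.0e5, B=1.0, v_perp=1.0e5,
                      rho_max=1.0e-3):
    """Longitudinal friction force (N) on an ion of charge Z moving with
    velocity v (m/s) through electrons of density n_e (1/m^3), in a
    Parkhomchuk-style model: F ~ -v / (v^2 + delta_eff^2)^{3/2} with a
    magnetized Coulomb logarithm.  All default parameters are assumptions."""
    k = Z * E_CHARGE**2 / (4 * np.pi * EPS0 * M_E)   # coupling, m^3/s^2
    rho_min = k / (v**2 + delta_eff**2)              # closest-approach scale
    rho_larmor = M_E * v_perp / (E_CHARGE * B)       # electron Larmor radius
    log_p = np.log((rho_max + rho_min + rho_larmor) /
                   (rho_min + rho_larmor))           # magnetized Coulomb log
    # Chandrasekhar-style 4*pi prefactor; see note in the lead-in.
    return (-4 * np.pi * n_e * M_E * k**2 * log_p
            * v / (v**2 + delta_eff**2) ** 1.5)

v = np.linspace(1e3, 1e6, 200)
F = parkhomchuk_force(v, n_e=1.0e15)
# The force opposes the velocity everywhere, rises ~linearly for
# v << delta_eff, and falls off ~1/v^2 for v >> delta_eff.
```

The shape of this curve, rather than its overall normalization, is what the VORPAL binary-collision comparisons on this slide probe.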

Replacing the SC solenoid with a conventional wiggler offers lower cost & technical risk

Why look for alternatives to the solenoid design?
– solenoid design & beam requirements for RHIC are challenging: ~80 m, 5 T, superconducting, field errors < 10^-5

Advantages of a wiggler
– like a solenoid, it provides focusing & suppresses recombination
  – modest fields (~10 Gauss) effectively reduce recombination via 'wiggle' motion of the electrons
– the e- bunch is easier: less charge, and un-magnetized
– lower construction costs; less technical risk

What's the effect of 'wiggle' motion on cooling?
– independent suggestion of V. Litvinenko & Ya. Derbenev
– increases ρ_min of the Coulomb logarithm
– strong need for simulations
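The cost of raising ρ_min can be seen with back-of-envelope numbers (the scales below are illustrative assumptions, not parameters from the talk): because ρ_min enters only inside a logarithm, even a large increase of the cutoff to the wiggle amplitude shrinks the Coulomb logarithm, and hence the friction force, only modestly:

```python
import math

# Illustrative scales (assumptions, not values from the talk), in meters:
rho_max = 1.0e-3        # outer cutoff, e.g. set by beam size / interaction time
rho_min_bare = 1.0e-7   # close-collision cutoff without the wiggler
r_wiggle = 1.0e-5       # amplitude of the 'wiggle' motion in a modest field

log_bare = math.log(rho_max / rho_min_bare)   # Coulomb log without wiggler
log_wiggler = math.log(rho_max / r_wiggle)    # rho_min -> wiggle amplitude

print(log_bare, log_wiggler, log_wiggler / log_bare)
# With these numbers, a 100x increase in rho_min cuts the Coulomb log
# (and so the friction force) only by about a factor of 2.
```

This is the arithmetic behind the "reduced only logarithmically" conclusion on the following slide.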

VORPAL simulations support the decision to use a conventional wiggler for e- cooling of 100 GeV/n Au+79

Culmination of years of work, beginning in 2002:
– A.V. Fedotov, D.L. Bruhwiler, A. Sidorin, D. Abell, I. Ben-Zvi, R. Busby, J.R. Cary, and V.N. Litvinenko, "Numerical study of the magnetized friction force," Phys. Rev. ST Accel. Beams 9 (2006).
– A.V. Fedotov, I. Ben-Zvi, D.L. Bruhwiler, V.N. Litvinenko and A.O. Sidorin, "High-energy electron cooling in a collider," New J. Phys. 8 (2006).
– G.I. Bell, D.L. Bruhwiler, A. Fedotov, A.V. Sobol, R. Busby, P. Stoltz, D.T. Abell, P. Messmer, I. Ben-Zvi and V.N. Litvinenko, "Simulating the dynamical friction force on ions due to a briefly co-propagating electron beam," J. Comput. Phys. 227 (2008), 8714.

A conventional wiggler could replace the expensive solenoid:
– the friction force is reduced only logarithmically

Acknowledgments

Preparation of this talk was supported by RadiaSoft LLC. Much of the work discussed here was supported by the US Department of Energy, Office of Science, Office of Nuclear Physics, in part through the SBIR program.

I gratefully acknowledge the contributions made by all my coauthors, with special thanks to the following scientists for key contributions: George Bell, Ilan Ben-Zvi, Michael Blaskiewicz, Alexey Burov, Yaroslav Derbenev, Alexei Fedotov, Vladimir Litvinenko, Sergei Nagaitsev, Gregg Penn, Ilya Pogorelov, Sven Reiche, Brian Schwartz, Andrey Sobol, Peter Stoltz, Gang Wang.

Most of the simulations discussed here were conducted using the parallel VORPAL framework (now known as VSim) [1,2,3], including the implementation of several algorithms (Hermite, BCC, δf-PIC, Vlasov/Poisson), and I acknowledge the important contributions of John Cary and all members of the VORPAL development team at Tech-X Corp.

[1] C. Nieter and J.R. Cary, "VORPAL: a versatile plasma simulation code," J. Comput. Phys. 196, 448 (2004).
[2] G.I. Bell, D.L. Bruhwiler, A. Fedotov, A. Sobol, R.S. Busby, P. Stoltz, D.T. Abell, P. Messmer, I. Ben-Zvi and V. Litvinenko, "Simulating the dynamical friction force on ions due to a briefly co-propagating electron beam," J. Comput. Phys. 227, 8714 (2008).
[3] The VSim website.