Recent advances in modeling advanced accelerators:

Presentation transcript:

Recent advances in modeling advanced accelerators: plasma based acceleration and e-clouds. W. B. Mori, C. Huang, W. Lu, M. Zhou, M. Tzoufras, F. S. Tsung, V. K. Decyk (UCLA); D. Bruhwiler, J. Cary, P. Messmer, D. A. Dimitrov, C. Nieter (Tech-X); T. Katsouleas, S. Deng, A. Ghalam (USC); E. Esarey, C. Geddes (LBNL); J. H. Cooley, T. M. Antonsen (U. Maryland)

Accelerators!

Particle Accelerators: Why Plasmas?
Conventional accelerators: limited by peak power and breakdown, 20-100 MeV/m.
Plasma: no breakdown limit, 10-100 GeV/m.
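The GeV/m figure follows directly from the plasma density. A quick check using the cold, nonrelativistic wave-breaking field E0 = m_e c w_p / e, which reduces to the engineering form E0[V/m] ~ 96 sqrt(n0[cm^-3]) (the function name here is ours, purely illustrative):

```python
import math

def wave_breaking_field(n0_cm3):
    """Cold wave-breaking field E0 = m_e * c * w_p / e, via the
    engineering form E0[V/m] ~= 96 * sqrt(n0[cm^-3])."""
    return 96.0 * math.sqrt(n0_cm3)

# A plasma at n0 = 1e18 cm^-3 supports ~96 GV/m, versus the
# 20-100 MeV/m breakdown limit of conventional structures.
gradient = wave_breaking_field(1e18)   # ~9.6e10 V/m
```

At typical accelerator-experiment densities of 10^17-10^19 cm^-3 this gives tens to hundreds of GV/m, three orders of magnitude above conventional structures.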

Concepts For Plasma Based Accelerators
Plasma Wake Field Accelerator (PWFA): driven by a high-energy electron bunch.
Laser Wake Field Accelerator (LWFA, SMLWFA, PBWA): driven by a single short pulse of photons.
Common ingredients: drive beam, trailing beam, wake excitation, evolution of the driver and wake, loading the wake with particles.
The physics necessitates particle-based methods: many length and time scales for fields and particles--a grand challenge!

Wake excitation is nonlinear: trajectory crossing occurs for both beam and laser drivers (Rosenzweig et al. 1990; Pukhov and Meyer-ter-Vehn 2002). The resulting ion column provides ideal accelerating and focusing forces.

Plasma Accelerator Progress and the "Accelerator Moore's Law" (results from LOA, RAL, LBL, Osaka). Courtesy of Tom Katsouleas.

What Is a Fully Explicit Particle-in-Cell Code? Not all PIC codes are the same!
Computational cycle (at each step in time):
1. Weight the particles to the grid.
2. Solve Maxwell's equations for the fields.
3. Interpolate the fields to the particles.
4. The Lorentz force updates each particle's position and momentum.
Typical simulation parameters: ~10^8-10^9 particles, ~10-100 GB, ~10^5 time steps, ~10^4-10^5 CPU hours.
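The four-step cycle above can be sketched in a few lines. This is a toy 1D electrostatic step with periodic boundaries in normalized units (q/m = 1); all names are illustrative and it stands in for none of the production codes discussed here:

```python
import numpy as np

def pic_step(x, v, q, dx, ng, dt, L):
    """One explicit PIC cycle: deposit -> field solve -> interpolate -> push."""
    # 1) weight particles to the grid (linear / CIC deposition)
    g = x / dx
    i = np.floor(g).astype(int) % ng
    f = g - np.floor(g)
    rho = np.zeros(ng)
    np.add.at(rho, i, q * (1.0 - f) / dx)
    np.add.at(rho, (i + 1) % ng, q * f / dx)
    # 2) field solve: Poisson's equation via FFT; dropping the k = 0
    #    mode is equivalent to a neutralizing background charge
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = -1j * rho_k[1:] / k[1:]
    E = np.real(np.fft.ifft(E_k))
    # 3) interpolate the grid field back to the particles
    Ep = E[i] * (1.0 - f) + E[(i + 1) % ng] * f
    # 4) push: update momentum, then position (leapfrog), periodic wrap
    v = v + Ep * dt
    x = (x + v * dt) % L
    return x, v, E
```

A fully explicit electromagnetic code replaces step 2 with a finite-difference advance of Maxwell's equations, which is what ties the time step to the shortest (laser) scale and drives the CPU-hour counts quoted above.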

Advanced accelerators: Before SciDAC
5000+ node hours for each GeV of energy gain.
One 3D PIC code.

Accomplishments and highlights: Code development
Four independent high-fidelity particle-based codes:
OSIRIS: fully explicit PIC.
VORPAL: fully explicit PIC + ponderomotive guiding center.
QuickPIC: quasi-static PIC + ponderomotive guiding center.
UPIC: an FFT-based framework for rapid construction of new codes; QuickPIC is built on UPIC.
Each code or framework is fully parallelized, with load balancing and particle sorting. Each production code has ionization packages for added realism. Effort was made to make the codes scale to 1000+ processors.

OSIRIS: fully parallel PIC for plasma accelerators
Successfully applied to various LWFA and PWFA problems: Mangles et al., Nature 431, 535 (2004); Tsung et al., Phys. Rev. Lett. 93, 185002 (2004); Blue et al., Phys. Rev. Lett. 90, 214801 (2003).
Code features: moving window; parallelized using domain decomposition; two charge-conserving deposition schemes; current and field smoothing; field and impact ionization; static load balancing. Well tested.
Modern (object-oriented, Fortran 95 techniques); parallel (general domain decomposition) or serial; cross-platform (UNIX, Linux, AIX, OS X, MacMPIC); based on a well-proven Fortran 77 code; sophisticated 3D data diagnostics.
OSIRIS development team: UCLA (F. S. Tsung, J. W. Tonge), USC (S. Deng), IST (R. A. Fonseca and L. O. Silva), École Polytechnique (J. C. Adam), and RAL (R. G. Evans). See http://exodus.physics.ucla.edu/

VORPAL – parallel PIC and related algorithms for advanced accelerators
VORPAL scales well to 1000's of processors.
Successfully applied to various LWFA problems (colliding laser pulses, particle beams): Geddes et al., Nature 431, 538 (2004); Cary et al., Phys. Plasmas (2005), in press (invited).
Recently implemented algorithms: ponderomotive guiding center treatment of laser pulses; PML (perfectly matched layer) absorbing BCs; implicit 2nd-order and explicit 4th-order EM.
Many other capabilities/algorithms (only a sample here): impact and field ionization; secondary e- emission; fluid methods for plasmas; hybrid PIC/fluid.
Modern (object-oriented, C++ template techniques); parallel (general domain decomposition) or serial; cross-platform (Linux, AIX, OS X, Windows).
VORPAL development team: J. Cary (Tech-X/CU), C. Nieter, P. Messmer, D. Dimitrov, J. Carlson, D. Bruhwiler, P. Stoltz, R. Busby, W. Wang, N. Xiang (CU), P. Schoessow, R. Trines (RAL). See http://www.txcorp.com/technologies/VORPAL/
Highly leveraged via SBIR funds: DOE, AFOSR, OSD.

Code development: QuickPIC
Code features: based on UPIC, the parallel object-oriented plasma simulation framework; the underlying Fortran library is reliable and highly efficient; multi-platform (Mac OS 9/X, Linux/Unix); dynamic load balancing.
Model features: highly efficient quasi-static model for beam drivers; ponderomotive guiding center + envelope model for laser drivers; can be 100+ times faster than conventional PIC with no loss in accuracy; ADK model for field ionization.
Applications: simulations for the PWFA experiments E157/162/164/164X/167; study of the electron cloud effect in the LHC; plasma afterburner design.
Scalability: currently scales to ~32 processors; with pipelining it should scale to 10,000+ processors.

QuickPIC loop (schematic): a 2D plasma slab is swept through the 3D beam (laser or particles) to compute the 3D wake.

Quasi-static model including a laser driver
Maxwell's equations in the Lorentz gauge are reduced to quasi-static field equations, and the laser is advanced with an envelope equation.
Loop structure: initialize the beam; begin 3D loop; call the 2D routine (initialize the plasma; 2D loop over slices: field solver, push plasma particles, deposition, with iteration); push the beam particles; end 3D loop.
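The nested loop structure above can be sketched as follows. Everything here is a deliberate toy (the slab "state" is a single number and the helper functions are stand-ins), intended only to show the control flow, not QuickPIC's actual equations:

```python
def init_slab():
    # toy slab state: a single perturbation amplitude
    return 0.0

def push_slab(slab, beam, s):
    # toy plasma response to the beam at slice s
    return slab + beam

def field_solve(slab):
    # toy wake field from the slab perturbation
    return -slab

def push_beam(beam, wake):
    # toy 3D push with a large time step (here: no change)
    return beam

def quasi_static_run(beam, n_3d_steps, n_slices):
    """QuickPIC-like loop: for each large 3D step, a fresh 2D plasma
    slab is swept through the beam slice by slice to build the wake,
    and only then is the beam advanced."""
    wake_history = []
    for _ in range(n_3d_steps):        # 3D loop: beam evolution
        slab = init_slab()
        wake = []
        for s in range(n_slices):      # 2D loop: sweep slab through beam
            slab = push_slab(slab, beam, s)
            wake.append(field_solve(slab))
        beam = push_beam(beam, wake)
        wake_history.append(wake)
    return beam, wake_history
```

The speedup comes from the outer loop taking the large beam-evolution time step while only the cheap 2D inner loop resolves the plasma scale.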

Parallelization for QuickPIC
3D domain decomposition along the beam (nodes 0-3); 2D domain decomposition with dynamic load balancing.
Scales up to 16-32 CPUs for small problem sizes; network overhead dominates on the Dawson cluster (GigE); a 4x performance boost with InfiniBand hardware.
With pipelining it should scale to 10,000+ processors.
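Dynamic load balancing of the 2D slab amounts to cutting the grid where the particles are, not at equal areas. One simple way to do this is a greedy contiguous partition; this sketch is illustrative and is not claimed to be QuickPIC's actual algorithm:

```python
def balanced_row_cuts(particles_per_row, n_nodes):
    """Assign contiguous blocks of grid rows to nodes so that each
    node holds roughly the same number of particles."""
    total = sum(particles_per_row)
    target = total / n_nodes
    cuts, acc, node = [], 0, 1
    for row, n in enumerate(particles_per_row):
        acc += n
        if acc >= node * target and node < n_nodes:
            cuts.append(row + 1)   # the next node starts at this row
            node += 1
    return cuts

# A beam concentrated in the middle rows: an equal-area split would
# give the middle node nearly all the work; this split balances it.
```

For a dense beam in the center of the box, the resulting domains are narrow near the beam and wide in the quiet plasma, which is exactly the behavior shown in the slide's node diagram.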

Accomplishments and highlights: Physics
Development of new reduced models (QuickPIC) and benchmarking of codes (OSIRIS vs. VORPAL, QuickPIC vs. OSIRIS, VORPAL vs. VORPAL ponderomotive guiding center).
Code validation (by adding more realism): modeling of PWFA experiments at SLAC in 3D, with 4 GeV energy gain in ~10 cm (OSIRIS and QuickPIC); identification of self-ionization as a plasma source option in PWFA (OOPIC, VORPAL, OSIRIS); modeling of LWFA experiments at LBNL and RAL, with 100 MeV mono-energetic beams in ~1 mm (OSIRIS and VORPAL).
New physics: modeling PWFA afterburner (energy doubler) stages, from 50 to 100 GeV and from 500 to 1000 GeV (QuickPIC); modeling possible 1 GeV mono-energetic LWFA stages, with and without external optical guiding (OSIRIS and VORPAL).

QuickPIC benchmark: full PIC vs. quasi-static PIC
Benchmarked for different drivers: e- driver, e+ driver, e- driver with ionization, laser driver.
Excellent agreement with the full PIC code, with 100+ times savings in CPU time and "no" loss in accuracy.
QuickPIC has successfully modeled current experiments, explored possible designs for future experiments, and guided theory development.

Code benchmarking: VORPAL fully explicit vs. ponderomotive guiding center
The ponderomotive guiding center model removes the fast time scale of the laser pulse and is orders of magnitude faster than full PIC: it can simulate the 3 cm LBNL plasma channel in 2D in a few processor-hours.
Excellent comparison with 2D PIC: good agreement is seen for a0 ~ 1 in both the accelerating wake fields (upper figure) and the normalized particle velocities (lower figure). Particle trapping is seen at larger values of a0.

Modeling the self-ionized PWFA experiment with QuickPIC
The E164X experiment is located in the FFTB (25 m); the figure compares the FFTB experiment with the QuickPIC simulation.

A full-scale simulation of E-164X with self-ionization is possible using the new code QuickPIC. With parameters identical to the experiment, including self-ionization, the agreement between the simulated and measured energy spectra (relative energy in GeV, -5 to +5, vs. x in mm, -4 to +4) is very good. The cavity is a sphere.

Recent highlights: LWFA simulations using full PIC
A Phys. Rev. Lett. by Tsung et al. (September 2004) reported a peak energy of 0.8 GeV and a mono-energetic beam with a central energy of 280 MeV in full-scale 3D PIC simulations.
Three Nature papers (September 2004) reported measured mono-energetic electron beams with energies near 100 MeV, with supporting PIC simulations. SciDAC members were collaborators on two of these Nature publications, and SciDAC codes were used. The cover image is a VORPAL simulation.

3D PIC simulations with no fitting parameters: Nature papers, "agreement" with experiment. What is the metric for agreement? In 3D simulations for Nature 431, 541 (S. P. D. Mangles et al.): in the experiments, the number of electrons in the spike is 1.4x10^8; in our 3D simulations, we estimate 0.9x10^8 electrons in the bunch.

Full-scale 3D LWFA simulation using OSIRIS: 200 TW, 40 fs
Simulation parameters. Laser: a0 = 4, W0 = 24.4 λ = 19.5 μm, ω0/ωp = 33; a state-of-the-art ultrashort pulse with λ0 = 800 nm, Δt = 30 fs, I = 3.4x10^19 W/cm^2. Plasma background: ne = 1.5x10^18 cm^-3, length L = 0.7 cm. Grid: 4000 cells (101.9 μm) x 256 cells (80.9 μm); 2x1x1 particles per cell, 500 million particles total; 300,000 timesteps.
The simulation ran for 75,000 CPU-hours on 200 G5 Xserve processors on DAWSON (~5 Rayleigh lengths of propagation).
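As a sanity check on the quoted cost (assuming the 75,000 hours are CPU-hours summed over the 200 processors), the implied particle-push throughput is:

```python
def pushes_per_cpu_second(n_particles, n_steps, cpu_hours):
    """Particle updates per CPU-second implied by a run's totals."""
    return n_particles * n_steps / (cpu_hours * 3600.0)

# 5e8 particles x 3e5 steps over 7.5e4 CPU-hours:
rate = pushes_per_cpu_second(5e8, 3e5, 7.5e4)   # ~5.6e5 pushes/s per CPU
```

A few times 10^5 pushes per second per mid-2000s CPU is a plausible figure for an explicit 3D electromagnetic PIC code, so the quoted totals are self-consistent.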

Simulations are leading experiments: a 200 TW, 30 fs laser gives a 1.5 GeV beam on the ~cm scale.
The laser blows out all plasma electrons, leading to an ideal accelerating structure. Isolated beams are self-injected, and the beams become mono-energetic as they outrun the wake. (OSIRIS simulation)

One goal is to build a virtual accelerator: a 100+ GeV on 100+ GeV e-e+ collider based on plasma afterburners. Afterburners would replace 3 km of conventional linac with 30 m of plasma. The afterburner is important because it shows the dream, and it helps transition from current experiments to the need and mission of ORION.

Advanced accelerator milestone: a full-scale simulation of a 1 TeV afterburner is possible using QuickPIC.
Before SciDAC: 5,000,000+ node hours at NERSC (it was not done). Because of SciDAC: 5,000 node hours on the DAWSON cluster (2.3 GHz Xserves).
We use parameters consistent with the International Linear Collider "design" and have modeled the beam propagating through ~25 meters of plasma! The cavity is a sphere.

Advanced accelerators: After SciDAC
3D modeling with realism: two explicit PIC codes plus a parallel framework.
Code benchmarking. Code validation: full-scale 3D modeling of experiments. Efficient and high-fidelity reduced-description models. Rapid construction of fully parallelized codes. Extension of plasma techniques to conventional accelerator issues: e-cloud.
Rapid progress has resulted from faster computers, new algorithms, and reusable software.
Scientific discovery: 3 Nature articles and 8 Phys. Rev. Lett.'s.

Vision for the future: high-fidelity modeling of 0.1 to 1 TeV plasma accelerator stages
Physics goals:
A) Model 1 to 10 GeV plasma accelerator stages: predict and design near-term experiments.
B) Extend plasma accelerator stages to the 250 GeV to 1 TeV range: understand the physics and scaling laws.
C) Use plasma codes to definitively model e-cloud physics: 30 minutes of beam circulation time in the LHC and the ILC damping ring.
Software goals:
A) Add pipelining to QuickPIC, allowing it to scale to 1000's of processors.
B) Add self-trapped particles to the QuickPIC and VORPAL ponderomotive guiding center packages.
C) Improve numerical dispersion* in OSIRIS and VORPAL.
D) Scale OSIRIS, VORPAL, and QuickPIC to 10,000+ processors.
E) Merge reduced models and full models.
F) Add circular and elliptical pipes* to QuickPIC and UPIC for e-cloud.
G) Add mesh refinement* to QuickPIC, OSIRIS, VORPAL, and UPIC.
H) Investigate the utility of fluid and Vlasov models.
* Working with the APDEC ISIC

Pipelining: scaling quasi-static PIC to 10,000+ processors
Without pipelining: the beam is not advanced until the entire plasma response is determined (solve the plasma response for the whole slab, then update the beam).
With pipelining: the beam and plasma slab are split into longitudinal sections (1, 2, 3, 4); each section is updated as soon as its input is ready, and the plasma slab flows through the pipeline.
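The benefit of pipelining can be quantified with an idealized makespan count, in units of one section's slab solve. This is a back-of-the-envelope model of the scheme above, not a measured scaling:

```python
def makespans(n_beam_steps, n_sections):
    """Idealized cost of the quasi-static update, in slab-solve units.
    Serial: every beam step waits for the full plasma response.
    Pipelined: each section starts as soon as its upstream neighbor
    finishes, so successive beam steps overlap."""
    serial = n_beam_steps * n_sections
    pipelined = n_sections + (n_beam_steps - 1)
    return serial, pipelined

# 4 beam steps through 4 sections: 16 units serial vs 7 pipelined.
```

For many beam steps the pipelined time approaches one slab-solve unit per step regardless of the number of sections, which is why adding processors along the beam direction keeps paying off.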

Advanced accelerators: Goals
Develop reusable software based on particle-in-cell methods that scales to 1000+ processors.
Develop codes that use this reusable software and include the necessary physics modules.
Develop reduced-description codes to reduce CPU and memory needs.
Benchmark the codes against each other and validate them against experiments.
Use the validated codes to discover ways to scale plasma-based accelerator methods to 0.1 to 1 TeV.