Presentation transcript:

EPAC08 - Genova
Participants: > 1300
The last EPAC → IPAC (Kyoto, IPAC10); next: PAC09 in Vancouver
Three-year cycle: Asia, Europe, North America + PAC North America in odd years (2011: Valencia & NY)
3 ILC talks:
Akira Yamamoto: Co-ordinated Global R&D Effort for the ILC Linac Technology
James Clarke: Design of the Positron Source for the ILC
Toshiaki Tauchi: The ILC Beam Delivery System Design and R&D Programme

Advanced Computing Tools and Models for Accelerator Physics
"We cannot foresee what this kind of creativity in physics will bring…"
Robert D. Ryne, Lawrence Berkeley National Laboratory
June 26, 2008, Genoa, Italy

Overview of High Performance Computing for Accelerator Physics
SciDAC ( ): DOE program Scientific Discovery through Advanced Computing; AST (Accelerator Science and Technology)
SciDAC2 ( ): COMPASS, the Community Petascale Project for Accelerator Science and Simulation ("petascale" being the new buzzword)
Results shown are mainly from the first SciDAC program
National Energy Research Scientific Computing Center (NERSC), Berkeley, e.g. Franklin, Seaborg (6080 CPUs, decommissioned Jan 08)
ATLAS cluster at LLNL (~1000-node Linux cluster)

Two weeks ago, petaflop announcement: IBM "Roadrunner" at Los Alamos National Laboratory
6480 dual-core AMD Opterons + one Cell processor (the PlayStation 3 chip) per core
100 million times the performance of computers at the time of the 1971 High Energy Accelerator Conference!
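As a rough sanity check of that factor (assuming, purely for illustration, about 10 Mflop/s for a CDC 7600-class machine of the early 1970s; neither number appears on the slide):

```latex
\[
\frac{10^{15}\ \text{flop/s (Roadrunner, 2008)}}{10^{7}\ \text{flop/s (ca.\ 1971)}} = 10^{8} = 100\ \text{million}.
\]
```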

GPUs gaining popularity
NVIDIA Tesla C1060 PCIe card (a Graphics Processing Unit), presented June 2008: it's called TESLA and runs at 1.3 GHz
1 teraflop, 4 GB, 1.4 billion transistors, 240 cores, $1700
For comparison: photo shown at PAC of Seaborg (… Tflops!)
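The quoted 1 teraflop is consistent with the card's 240 cores at 1.3 GHz if one counts three single-precision flops per core per cycle (a fused multiply-add plus a multiply); that counting convention is an assumption for illustration, not something stated on the slide:

```latex
\[
240\ \text{cores} \times 1.3\times10^{9}\ \text{Hz} \times 3\ \tfrac{\text{flop}}{\text{core}\cdot\text{cycle}}
\approx 0.94\ \text{Tflop/s} \approx 1\ \text{Tflop/s}.
\]
```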

What to do with all that computing power?
Beam dynamics, multiparticle interactions, beams in plasma, component design (e.g. cavities)
Codes, e.g. IMPACT, BeamBeam3D (Tevatron beam-beam), T3P/Omega3P (time- and frequency-domain solvers) … HPC, parallelisation
A collaborative effort makes it possible to combine codes and to define interfaces between them
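As a concrete picture of what the "3D collective" modeling mentioned here involves computationally, the sketch below shows one space-charge step of a particle-in-cell cycle (deposit, field solve, gather, kick) in plain NumPy. It is an illustration only, with a periodic box and nearest-grid-point deposition chosen for brevity; it is not code from IMPACT, BeamBeam3D, or any other code named on this slide.

```python
# Minimal one-step 3D space-charge kick (particle-in-cell style), SI units.
# Illustrative sketch only: periodic box, nearest-grid-point deposition,
# FFT Poisson solve.  Not taken from IMPACT or any code named above.
import numpy as np

EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]

def space_charge_kick(pos, mom, q_macro, box, n_grid, dt):
    """Apply one space-charge momentum kick to macroparticles.

    pos, mom : (N, 3) arrays of positions [m] and momenta [kg m/s]
    q_macro  : charge per macroparticle [C]
    box      : periodic box size [m]; n_grid: cells per side; dt: step [s]
    """
    h = box / n_grid

    # 1) Deposit charge on the mesh (nearest grid point).
    idx = np.floor(pos / h).astype(int) % n_grid
    rho = np.zeros((n_grid,) * 3)
    np.add.at(rho, (idx[:, 0], idx[:, 1], idx[:, 2]), q_macro)
    rho /= h**3

    # 2) Solve  laplacian(phi) = -rho/eps0  with FFTs (periodic boundaries).
    k = 2.0 * np.pi * np.fft.fftfreq(n_grid, d=h)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                       # avoid 0/0 for the mean mode
    phi_hat = np.fft.fftn(rho) / (EPS0 * k2)
    phi_hat[0, 0, 0] = 0.0                  # drop the mean (neutralizing background)
    phi = np.fft.ifftn(phi_hat).real

    # 3) E = -grad(phi) on the mesh, then gather the field to the particles.
    E = np.stack([-np.gradient(phi, h, axis=a) for a in range(3)], axis=-1)
    E_p = E[idx[:, 0], idx[:, 1], idx[:, 2]]

    # 4) Kick the momenta: dp = q E dt.
    return mom + q_macro * E_p * dt

# Tiny usage example: 100k macroparticles in a 1 cm periodic box.
rng = np.random.default_rng(1)
pos = rng.random((100_000, 3)) * 0.01
mom = np.zeros_like(pos)
mom = space_charge_kick(pos, mom, q_macro=1e-15, box=0.01, n_grid=32, dt=1e-12)
```

Production codes parallelize exactly these steps (domain-decomposed deposition, distributed FFT or multigrid solves, and particle exchange between processors), which is where the HPC effort goes.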

[Figure: timeline of accelerator codes and methods, spanning single-particle optics, 1D/2D collective, 3D collective, and self-consistent multi-physics regimes. Codes and techniques named include Transport, MaryLie, Dragt-Finn factorization, MAD, PARMILA, 2D space charge (PARMELA, PARMTEQ), IMPACT-Z, IMPACT-T, ML/I, Synergia, ORBIT, BeamBeam3D, frequency maps, MXYZPTLK, COSY-INFINITY, rms equations, normal forms, symplectic integration, DA, GCPIC, 3D space charge (WARP, SIMPSONS, IMPACT), MAD-X/PTC. Partial list only; many codes not shown. An annotation marks where parallelization begins. Examples follow.]

Modeling a linac with IMPACT-Z using 1 billion macroparticles, from 100 MeV to 1.2 GeV. Ji Qiang, LBNL.

Accurate prediction of uncorrelated energy spread in a linac for a future light source (Ji Qiang): final longitudinal phase space from IMPACT-Z simulations using 10M and 1B particles.

Final uncorrelated energy spread versus number of macroparticles: 10M, 100M, 1B, 5B (Ji Qiang). IMPACT-Z results shown alongside the microbunching instability gain function (M. Venturini).
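The reason the macroparticle count matters here is that the microbunching instability amplifies whatever density noise is present in the sampled bunch, and plain Monte-Carlo sampling carries shot noise that only falls off as 1/sqrt(N). The toy script below (an illustration under that plain-sampling assumption, not the IMPACT-Z methodology or its numbers) shows the scaling on a flat current profile:

```python
# Shot noise in a binned current profile versus macroparticle count.
# Illustration only; real studies may use quieter sampling schemes.
import numpy as np

def relative_shot_noise(n_macro, n_bins=1000, seed=0):
    """rms bin-to-bin fluctuation (relative) of a flat profile sampled with n_macro particles."""
    rng = np.random.default_rng(seed)
    z = rng.random(n_macro)                              # longitudinal positions in [0, 1)
    counts, _ = np.histogram(z, bins=n_bins, range=(0.0, 1.0))
    return counts.std() / counts.mean()

for n in (10**5, 10**6, 10**7):
    measured = relative_shot_noise(n)
    expected = 1.0 / np.sqrt(n / 1000)                   # Poisson estimate per bin
    print(f"N = {n:>8d}   noise = {measured:.4f}   ~1/sqrt(N/bins) = {expected:.4f}")
```

Extrapolating the same trend to the 10M-to-5B range on the slide is what drives the billion-particle runs.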

BeamBeam3D simulation and visualization of the beam-beam interaction at the Tevatron, at 400 times the usual intensity. Eric Stern et al., FNAL.

Simulations of chains of cavities, up to a full cryomodule.
1.75 M quadratic elements, 10 M DOFs; 47 min per ns on 1024 CPUs of Seaborg with 173 GB memory (CG with an incomplete Cholesky preconditioner).
1 hour CPU time, 1024 processors, 300 GB memory at NERSC.
Sorry, could not get the movies.
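The slide names the linear-solver combination used in these runs: conjugate gradients with an incomplete-Cholesky preconditioner. As a reminder of the algorithm's structure only, here is a minimal preconditioned CG in NumPy on a small test matrix; the diagonal (Jacobi) preconditioner below is just a stand-in for incomplete Cholesky, and none of this is the parallel finite-element solver actually used.

```python
# Minimal preconditioned conjugate gradient (PCG) for a symmetric
# positive-definite system.  Sketch only; the Jacobi preconditioner
# stands in for the incomplete Cholesky named on the slide.
import numpy as np

def pcg(A, b, apply_precond, tol=1e-8, max_iter=2000):
    x = np.zeros_like(b)
    r = b - A @ x                       # residual
    z = apply_precond(r)                # preconditioned residual
    p = z.copy()                        # search direction
    rz = r @ z
    for it in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, it
        z = apply_precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Small SPD test problem: a 1D Laplacian as a toy stand-in for an FEM matrix.
n = 200
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
diag = np.diag(A)
x, iters = pcg(A, b, apply_precond=lambda r: r / diag)
print(f"{iters} iterations, residual norm = {np.linalg.norm(A @ x - b):.2e}")
```

In the real electromagnetic solvers the matrix-vector products and the preconditioner application are distributed across the thousand-plus processors quoted above.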

Cavity Coupler Kicks (Wakefield and RF): 6 posters
Studies for ILC (main linac / RTML) and FLASH
Two numerical calculations of the RF kick, M. Dohlus (MAFIA) and V. Yakovlev (HFSS), differ by ~30%
MOPP042, N. Solyak et al. (Andrea Latina, PLACET)
TUPP047, D. Kruecker et al. (MERLIN)

Wakefields in periodic structures versus number of cavities. M. Dohlus et al., MOPP. ( CPUs, 7 days; 1 CPU; without error estimate.) The discussions on wakefield kicks started at 20 V/nC; the effect becomes smaller.
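To put the V/nC unit in context (an illustration with assumed bunch charge and beam energy, not values from the slide): a transverse wake kick of w volts per nC of bunch charge Q deflects an ultra-relativistic beam of energy E by roughly

```latex
\[
\Delta x' \approx \frac{w\,Q}{E/e}, \qquad \text{e.g.}\quad
\frac{20\ \text{V/nC}\times 1\ \text{nC}}{250\ \text{GeV}}
= \frac{20\ \text{V}}{2.5\times10^{11}\ \text{V}} = 8\times10^{-11}\ \text{rad},
\]
```

a small kick per cavity, but one that acts coherently over the many thousands of cavities in a linac.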

FLASH: simulation vs. measurement at 0.6 nC (BPM11DBC2, OTR screens). Coupler wakefield calculations from I. Zagorodnov and M. Dohlus. E. Prat et al., TUPP018 (ELEGANT).