Accelerator Modeling: Present capabilities, future prospects, and applications to the HEP Program (with emphasis on SciDAC)
Presented to the HEPAP AARD Subpanel, December 21, 2005
Robert D. Ryne, Lawrence Berkeley National Laboratory
with contributions from Kwok Ko (SLAC) and Warren Mori (UCLA)

SciDAC Accelerator Science & Technology (AST) Project: Overview
Goals:
—Develop a new generation of parallel accelerator modeling codes to solve the most challenging and important problems in 21st-century accelerator science & technology
—Apply the codes to improve existing machines, design future facilities, and help develop advanced accelerator concepts
Sponsored by DOE/SC HEP in collaboration with ASCR
Primary customer: DOE/SC, primarily its HEP program, also NP
—codes have also been applied to BES projects
Funding: $1.8M/yr (HEP), $0.8M/yr (ASCR/SAPP)
—Strong leveraging from SciDAC ISICs
Duration: currently in 5th (final) year
Participants:
—Labs: LBNL, SLAC, FNAL, BNL, LANL, SNL
—Universities: UCLA, USC, UC Davis, RPI, Stanford
—Industry: Tech-X Corp.

SciDAC Accelerator Science & Technology (AST) Project: Overview (cont.)
Management:
—K. Ko and R. Ryne, co-PIs
—Senior management team: K. Ko, R. Ryne, W. Mori, E. Ng
Oversight and reviews by DOE/HEP program managers:
—Vicky White
—Irwin Gaines
—Craig Tull (present)
The project must:
—advance HEP programs (R. Staffin)
—through synergistic collaboration w/ ASCR that advances the state-of-the-art in advanced scientific computing (M. Strayer)

SciDAC AST Overview: Focus Areas
Organized into 3 focus areas:
—Beam Dynamics (BD), R. Ryne
—Electromagnetics (EM), K. Ko
—Advanced Accelerators (AA), W. Mori
All supported by the SciDAC Integrated Software Infrastructure Centers (ISICs) and the ASCR Scientific Application Partnership Program (SAPP)
Most funding goes to BD and EM; AA is very highly leveraged

Why do we need SciDAC?
Why can't our community do code development by ourselves, as we have done in the past?
Why can't it be done simply as an activity tied to accelerator projects?
Why can't our community follow "business as usual"?

Computational Issues
Large scale:
—simulations approaching a billion particles and mesh points
—huge data sets
—advanced data management & visualization
Extremely complex 3D geometry (EM codes)
Complicated hardware with multiple levels of memory hierarchy, > 100K processors
Parallel issues:
—load balancing
—parallel sparse linear solvers
—parallel Poisson solvers
—particle/field managers
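To make the parallel Poisson-solver item concrete, here is a minimal single-rank sketch of the FFT-based field solve that space-charge codes of this kind typically perform; it is an illustrative assumption, not the project's implementation, which uses open (free-space) boundary conditions and distributes the grid and transforms across many processors.

```python
import numpy as np

def poisson_fft_periodic(rho, dx):
    """Solve laplacian(phi) = -rho on a periodic 3D grid via FFT (eps0 = 1).
    A single-rank stand-in for the parallel, domain-decomposed solvers above."""
    k = [2.0 * np.pi * np.fft.fftfreq(n, d=dx) for n in rho.shape]
    kx, ky, kz = np.meshgrid(*k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    rho_hat = np.fft.fftn(rho)
    phi_hat = np.zeros_like(rho_hat)
    nonzero = k2 > 0
    phi_hat[nonzero] = rho_hat[nonzero] / k2[nonzero]   # -k^2 phi_hat = -rho_hat
    return np.real(np.fft.ifftn(phi_hat))

# Example: potential of a Gaussian charge blob on a 64^3 grid
grid = np.linspace(-1.0, 1.0, 64, endpoint=False)
X, Y, Z = np.meshgrid(grid, grid, grid, indexing="ij")
rho = np.exp(-(X**2 + Y**2 + Z**2) / (2 * 0.1**2))
phi = poisson_fft_periodic(rho, dx=grid[1] - grid[0])
```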

Close collaboration w/ ASCR researchers (ISICs, SAPP) is essential
A hallmark of the SciDAC project is that it is built upon collaboration of application/computational scientists with mathematicians, computer scientists, parallel performance experts, visualization specialists, and other IT experts.
The AST project collaborates with several ISICs:
—TOPS (Terascale Optimal PDE Solvers)
—APDEC (Applied Partial Differential Equations Center)
—TSTT (Terascale Simulation Tools & Technologies)
—PERC (Performance Evaluation Research Center)

Overview of the 3 focus areas: Beam Dynamics (BD), Electromagnetic Modeling (EM), Advanced Accelerators (AA)

SciDAC Codes: Beam Dynamics
A set of parallel, 3D multi-physics codes for modeling beam dynamics in linacs, rings, and colliders:
—IMPACT suite: includes 2 PIC codes (s-based, t-based); mainly for electron and ion linacs
—BeamBeam3D: strong-weak, strong-strong, multi-slice, multi-bunch, multi-IP, head-on, crossing-angle, long-range
—MaryLie/IMPACT: hybrid application combining MaryLie + IMPACT
—Synergia: multi-language, extensible framework; hybrid application involving portions of IMPACT + MXYZPTLK
—Langevin3D: particle code for solving the Fokker-Planck equation from first principles
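As a simple illustration of the map-based transport that underlies codes such as MaryLie/IMPACT, the sketch below tracks a bunch through linear drift and quadrupole maps in one transverse plane. It is a toy example only: the actual codes add nonlinear maps, space charge, RF, and full parallelism, and none of the numbers here come from a real lattice.

```python
import numpy as np

def drift_map(L):
    """2x2 linear transfer matrix for a drift of length L (one transverse plane)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def quad_map(L, k):
    """Thick-quadrupole transfer matrix; k > 0 focuses, k < 0 defocuses in this plane."""
    w = np.sqrt(abs(k))
    if k > 0:
        return np.array([[np.cos(w * L),      np.sin(w * L) / w],
                         [-w * np.sin(w * L), np.cos(w * L)]])
    return np.array([[np.cosh(w * L),        np.sinh(w * L) / w],
                     [w * np.sinh(w * L),    np.cosh(w * L)]])

# Track a small bunch of (x, x') pairs through a FODO-like cell (illustrative numbers).
rng = np.random.default_rng(0)
bunch = rng.normal(scale=[1e-3, 1e-4], size=(1000, 2))   # columns: x [m], x' [rad]
cell = [quad_map(0.2, 4.0), drift_map(1.0), quad_map(0.2, -4.0), drift_map(1.0)]
for M in cell:
    bunch = bunch @ M.T                                   # apply each element's map
print("rms x after one cell: %.3e m" % bunch[:, 0].std())
```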

The IMPACT suite is becoming widely used: > 300 contacts in FY05, > 100 already in FY06. Contacts span SLAC, LBNL, LANL, Tech-X, FNAL, ANL, ORNL, MSU, BNL, JLab, Cornell, NIU, RAL, PSI, GSI, and KEK.

SciDAC code development involves large, multidisciplinary teams. Example: MaryLie/IMPACT code

Development, reuse, and synthesis of code components. Examples: Synergia, e-cloud capability

New algorithms and methodologies are key. Examples: (1) high-aspect-ratio Poisson solver; (2) self-consistent Langevin/Fokker-Planck solver.
[Figure: self-consistent vs. Spitzer-approximation diffusion coefficients as a function of velocity, from the first-ever 3D self-consistent Langevin/Fokker-Planck simulation.]
[Figure: error in the computed electric field of a Gaussian charge distribution (σx = 1 mm, σy = 500 mm) vs. distance. Even using a grid size of 64x8192, the standard method (blue curve) is less accurate than the Integrated Green Function method (purple) on 64x64.]
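For context, the Integrated Green Function idea referenced above can be stated compactly. The lines below are a generic, one-dimensional-notation statement of the method; the details of the SciDAC implementation may differ.

```latex
% Standard gridded Green-function (Hockney-style) convolution, grid spacing h:
\[ \phi_i \;=\; h \sum_j G(x_i - x_j)\,\rho_j \]
% Integrated Green Function: integrate the kernel exactly over each source cell,
% so accuracy is retained even when the beam is far narrower in one plane:
\[ \phi_i \;=\; \sum_j \bar{G}_{ij}\,\rho_j,
   \qquad
   \bar{G}_{ij} \;=\; \int_{\mathrm{cell}\,j} G(x_i - x')\,\mathrm{d}x' \]
```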

SciDAC beam dynamics applications benefit DOE/SC programs, esp. HEP
Beam-beam simulation of Tevatron, PEP-II, LHC, RHIC
ILC damping rings (space charge, wigglers)
FNAL Booster losses
CERN PS benchmark study
RIA driver linac modeling
SNS linac modeling
LCLS photoinjector modeling
CERN SPL (proposed proton driver) design
J-PARC commissioning
Publications:
—23 refereed papers since 2001 (including 5 Phys. Rev. Lett., 10 PRST-AB, 4 NIM-A, 2 J. Comp. Phys., Computer Physics Comm.), numerous conference proceedings papers
USPAS course on computational methods in beam dynamics

Examples: Collider modeling using BeamBeam3D
—LHC beam-beam simulation (tunes νx1 = νx2 = νy1 = νy2 = 0.31): first-ever 1M-particle, 1M-turn strong-strong beam-beam simulation (J. Qiang, LBNL)
—PEP-II luminosity calculation shows the importance of multi-slice modeling (J. Qiang, Y. Cai, SLAC; K. Ohmi, KEK)
—Code scalability depends strongly on parallelization methodology (J. Qiang)
—Parameter studies of antiproton lifetime in the Tevatron
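To make the beam-beam modeling concrete, here is a sketch of the textbook weak-strong kick from a round Gaussian opposing bunch, the simplest building block of this kind of collider modeling. It is an illustration only; BeamBeam3D's own solvers handle flat beams, multiple slices, and fully strong-strong coupling, and the numbers in the usage line are merely LHC-like guesses.

```python
import numpy as np

def round_gaussian_beam_beam_kick(x, y, N, r0, gamma, sigma):
    """Weak-strong angular kick (dx', dy') from a round Gaussian opposing bunch.
      N     : particles in the opposing bunch
      r0    : classical radius of the kicked particle [m]
      gamma : Lorentz factor of the kicked beam
      sigma : rms size of the round opposing beam [m]
    Sign convention: oppositely charged beams (focusing); flip the sign for like charges."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r2 = x**2 + y**2
    with np.errstate(invalid="ignore", divide="ignore"):
        factor = -(2.0 * N * r0 / gamma) * (1.0 - np.exp(-r2 / (2.0 * sigma**2))) / r2
    # finite small-r limit of the kick coefficient: -(N r0) / (gamma sigma^2)
    factor = np.where(r2 > 0.0, factor, -(N * r0) / (gamma * sigma**2))
    return factor * x, factor * y

# Roughly LHC-like numbers, purely illustrative:
dxp, dyp = round_gaussian_beam_beam_kick(1e-5, 0.0, N=1.1e11,
                                          r0=1.54e-18, gamma=7460.0, sigma=16e-6)
```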

ILC damping ring modeling using ML/I
Results of MaryLie/IMPACT simulations of an ILC "dog-bone" damping ring (DR) design showing space-charge-induced emittance growth using different space-charge models. Space charge is important for the ILC DR in spite of the high energy because of the combination of small emittance and large (16 km) circumference.
Top (nonlinear space-charge model): the beam exhibits small emittance growth.
Bottom (linear space-charge model): the beam exhibits exponential growth due to a synchro-betatron resonance. The instability is a numerical artifact caused by the simplified (linear) space-charge model. (M. Venturini, LBNL)
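As a reminder of how the two model options differ, the transverse space-charge field of a round Gaussian beam and its linearized form are written below in standard textbook notation; this is generic background, not the specific models implemented in MaryLie/IMPACT.

```latex
% Radial space-charge field of a round Gaussian beam with line charge density \lambda:
\[ E_r(r) \;=\; \frac{\lambda}{2\pi\varepsilon_0\, r}\left(1 - e^{-r^2/2\sigma^2}\right) \]
% Linear model: keep only the leading small-r term, exact only near the beam core:
\[ E_r(r) \;\approx\; \frac{\lambda\, r}{4\pi\varepsilon_0\,\sigma^2} \]
```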

FNAL Booster modeling using Synergia
FNAL Booster simulation results using Synergia showing the merging of 5 microbunches. SciDAC team members are working closely with experimentalists at the Booster to help understand and improve machine performance. (P. Spentzouris and J. Amundson, FNAL; J. Qiang and R. Ryne, LBNL)

Beam Dynamics under SciDAC 2 (HEP program)
Support/maintain/extend successful codes developed under SciDAC 1 (BD, EM, AA)
Develop new capabilities to meet HEP priorities: LHC, ILC, Tevatron, PEP-II, FNAL Main Injector, Booster, proton driver
—Self-consistent 3D simulation of e-cloud, e-cooling, IBS, CSR
—Start-to-end modeling with all relevant physical effects
Enable parallel, multi-particle beam dynamics design & optimization
Performance and scalability optimization on platforms up to the petascale (available by the end of the decade)
Couple parallel beam dynamics codes to commissioning, operations, and beam experiments

Overview of the 3 focus areas: Beam Dynamics (BD), Electromagnetic Modeling (EM), Advanced Accelerators (AA)

SciDAC AST – Electromagnetics
Under SciDAC AST, the Advanced Computations Department (ACD) at SLAC is in charge of the Electromagnetics component to:
—Develop a comprehensive suite of parallel electromagnetic codes for the design and analysis of accelerators (Ron's talk)
—Apply the new simulation capability to accelerator projects across SC, including those in HEP, NP, and BES (Ron's talk)
—Advance computational science to enable terascale computing through ISICs/SAPP collaborations (this talk)

ACD's ISICs/SAPP Collaborations
ACD is working with the TOPS, TSTT, and PERC ISICs as well as SAPP researchers on 6 computational science projects involving 3 national labs and 6 universities:
—Parallel Meshing – TSTT (Sandia, U Wisconsin/PhD thesis)
—Adaptive Mesh Refinement – TSTT (RPI)
—Eigensolvers – TOPS (LBNL), SAPP (Stanford/PhD thesis, UC Davis)
—Shape Optimization – TOPS (UT Austin, Columbia, LBNL), TSTT (Sandia, U Wisconsin)
—Visualization – SAPP (UC Davis/PhD thesis)
—Parallel Performance – PERC (LBNL, LLNL)

Parallel Meshing & Adaptive Mesh Refinement
Parallel meshing is needed for generating LARGE meshes to model multiple cavities in the ILC superstructure & cryomodule.
Adaptive Mesh Refinement improves the accuracy & convergence of frequency and wall-loss calculations.
[Figures: meshing scaling vs. processor count; frequency, wall loss, and Q convergence for the RIA RFQ.]

Eigensolvers & Shape Optimization
A complex eigensolver for treating external coupling is essential for computing HOM damping in ILC cavities.
Shape optimization will replace the manual, iterative process in designing cavities with specific goals subject to constraints.
[Diagram: Omega3P problem classes (lossless, lossy material, periodic structure, external coupling) and the corresponding solvers (ESIL, ISIL w/ refinement, Implicit Restarted Arnoldi, SOAR, self-consistent loop).]
[Diagram: shape-optimization loop coupling the geometric model, meshing, Omega3P, and sensitivity analysis (meshing sensitivity needed only for discrete sensitivity).]
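For orientation, the finite-element cavity eigenproblem that drives these solver choices can be written generically as below; this is the standard curl-curl formulation stated schematically, not Omega3P's exact discretization.

```latex
% Lossless closed cavity: real symmetric generalized eigenvalue problem for the
% discretized field x and wavenumber k (frequency f = ck/2\pi):
\[ K\,x \;=\; k^2\, M\,x \]
% External (waveguide) coupling adds a boundary term and gives, schematically,
% a complex quadratic eigenproblem, the target of second-order solvers such as SOAR:
\[ \left(K + i\,k\,W - k^2 M\right) x \;=\; 0 \]
```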

Visualization & Parallel Performance
Visualization is critical to mode analysis in complex 3D cavities, e.g. mode rotation effects.
Parallel performance studies are needed to maximize code efficiency and optimize the use of computing resources.
[Figures: solve & postprocess time breakdown; communication pattern.]

Proposed Projects for SciDAC 2
SLAC will develop the NEXT level of simulation tools for NEXT-generation SC accelerators (ILC, LHC, RIA, SNS) by continuing to advance computational science in collaboration with the ISICs/SAPP component of SciDAC:
—Parallel adaptive h-p-q refinement, where h is the mesh size, p is the order of the FE basis, and q is the order of the geometry model
—Parallel shape optimization (goals w/ constraints) and prediction (cavity deformations from HOM measurements)
—Parallel particle simulation on unstructured grids for accurate device modeling (RF guns, klystrons)
—Integrated electromagnetic/thermal/mechanical modeling for complete design and engineering of cavities
—Parallel, interactive visualization cluster for mode analysis and particle simulations

Overview of the 3 focus areas: Beam Dynamics (BD), Electromagnetic Modeling (EM), Advanced Accelerators (AA)

Recent advances in modeling advanced accelerators: plasma-based acceleration and e-clouds
W. B. Mori, C. Huang, W. Lu, M. Zhou, M. Tzoufras, F. S. Tsung, V. K. Decyk (UCLA)
D. Bruhwiler, J. Cary, P. Messmer, D. A. Dimitrov, C. Nieter (Tech-X)
T. Katsouleas, S. Deng, A. Ghalam (USC)
E. Esarey, C. Geddes (LBL)
J. H. Cooley, T. M. Antonsen (U. Maryland)

Accomplishments and highlights: Code development
Four independent high-fidelity particle-based codes:
—OSIRIS: fully explicit PIC
—VORPAL: fully explicit PIC + ponderomotive guiding center
—QuickPIC: quasi-static PIC + ponderomotive guiding center
—UPIC: FFT-based framework for rapid construction of new codes (QuickPIC is based on UPIC)
Each code or framework is fully parallelized, with dynamic load balancing and particle sorting. Each production code has ionization packages for added realism. Effort was made to make the codes scale to large processor counts. Highly leveraged.

Full PIC: OSIRIS and VORPAL
Successfully applied to various LWFA and PWFA problems; scale well to 1,000's of processors.
[Figures: parallel speedup s(N); colliding laser pulses; particle beams; self-ionized particle-beam wake; 3D LWFA simulation.]

Quasi-static PIC: QuickPIC
Code features:
—Based on the UPIC parallel object-oriented plasma simulation framework
Model features:
—Highly efficient quasi-static model for beam drivers
—Ponderomotive guiding center + envelope model for laser drivers
—Can be 100+ times faster than conventional PIC with no loss in accuracy
—ADK model for field ionization
Applications:
—Simulations for PWFA experiments E157/162/164/164X/167
—Study of the electron-cloud effect in the LHC
—Plasma afterburner design
[Figures: afterburner; hosing; E164X.]
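As background on why the quasi-static model is so much faster, the approximation is usually stated in co-moving variables, sketched generically below; this is the standard textbook form, not QuickPIC's specific equations.

```latex
% Co-moving variables: s follows the driver's slow evolution, \xi the plasma response:
\[ s = z, \qquad \xi = c\,t - z \]
% Quasi-static approximation: the driver evolves much more slowly than the wake it
% drives, so the plasma response is computed slice by slice in \xi with the driver frozen:
\[ \frac{\partial}{\partial s} \;\ll\; \frac{\partial}{\partial \xi} \]
% The driver is then pushed with a step set by its own evolution scale rather than the
% plasma period, which is where the quoted factor of 100+ savings comes from.
```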

Recent highlights: LWFA simulations using full PIC
Three Nature papers (September 2004) in which mono-energetic electron beams with energy near 100 MeV were measured; supporting PIC simulations were presented. SciDAC members were collaborators on two of these Nature publications, and the SciDAC codes OSIRIS and VORPAL were used.
Phys. Rev. Lett. by Tsung et al. (September 2004), in which a peak energy of 0.8 GeV and a mono-energetic beam with a central energy of 280 MeV were reported from full-scale 3D PIC simulations.
[Figure: VORPAL result on the cover.]

Modeling the self-ionized PWFA experiment E164X with QuickPIC
[Figures: E164X experiment layout (located in the 25 m FFTB); QuickPIC simulation.]

Afterburner simulation: 0.5 TeV → 1 TeV in 28 meters
Simulation done with QuickPIC in 5,000 node-hours; a full PIC run would have taken 5,000,000 node-hours!
[Figures: wake at s = 0 m and at the end of the 28 m stage.]
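The quoted node-hour numbers correspond to a three-orders-of-magnitude saving:

```latex
\[ \frac{5\times 10^{6}\ \text{node-hours (estimated full PIC)}}
        {5\times 10^{3}\ \text{node-hours (QuickPIC)}} \;=\; 1000 \]
```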

Vision for the future: SciDAC 2
High-fidelity modeling of 0.1 to 1 TeV plasma accelerator stages
Physics goals:
—A) Model 1 to 10 GeV plasma accelerator stages: predicting and designing near-term experiments
—B) Extend plasma accelerator stages to the 250 GeV - 1 TeV range: understand the physics & scaling laws
—C) Use plasma codes to definitively model e-cloud physics: 30 minutes of beam circulation time in the LHC and the ILC damping ring
Software goals:
—A) Add pipelining to QuickPIC: allow QuickPIC to scale to 1000's of processors
—B) Add self-trapped particles to QuickPIC and the ponderomotive guiding center VORPAL packages
—C) Improve numerical dispersion* in OSIRIS and VORPAL
—D) Scale OSIRIS, VORPAL, and QuickPIC to 10,000+ processors
—E) Merge reduced models and full models
—F) Add circular and elliptical pipes* to QuickPIC and UPIC for e-cloud
—G) Add mesh refinement* to QuickPIC, OSIRIS, VORPAL, and UPIC
—H) Develop better data analysis and visualization tools for complicated phase-space data**
*Working with the APDEC ISIC
**Working with the visualization center

In Conclusion…
Q: What is the scope of our research in regard to HEP short/medium/long-range applications?
A: It is mainly short/medium. Capabilities have been developed and codes applied to:
—Short: PEP-II, Tevatron, FNAL Booster, LHC
—Medium: ILC
—Long: exploration of advanced accelerator concepts (these activities are highly leveraged and represent 10% of the SciDAC AST budget)

Final remarks
Future HEP facilities will cost ~$0.5B to ~$10B
—High-end modeling is crucial to optimize designs, reduce cost, and reduce risk
—Given the magnitude of the investment in the facility, the $1.8M investment in SciDAC is tiny, but the tools are essential
Laser/plasma systems are extraordinarily complex
—High-fidelity modeling, used in concert with theory & experiment, is essential to understand the physics and help realize the promise of advanced accelerator concepts

Acronyms used
SciDAC: Scientific Discovery through Advanced Computing
AST: SciDAC Accelerator Science & Technology project
ASCR: Office of Advanced Scientific Computing Research
BD: Beam Dynamics activities of SciDAC AST
EM: Electromagnetics activities of SciDAC AST
AA: Advanced Accelerator activities of SciDAC AST
ISIC: SciDAC Integrated Software Infrastructure Center
—TOPS: Terascale Optimal PDE Solvers center
—TSTT: Terascale Simulation Tools and Technologies center
—APDEC: Applied Partial Differential Equations center
—PERC: Performance Evaluation Research Center
SAPP: Scientific Application Partnership Program (ASCR-supported researchers affiliated w/ specific SciDAC projects)