LQCD Computing Project Overview

LQCD Computing Project Overview
Chip Watson
LQCD Computing Review, Boston, May 24-25, 2005

Outline
- High-level scope of the project
- Context (related work)
- Mapping of topics to today's schedule

Project Scope
High-level view:
- Deployment of new resources
- Operations of resources

Project Scope (1): Deployment
- Annual installation of increasing computing capacity, matching the needs of increasingly challenging science problems
- Aggregate capacity after 4 years of this project: ~18+ TFlops (double precision, sustained, averaged over key algorithms)
- Each year we will select the best platform for the planned physics:
  - Clusters are almost certain to be optimal in 2006
  - Clusters are most likely to remain optimal throughout the next 4 years, but we will track alternatives
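For orientation, a minimal Python sketch of how annual deployments could accumulate toward the quoted totals. The per-year figures are invented for the example; only the ~12 TFlops of new capacity, ~6 TFlops of existing machines, and ~18 TFlops aggregate come from the slides.

```python
# Illustrative only: hypothetical per-year deployments in sustained TFlops
# (double precision, averaged over key LQCD algorithms).
new_deployments_tflops = {
    "FY06": 2.0,  # assumed
    "FY07": 2.5,  # assumed
    "FY08": 3.5,  # assumed
    "FY09": 4.0,  # assumed
}

# Existing dedicated USQCD machines folded into the metafacility (slide figure).
existing_usqcd_tflops = 6.0

new_total = sum(new_deployments_tflops.values())
aggregate = new_total + existing_usqcd_tflops

print(f"New capacity from this project: ~{new_total:.1f} TFlops")
print(f"Aggregate metafacility capacity: ~{aggregate:.1f} TFlops")
```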

Project Scope (2): Operations
- Operate at three sites: BNL, FNAL, JLab
- Operate as a metafacility
- Operate multiple generations of new machines (an aggregate of ~12 TFlops new in this investment)
- Operate the existing dedicated USQCD machines (~6 TFlops) as part of this metafacility

LQCD Capacity Growth
[Chart slide: projected LQCD capacity growth; axis labels in the transcript: $/MF and $M]
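The $/MF label presumably denotes cost per sustained megaflops, the usual price/performance metric for these procurements. A hedged sketch of that calculation, with all numbers invented purely for illustration:

```python
# Illustrative price/performance calculation in $/MFlops (sustained).
# All numbers below are assumptions for the example, not procurement data.
node_cost_dollars = 2500.0            # assumed per-node cost, incl. network share
sustained_mflops_per_node = 1400.0    # assumed sustained MFlops on key LQCD kernels

dollars_per_mflops = node_cost_dollars / sustained_mflops_per_node
print(f"Price/performance: ${dollars_per_mflops:.2f} per sustained MFlops")

# Sustained capacity purchasable for a given annual hardware budget (assumed).
annual_budget_dollars = 1.5e6
tflops = annual_budget_dollars / dollars_per_mflops / 1.0e6  # MFlops -> TFlops
print(f"~{tflops:.2f} TFlops sustained for ${annual_budget_dollars / 1e6:.1f}M")
```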

Budget Breakdown

Project Context
The following projects and funding sources significantly affect the current project:
- LQCD SciDAC (HEP, NP, ASCR for computer science)
- Base contributions (HEP, NP)
- QCDOC at BNL (HEP, NP, ASCR funding)
- FNAL cluster upgrade (HEP)
- ILDG: International Lattice Data Grid

Project Context (1): LQCD SciDAC Project
"National Computational Infrastructure for Lattice Gauge Theory"
Scope:
- Computing R&D (hardware platforms & software)
- API standardization, software implementation
- Performance optimization involving both physicists and computer scientists (2x gain on the DWF algorithm!)
- Prototype clusters: platform evaluations & software testbeds
Impact of SciDAC on this project:
- Lowers risk by reducing platform (node & network) uncertainty
- Increases platform independence and code (& user) portability
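To show why a software gain like the cited 2x DWF speedup matters to a hardware project, here is a minimal sketch of its effect on effective capacity and effective cost. The cluster size and cost are assumptions for the example; only the 2x factor comes from the slide.

```python
# Illustrative effect of a 2x software speedup (the cited DWF kernel gain)
# on effective capacity and cost. Numbers are assumptions for the example.
baseline_sustained_tflops = 3.0   # assumed cluster performance before optimization
hardware_cost_dollars = 4.0e6     # assumed cost of that hardware
speedup = 2.0                     # SciDAC-cited gain on the DWF algorithm

effective_tflops = baseline_sustained_tflops * speedup
cost_before = hardware_cost_dollars / (baseline_sustained_tflops * 1e6)  # $/MFlops
cost_after = hardware_cost_dollars / (effective_tflops * 1e6)

print(f"Effective capacity: {effective_tflops:.1f} TFlops (was {baseline_sustained_tflops:.1f})")
print(f"$/MFlops: {cost_before:.2f} -> {cost_after:.2f}")
```

In effect, a 2x kernel speedup halves the cost per delivered megaflop of every machine that runs the optimized code, which is why the software R&D lowers risk for the hardware procurements.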

Project Context (2)
Ongoing base HEP and NP programs at the 3 labs:
- leveraged staffing and hardware (contributions)
- leveraged computing infrastructure (networking, tertiary storage, security, account management, ...)
BNL QCDOC supercomputer, 4.2 TFlops (FY04-05):
- will initially constitute >50% of total US LQCD capacity
FNAL cluster upgrade (FY05):
- second largest USQCD resource
International Lattice Data Grid (ILDG):
- significantly leveraged computational capacity; a data exchange

Review Schedule
Overview:
- Project
- Computational requirements
- Technology: key parameters, trends, procurement strategy
- FY06 procurement details
- Operations
Context:
- QCDOC
- SciDAC cluster prototypes
- SciDAC software
- ILDG
Management, Cost & Schedule