
HEP and Non-HEP Computing at a Laboratory in Transition
Richard P. Mount
Director: Scientific Computing and Computing Services
Stanford Linear Accelerator Center
CHEP 2007, September 6, 2007

The Transition

1962: SLAC founded to construct and exploit a 2-mile linear accelerator
1973: SSRL founded to exploit synchrotron radiation from SLAC's SPEAR storage ring
2003: KIPAC (Kavli Institute for Particle Astrophysics and Cosmology) founded at SLAC/Stanford
2005: LCLS (Linac Coherent Light Source) high-energy X-ray laser construction approved
2008: Last year of BaBar data taking; last year of operation of SLAC accelerators for experimental HEP
2009: BES (Photon Science funding) takes ownership of SLAC accelerators; BES becomes the largest funding source for SLAC; LCLS operation begins
The future:
– A vigorous Photon Science program
– A vigorous Astrophysics and Cosmology program
– A potentially vigorous HEP program

Cultural Evolution in SLAC Computing

Experimental HEP:
– Large, organized collaborations
– Computing recognized as vital
– Detailed planning for computing
– Expectation of highly professional computing
Astrophysics and Cosmology (Theory):
– Individual PIs or small collaborations
– Computing recognized as vital
– Desire for agility
– "Professional" approach viewed as ponderous and costly
Astronomy:
– Increasingly large collaborations and HEP-like timescales
– Computing recognized as vital
– Detailed planning for computing (8 years in advance of data)
– Expectation of highly professional computing
Photon Science:
– Light sources are costly facilities
– Photon science has been "small science" up to now ("turn up for 3 days and take away a DVD")
– No large, organized collaborations
– Computing not considered a problem by most scientists
– Detailed planning would be a waste of time
– "Wait for the crisis and then we will get the resources to fix the problems"

Technical Evolution in SLAC Computing

Experimental HEP:
– Throughput-oriented data processing, trivial parallelism (see the sketch after this slide)
– Commodity CPU boxes
– Ethernet switches ideal
– Experiment-managed bulk disk space
– Challenging data-management needs
Astrophysics and Cosmology (Theory):
– Visualization
– Parallel computing
– Shared-memory SMP, plus Infiniband MPI clusters
– Lustre and other "fancy" file systems
– Macs and Linux boxes
Astronomy:
– Visualization
– Challenging data-management needs
– Lustre and other "fancy" file systems
– Macs and Linux boxes
Photon Science:
– Computer-science challenges (extract information from a million noisy images)
– MPI clusters needed now
– Significant needs for bandwidth to storage
– Other needs unclear
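The "trivial parallelism" of experimental HEP means events are statistically independent: a dataset can be split across many commodity boxes with no inter-worker communication, which is why batch farms and Ethernet suffice where the theory groups need Infiniband and MPI. A minimal sketch of the pattern, with a toy stand-in analysis (this is illustrative, not SLAC's actual framework):

```python
# Minimal sketch of HEP-style "trivially parallel" throughput computing:
# each chunk holds statistically independent events, so workers never
# communicate. Toy stand-in analysis, not SLAC's actual framework.
from multiprocessing import Pool
import random

def process_chunk(seed):
    """Analyze one chunk of independent events (simulated here)."""
    rng = random.Random(seed)
    events = (rng.gauss(0.0, 1.0) for _ in range(100_000))
    # A toy "selection cut" standing in for real physics analysis.
    return sum(1 for e in events if abs(e) > 2.0)

if __name__ == "__main__":
    chunks = range(16)                 # 16 independent input chunks
    with Pool() as pool:               # one worker process per CPU core
        per_chunk = pool.map(process_chunk, chunks)
    print("selected events:", sum(per_chunk))  # merging is a trivial sum
```

The contrast with the MPI clusters above is the point: a tightly coupled simulation exchanges data between workers every timestep, so interconnect latency, not per-core throughput, sets the scale of the machine.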

DOE Scientific Computing Annual Funding at SLAC

Particle and Particle Astrophysics:
– $14M SCCS
– $5M other
Photon Science:
– $0M SCCS (ramping up real soon now)
– $1M SSRL?
Computer Science:
– $1.5M

SLAC Scientific Computing

BaBar Experiment (winds down):
– Science goals: measure billions of collisions to understand matter-antimatter asymmetry (why matter exists today)
– Computing techniques: high-throughput data processing, trivially parallel computation, heavy use of disk and tape storage
Experimental HEP (LHC/ATLAS, ILC …):
– Science goals: analyze petabytes of data to understand the origin of mass
– Computing techniques: high-throughput data processing, trivially parallel computation, heavy use of disk and tape storage
Accelerator Science:
– Science goals: current, world-leading simulation of electromagnetic structures; future, simulate accelerator (e.g. ILC) behavior before construction and during operation
– Computing techniques: parallel computation, visual analysis of large data volumes
Particle Astrophysics (mainly simulation):
– Science goals: star formation in the early universe, colliding black holes, …
– Computing techniques: parallel computation (SMP and cluster), visual analysis of growing volumes of data
Particle Astrophysics major projects (GLAST, LSST …):
– Science goals: analyze terabytes to petabytes of data to understand the dark matter and dark energy riddles
– Computing techniques: high-throughput data processing, very large databases, visualization
Photon Science (biology, physics, chemistry, environment, medicine):
– Science goals: current, SSRL, a state-of-the-art synchrotron radiation lab; future, LCLS, femtosecond X-ray pulses, "ultrafast" science, structure of individual molecules …
– Computing techniques: medium-scale data analysis, simulation, and visualization today; high-throughput data analysis, large-scale simulation, and advanced visualization for LCLS
PetaCache (new architectures for SLAC science):
– Science goals: radical new approaches to computing for Stanford-SLAC data-intensive science
– Computing techniques: current focus on massive solid-state storage for high-throughput, low-latency data analysis

Power and Cooling

– Near future: difficult
– Medium term: very difficult
– Long term: totally scary

Near Future

[Chart: near-future additions to power and cooling capacity: move of the PetaCache prototype, Rittal water-cooled racks (at used capacity), Sun BlackBox (installed load), new 2nd-floor PMMS (dual-sourced loads), and Sun BlackBox #2 (installed load).]

Medium Term

New High Power Density Computer Center

And the Longer-Term Future?

– Six major programs, each justifying $millions/year of hardware
– Assume a modest level of funding success: $6M/year
– Current power consumed:
  – CPU: 0.08 Watts/$
  – Disk: 0.05 Watts/$
  – Cooling systems: ~50% on top of these
– Future evolution assumptions (a worked projection follows below):
  1. Watts/$ is flat
  2. Watts/$ doubles every 5 years (consistent with recent experience)
  3. Watts/$ doubles every 2 years (à la James Sexton)
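These scenarios turn into a rough projection with a few lines of arithmetic. A minimal sketch: the Watts/$ starting points and the 50% cooling overhead come from the slide, while the 50/50 CPU/disk split of the budget and the 4-year hardware lifetime are illustrative assumptions, not figures from the talk:

```python
# Rough projection of SLAC computing power draw under the three
# Watts/$ scenarios. The starting Watts/$ values and 50% cooling
# overhead are from the slide; the 50/50 CPU/disk budget split and
# 4-year hardware lifetime are illustrative assumptions.
CPU_W_PER_DOLLAR = 0.08
DISK_W_PER_DOLLAR = 0.05
ANNUAL_SPEND = 6e6          # dollars/year of equipment purchases
COOLING_OVERHEAD = 0.5      # cooling adds ~50% on top of equipment power
LIFETIME_YEARS = 4          # assumed: gear retired after 4 years

def projected_megawatts(doubling_period_years, horizon_years):
    """Total MW drawn at the horizon by all gear still in service."""
    total_w = 0.0
    for age in range(LIFETIME_YEARS):
        year_bought = horizon_years - age
        growth = (2 ** (year_bought / doubling_period_years)
                  if doubling_period_years else 1.0)   # None means flat
        w_per_dollar = (0.5 * CPU_W_PER_DOLLAR
                        + 0.5 * DISK_W_PER_DOLLAR) * growth
        total_w += ANNUAL_SPEND * w_per_dollar
    return total_w * (1 + COOLING_OVERHEAD) / 1e6

for label, period in [("flat", None),
                      ("doubling every 5y", 5),
                      ("doubling every 2y", 2)]:
    mw = projected_megawatts(period, horizon_years=10)
    print(f"Watts/$ {label}: ~{mw:.1f} MW after 10 years")
```

With these assumptions the flat case stays near 2 MW, while the 2-year-doubling case lands near 50 MW, which is the slide's point: the power bill, not the hardware budget, becomes the binding constraint.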

[Chart: projected SLAC computing power (MWatts) assuming $6M/year of equipment purchases, comparing Watts/$ doubling every 5 years ("imaginable") with doubling every 2 years ("hard to imagine"); the power bill eventually exceeds the SLAC budget, and power/cooling installations cost about twice the power bill.]

Short Term Fixes for Power and Cooling

Rittal water-cooled racks:
– Eight 40RU racks, up to ~25KW per rack
– Online end June
Sun BlackBox:
– Holds 252 1RU servers plus networking
– Up to 200KW payload possible
– Online September 17
Replace "old?" "junk?":
– Old can mean > 2 years old
– Junk can mean P4-based machines
– Sun "Thumpers" are 4 to 5 times as power-efficient per TB as our 2-year-old Sun disks + servers (a back-of-envelope comparison follows below)
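The per-TB efficiency claim is easiest to frame as a watts-per-terabyte ratio. A minimal sketch with purely illustrative placeholder numbers (these are not measured SLAC or Sun specifications):

```python
# Watts-per-TB comparison of a dense storage box vs. an older
# disk+server pair. All numbers are illustrative placeholders,
# not measured SLAC or Sun specifications.
def watts_per_tb(total_watts, total_tb):
    return total_watts / total_tb

old_pair = watts_per_tb(total_watts=400.0, total_tb=2.0)    # hypothetical old disk+server
dense_box = watts_per_tb(total_watts=1000.0, total_tb=24.0)  # hypothetical "Thumper"-class box

print(f"old: {old_pair:.0f} W/TB, dense: {dense_box:.0f} W/TB, "
      f"ratio: {old_pair / dense_box:.1f}x")
# ~4.8x with these placeholder numbers, in line with the slide's 4-5x claim
```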

Rittal Water-Cooled Racks

– 30cm-wide heat-exchanger racks between each equipment rack
– 20KW maximum per equipment rack
– 8 racks installed (photograph shows four)
Contents, CPU for:
– ATLAS Tier 2 (June 2007)
– GLAST (June 2007)
– BaBar, Noma/Tori replacement (September 2007)
– ATLAS Tier 2 (end 2007)


New Science Directions: Astronomy, Astrophysics and Cosmology

[Image slides credited to Tom Abel/KIPAC.]

[Image slide credited to Phil Marshall/UCSB.]

New Science Directions: Photon Science
– LCLS: Linac Coherent Light Source
– LUSI: LCLS Ultrafast Science Instruments

Photon Science at LCLS: Opportunities on Several Fronts

– LCLS will begin operations in 2009
– The LCLS DAQ will have LHC-like rates to storage (a back-of-envelope estimate follows below):
  – Design of the complete DAQ systems is just starting
– LCLS offline computing has many "opportunities":
  – Opportunity to define where offline analysis will be done
  – Opportunity to define the scale of offline analysis (e.g. will Coherent Crystal Imaging need 40,000 cores?)
  – Opportunity to create science/computer-science teams to write algorithms and framework(s)
  – Opportunity to acquire funding for these activities
– Opportunity to make a small-science/big-science transition successful, and fun
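To see why the DAQ rates are "LHC-like", it helps to put numbers on a pulse-rate times image-size estimate. A minimal sketch: the 120 Hz repetition rate is the LCLS design value, while the per-pulse readout size and duty factor are illustrative assumptions:

```python
# Back-of-envelope DAQ-to-storage rate estimate for an LCLS-like
# instrument. 120 Hz is the LCLS design repetition rate; the per-pulse
# readout size and duty factor are illustrative assumptions.
PULSE_RATE_HZ = 120          # LCLS design repetition rate
IMAGE_MB = 8.0               # assumed: one detector readout per pulse
DUTY_FACTOR = 0.3            # assumed fraction of the year taking beam

rate_mb_s = PULSE_RATE_HZ * IMAGE_MB
seconds_per_year = 365 * 24 * 3600
pb_per_year = rate_mb_s * seconds_per_year * DUTY_FACTOR / 1e9

print(f"sustained rate: {rate_mb_s:.0f} MB/s "
      f"(~{rate_mb_s / 1000:.1f} GB/s), ~{pb_per_year:.1f} PB/year")
```

With these assumptions the sustained rate is about 1 GB/s and roughly 9 PB/year, which is indeed the scale of an LHC experiment's raw-data stream.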


In Conclusion

What's exciting about scientific computing at SLAC?
– The new adventure of the LCLS program, with its many computing challenges
– The new adventure of Particle Astrophysics and Cosmology, with its unbounded appetite for computing
– The old adventure of high-energy-physics experimentation, with its constant new challenges
– The dream of fully simulating accelerators
– Allowing this excitement to inspire new developments and wild ideas in computing that will greatly increase the success of the science program