Fermilab Grid Computing – CDF, D0 and more
Victoria A. White, Head, Computing Division, Fermilab



ICFA-HEP Networking, Grids and Digital Divide – Vicky White

First – Apologies
1) For not being in Korea with you – two DOE reviews in the US this week
2) For my substitute speaker, Don Petravick, having to cancel his trip at the last moment
3) For a poorly prepared, last-minute talk as a result

Fermilab Ongoing Scientific Program, 2005
› CMS
› International Linear Collider
› LHC
› MiniBooNE
› MINOS
› Pierre Auger, CDMS
› Run II (CDF, D0, Accelerator and potential upgrades)
› Simulation for Accelerators
› Sloan Digital Sky Survey
› Test Beams and MIPP
› Theory Programs and Lattice QCD computational program
› R&D for SNAP
› R&D for DES, NOvA, FLARE, Minerva
› R&D for Linear Collider detectors
(* Grid related)

Inclusive Worldwide Collaboration is essential for Particle Physics
What are we doing at Fermilab to help this?
 Networks and network research
 Grids – for CMS, for Run II, SDSS, Lattice QCD
  Leadership role in Open Science Grid
  Lots of technical work with LCG
 Guest Scientist program
 Education and Outreach program
 Experiment sociology and leadership – changes
 Videoconferencing
 Virtual Control Rooms
 Physics Analysis Center for CMS
 Public Relations

Grid Projects and Working Groups
Fermilab is involved in numerous Grid projects and coordination bodies:
 PPDG, GriPhyN and iVDGL – the Trillium US Grid projects
 LHC Computing Grid bodies and working groups on Security, Operations Center, Grid Deployment, etc.
 SRM – storage systems standards
 Global Grid Forum – Physics Research Area
 Global Grid Forum – Security Research Area
 Open Science Grid
 Interoperability of Grids and operations of Grids
 UltraLight network research
 UltraNet/Lambda Station network and storage project
 And probably some I forgot…

Fermilab Grid Strategy
› Strategy (from a Fermi-centric view)
 Common approaches and technologies across our entire portfolio of experiments and projects
 FermiGrid – so we can share all Fermilab resources
› Strategy (from a global HEP view)
 Work towards common standards and interoperability
› At Fermilab we are working aggressively to put all of our computational and storage fabric on “the Grid” and to contribute constructively to the interoperability of the various Grid infrastructures.

CDF and D0 (and MINOS)
› Running experiments with working Monte Carlo, data processing and analysis systems, handling tens of TB of data per day
› CDF and D0 both now have global computing models
› All three experiments use SAM for data handling – metadata, reliable file transfer, data delivery, bookkeeping
› D0 uses SAM-Grid (SAM + the JIM workload manager) and has successfully reprocessed data offsite
› CDF uses CAFs, dCAFs and SAM, and will be taking the next steps towards Grid job submission
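The SAM role described above – resolving metadata queries to files, delivering them, and bookkeeping what each site has already consumed – can be sketched in miniature. This is an illustrative toy, not the real SAM API; all class and method names here are invented:

```python
# Toy sketch of a SAM-style data-handling catalogue (hypothetical names, not
# the real SAM interface): datasets are defined by metadata queries, and every
# file delivery is bookkept so a station never receives the same file twice
# and can resume an interrupted analysis.
from dataclasses import dataclass, field


@dataclass
class FileRecord:
    name: str
    metadata: dict                                   # e.g. {"run": 1, "stream": "bphysics"}
    delivered_to: set = field(default_factory=set)   # bookkeeping: stations served


class Catalogue:
    def __init__(self):
        self.files = []

    def declare(self, name, **metadata):
        """Register a file with its metadata."""
        self.files.append(FileRecord(name, metadata))

    def dataset(self, **query):
        """Resolve a metadata query to the list of matching files."""
        return [f for f in self.files
                if all(f.metadata.get(k) == v for k, v in query.items())]

    def deliver(self, station, **query):
        """Deliver only the files this station has not yet consumed."""
        fresh = [f for f in self.dataset(**query) if station not in f.delivered_to]
        for f in fresh:
            f.delivered_to.add(station)              # record the delivery
        return [f.name for f in fresh]


cat = Catalogue()
cat.declare("raw_001.dat", run=1, stream="bphysics")
cat.declare("raw_002.dat", run=1, stream="bphysics")
cat.declare("raw_003.dat", run=2, stream="qcd")

first = cat.deliver("glasgow", run=1)   # both run-1 files delivered
second = cat.deliver("glasgow", run=1)  # empty: bookkeeping remembers
```

The point of the sketch is the last two lines: delivery is idempotent per station, which is what lets tens of SAM stations worldwide share one catalogue without duplicating transfers.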

SAM-Grid
› SAM-Grid is a fully functional distributed computing infrastructure in use by D0, CDF and MINOS
› ~30 SAM stations worldwide active for D0
› ~20 SAM stations worldwide active for CDF
› D0 successfully carried out reprocessing of data at 6 sites worldwide (FermiNews article, February – 01/p1.html), and in the latest reprocessing has already handled even more events off-site
› SAM-Grid uses many common (and evolving!) Grid standards, will evolve further, and will soon run routinely on top of the LCG computing environment and Open Science Grid
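The workload-management idea behind that multi-site reprocessing – a broker matching each job to a participating site with free capacity, rather than queueing everything at Fermilab – can be illustrated with a minimal greedy matchmaker. This is a conceptual sketch only; the site names and slot counts are invented and this is not the actual JIM code:

```python
# Illustrative sketch (not the real JIM/SAM-Grid broker) of grid-style
# matchmaking: each job batch is assigned to whichever participating site
# currently advertises the most free slots, so reprocessing spreads across
# sites worldwide instead of saturating a single center.

def broker(jobs, sites):
    """Greedily assign each job to the site with the most free slots."""
    assignment = {}
    free = dict(sites)                     # don't mutate the caller's view
    for job in jobs:
        site = max(free, key=free.get)     # site advertising most capacity
        if free[site] == 0:
            raise RuntimeError("no free slots anywhere")
        assignment[job] = site
        free[site] -= 1                    # one slot consumed at that site
    return assignment


# Hypothetical sites and (tiny, for illustration) free-slot counts.
sites = {"FNAL": 4, "IN2P3": 3, "GridKa": 2, "NIKHEF": 1}
jobs = [f"reprocess_batch_{i:03d}" for i in range(10)]
plan = broker(jobs, sites)
```

With these numbers the ten batches fill the sites exactly in proportion to their advertised capacity, which is the essential behavior a grid broker adds over a dedicated local farm.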


CDF approach to Grids

CAF – successful and GUI popular


“Dedicated” needs to go to “Grid”


Global Grid Community

Federating Grids
As the US representative to the LHC Computing Grid’s Grid Deployment Board (LCG GDB), I have repeatedly pushed the case for the LHC Computing Grid to be a “Grid of Grids” – not a distributed computing center.
 Resources must be able to belong to more than one Grid – e.g. NorduGrid, UK Grid, Open Science Grid, TeraGrid, Brazil Grid?, University ABC Grid and FermiGrid!
 Grid infrastructures must learn to interoperate
 Governance, policies, security, resource usage and service level agreements must be considered at many levels, including at the level of a federation of Grids
› I recently called this a “Grid Bazaar” (ISGC2005)

Grid – it’s really about collaboration!
› It’s about sharing and building a vision for the future
 And it’s about getting connected
› It’s about the democratization of science
International Grid Workshop, Taiwan, May 2005