Particle Physics Data Grid (PPDG) project – Les Cottrell, SLAC. Presented at the NGI workshop, Berkeley, 7/21/99.

1 Particle Physics Data Grid (PPDG) project. Les Cottrell – SLAC. Presented at the NGI workshop, Berkeley, 7/21/99

2 Who are we?
Collaboration of 9 sites:
–6 DoE labs: ANL, BNL, FNAL, JLab, LBNL, SLAC
–3 universities: Caltech, U. Wisconsin, SDSC
Collaboration of:
–particle physicists: long history of large collaborations; actively promote WANs; pioneers in the use of networks, collaborations and large data sets
–computer scientists: researching and developing tools & infrastructure

3 Project Objectives
Delivery of infrastructure for:
–widely distributed analysis of particle physics data
–at multi-petabyte scales
–by thousands of physicists
Acceleration of the development of:
–network & middleware infrastructure aimed at data-intensive collaborative science

4 Problem
Analysis of particle physics data:
–the scale of accelerators & detectors means there are only a few experimental sites
–so large collaborations (hundreds to thousands) of physicists, from tens to hundreds of sites in tens of countries
–creating large data sets of hundreds of Terabytes to Petabytes over the next decade
–some sites are already taking data (JLab, SLAC), some will turn on soon (FNAL); the LHC at CERN starts data taking in 2005 and is creating simulation data today

5 Goal & relation to NGI Goal (impossible dream): –researchers at universities all over US could query & retrieve hundreds of Terabytes within seconds More than a bandwidth problem –requires creation of intelligent networks & network aware applications Stimulates NGI in 3 areas –DiffServ to allow bulk data to co-exist –Distributed caching –Robustness of distributed services

6 Contributions
ANL: Globus grid middleware services
SLAC: Objectivity Open File System (OOFS)
Caltech: Globally Interconnected Object Databases (GIOD) project
FNAL: a data access framework (Sequential Access Method, SAM)
LBNL: Storage Access Coordination System (STACS)
Wisconsin: Condor distributed resource management system
SDSC: Storage Resource Broker (SRB)
Aggregate resources: 2 PBytes robotic storage (6 sites), ~1300 Giga-ops/sec (8 sites), ~40 TB disk cache (7 sites)
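
The slides do not show any of this middleware's APIs; as a purely hypothetical sketch of the kind of service such tools (e.g. a replica/metadata catalog) provide, mapping a logical file name to physical replicas and preferring a nearby cache, with every name, site, and path below invented for illustration:

```python
# Hypothetical replica-catalog sketch: the sort of lookup a grid data
# handling service performs.  All file names, sites, and paths are
# invented; this is not the API of any listed system.
REPLICAS = {
    "babar/run1234/dst.root": [
        ("slac.stanford.edu", "/hpss/babar/run1234/dst.root"),
        ("remote-center.example", "/cache/babar/run1234/dst.root"),
    ],
}

# Hypothetical "distance" ranking of sites from the requesting client.
SITE_PREFERENCE = {"slac.stanford.edu": 0, "remote-center.example": 1}

def resolve(logical_name: str):
    """Return the preferred physical replica for a logical file name."""
    replicas = REPLICAS.get(logical_name, [])
    if not replicas:
        raise FileNotFoundError(logical_name)
    return min(replicas, key=lambda r: SITE_PREFERENCE.get(r[0], 99))

print(resolve("babar/run1234/dst.root"))
# -> ('slac.stanford.edu', '/hpss/babar/run1234/dst.root')
```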

7 Applications (physics data sources)
BNL: RHIC data
FNAL: Run II / D0 & CDF
SLAC: PEP-II / BaBar
Caltech, ANL: LHC / ATLAS & CMS
JLab: nuclear physics experiments
Needs:
–bulk transfer of raw data to, and reconstructed data between, regional data centers; today this uses tapes, introducing delays of ~2 weeks
–code builds at multiple sites
–interactive access to analyzed events
–many hundreds of Mbps, growing to several Gbps over the next few years
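
A rough illustration (with assumed data volumes and link speeds, not figures from the slides) of why the tape turnaround matters and why hundreds of Mbps to Gbps are needed: moving 10 TB of reconstructed data over a 300 Mbps path at full rate takes about three days, already well under the ~2-week tape delay, and a few Gbps brings it down to hours.

```python
# Illustrative comparison of network transfer time vs. the ~2-week
# delay of shipping tapes.  Data volume and link speeds are assumed
# for illustration only.
def transfer_days(terabytes: float, mbps: float) -> float:
    """Days to move `terabytes` of data over an `mbps` link at full rate."""
    bits = terabytes * 1e12 * 8
    seconds = bits / (mbps * 1e6)
    return seconds / 86400

for mbps in (100, 300, 1000, 2500):
    print(f"{mbps:>5} Mbps: {transfer_days(10, mbps):5.1f} days for 10 TB")
# e.g. 300 Mbps -> ~3.1 days, 2.5 Gbps -> ~0.4 days, vs ~14 days by tape
```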