Slide 1: Grids for US-CMS and CMS
Paul Avery, University of Florida (avery@phys.ufl.edu)
US-CMS Meeting (UC Riverside), May 19, 2001
Slide 2: LHC Data Grid Hierarchy
[Diagram: Tier 0 at CERN feeding Tier 1 sites, which in turn feed Tier 2, Tier 3, and Tier 4 sites]
- Tier 0: CERN
- Tier 1: National laboratory
- Tier 2: Regional center at a university
- Tier 3: University workgroup
- Tier 4: Workstation
Tasks: R&D, Tier 2 centers, software integration, unified IT resources
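As a compact companion to the hierarchy above, the sketch below records the tier roles from this slide as a small lookup table. It is purely illustrative; the names come from the slide, not from any CMS software.

```python
# Tier roles of the LHC Data Grid hierarchy, as listed on this slide.
TIER_ROLES = {
    0: "CERN",
    1: "National laboratory",
    2: "Regional center at a university",
    3: "University workgroup",
    4: "Workstation",
}

def describe(tier: int) -> str:
    """Return a human-readable label for a tier level."""
    return f"Tier {tier}: {TIER_ROLES[tier]}"

for level in sorted(TIER_ROLES):
    print(describe(level))
```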
Slide 3: CMS Grid Hierarchy
[Diagram: the experiment's online system feeds the CERN Tier 0+1 computer center (> 20 TIPS), which feeds national Tier 1 centers (USA, France, Italy, UK), Tier 2 regional centers, Tier 3 institutes (~0.25 TIPS), and Tier 4 workstations and other portals; quoted bandwidths run from ~PBytes/sec and ~100 MBytes/sec at the experiment and online system through 2.5-10 Gbits/sec, 0.6-2.5 Gbits/sec, and 0.1-1 Gbits/sec between tiers, with >10 Gbits/sec to the physics data cache]
- Bunch crossings every 25 nsec; 100 triggers per second; ~1 MByte per event
- Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels
- CERN/outside resource ratio ~1:2; Tier0 : (sum of Tier1) : (sum of Tier2) ~ 1:1:1
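The trigger rate and event size quoted above fix the ~100 MBytes/sec recorded data rate; here is a minimal arithmetic sketch of that cross-check. The 1e7-second running year used for the yearly total is an added assumption, not a number from the slide.

```python
# Rough cross-check of the rates quoted on the slide (illustrative arithmetic only).
bunch_crossing_period_s = 25e-9   # bunch crossings every 25 nsec
trigger_rate_hz = 100             # ~100 triggered events recorded per second
event_size_bytes = 1e6            # ~1 MByte per event

crossing_rate_hz = 1 / bunch_crossing_period_s        # ~4e7 crossings/sec
recorded_rate = trigger_rate_hz * event_size_bytes    # bytes/sec written out

print(f"Bunch crossing rate: {crossing_rate_hz:.0e} Hz")
print(f"Recorded data rate:  {recorded_rate / 1e6:.0f} MBytes/sec")   # ~100 MB/s
# Assumption (not on the slide): a nominal 1e7-second running year.
print(f"Data per year:       {recorded_rate * 1e7 / 1e15:.0f} PBytes")
```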
Slide 4: Grid Projects
- Funded projects:
  - GriPhyN (USA): NSF, $11.9M
  - PPDG I (USA): DOE, $2M
  - PPDG II (USA): DOE, $9.5M
  - EU DataGrid (EU): $9.3M
- Proposed projects:
  - iVDGL (USA): NSF, $15M + $1.8M + UK
  - DTF (USA): NSF, $45M + $4M/yr
  - DataTag (EU): EC, $?
- Other national projects:
  - PPARC e-Science (UK): PPARC, $40M
  - UK e-Science (UK): > $100M
  - Italy, France, Japan?
  - EU networking initiatives (Géant, Danté, SURFNet)
Slide 5: Major Grid News Since May 2000
- Sep. 2000: GriPhyN proposal approved ($11.9M)
- Nov. 2000: First outline of US-CMS Tier2 plan
- Nov. 2000: Caltech-UCSD proto-T2 hardware installed
- Dec. 2000: Submit iVDGL preproposal to NSF
- Jan. 2001: EU DataGrid approved ($9.3M)
- Mar. 2001: 1st Grid coordination meeting
- Mar. 2001: Submit PPDG proposal to DOE ($12M)
- Apr. 2001: Submit iVDGL proposal to NSF ($15M)
- Apr. 2001: Submit DTF proposal to NSF ($45M, $4M/yr)
- Apr. 2001: Submit DataTag proposal to EU
- May 2001: PPDG proposal approved ($9.5M)
- May 2001: Initial hardware for Florida proto-T2 installed
- Jun. 2001: 2nd Grid coordination meeting
- Aug. 2001: DTF approved?
- Aug. 2001: iVDGL approved?
Slide 6: Grid Timeline
[Timeline figure spanning Q2 2000 through Q3 2001, plotting the milestones listed on the previous slide plus submission of the GriPhyN proposal ($12.5M)]
Slide 7: Why Do We Need All These Projects?
- Agencies see LHC Grid computing in a wider context (next slide)
- DOE priorities: LHC, D0, CDF, BaBar, RHIC, JLAB; computer science; ESNET
- NSF priorities: computer science; networks; LHC, other physics, astronomy; other basic sciences; education and outreach; international reach; support for universities
Slide 8: Projects (cont.)
- We must justify the investment: benefit to a wide scientific base; education and outreach; oversight from Congress always present; a much more competitive funding environment
- We have no choice anyway: this is the mechanism by which we will get funds
- Cons: diverts effort from the mission; makes management more complex
- Pros: exploits initiatives, brings new funds and facilities (e.g., DTF); drives deployment of high-speed networks; brings many new technologies and tools; attracts attention and help from computing experts and vendors
Slide 9: US-CMS Grid Facilities
- Caltech-UCSD implemented a proto-Tier2 (Fall 2000): 40 dual PIII boxes in racks; RAID disk; tape resources
- Florida is now implementing the second proto-Tier2: 72 dual PIII boxes in racks; inexpensive RAID; ready June 1, 2001 for production?
- Fermilab is about to purchase equipment (Vivian)
- Distributed Terascale Facility (DTF): not yet approved; MOUs being signed with GriPhyN and CMS; massive CPU and storage resources at 4 sites, 10 Gb/s networks; early prototype of a 2006 Tier1
Slide 10: Particle Physics Data Grid
- Recently funded at $9.5M for 3 years (DOE MICS/HENP)
- High Energy and Nuclear Physics projects (DOE labs)
- Database/object replication, caching, catalogs, end-to-end
- Practical orientation: networks, instrumentation, monitoring
Slide 11: PPDG: Remote Database Replication
- First-round goal: optimized cached read access to 10-100 GBytes drawn from a total data set of 0.1 to ~1 Petabyte
- Matchmaking, co-scheduling: SRB, Condor, Globus services; HRM, NWS
[Diagram: a Site-to-Site Data Replication Service (~100 MBytes/sec) between a primary site (data acquisition, CPU, disk, tape robot) and a secondary site (CPU, disk, tape robot), plus a Multi-Site Cached File Access Service linking the primary site, satellite sites (tape, CPU, disk, robot), and university sites (CPU, disk, users)]
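To make the cached-read-access idea above concrete, here is a minimal sketch of a multi-site cached file access service: serve from the local cache when possible and pull a replica from a remote site only on a miss. This is purely illustrative; the real PPDG work builds on SRB, Condor, and Globus services, and the paths, site list, and helper function below are hypothetical.

```python
import os
import shutil

# Hypothetical replica locations (nearest first) and cache path; the actual PPDG
# services use SRB/Globus catalogs and grid transfer tools, not local directories.
REPLICA_SITES = ["/mnt/satellite_site", "/mnt/primary_site"]
LOCAL_CACHE = "/var/cache/ppdg"

def cached_open(logical_name: str):
    """Open a file, copying it into the local cache first if it is not already there."""
    cached_path = os.path.join(LOCAL_CACHE, logical_name)
    if not os.path.exists(cached_path):                 # cache miss
        for site in REPLICA_SITES:                      # try the nearest replica first
            remote_path = os.path.join(site, logical_name)
            if os.path.exists(remote_path):
                os.makedirs(os.path.dirname(cached_path), exist_ok=True)
                shutil.copy(remote_path, cached_path)   # stand-in for a grid transfer
                break
        else:
            raise FileNotFoundError(f"no replica of {logical_name} found")
    return open(cached_path, "rb")
```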
Slide 12: EU DataGrid Project
Slide 13: GriPhyN = App. Science + CS + Grids
- GriPhyN = Grid Physics Network: US-CMS (high energy physics), US-ATLAS (high energy physics), LIGO/LSC (gravity wave research), SDSS (Sloan Digital Sky Survey); strong partnership with computer scientists
- Design and implement production-scale grids: develop common infrastructure, tools, and services (Globus based); integration into the 4 experiments; broad application to other sciences via the "Virtual Data Toolkit"
- Multi-year project: R&D for grid architecture (funded at $11.9M); "Tier 2" center hardware and personnel; integrate Grid infrastructure into the experiments
Slide 14: GriPhyN Institutions
U Florida, U Chicago, Boston U, Caltech, U Wisconsin (Madison), USC/ISI, Harvard, Indiana, Johns Hopkins, Northwestern, Stanford, U Illinois at Chicago, U Penn, U Texas (Brownsville), U Wisconsin (Milwaukee), UC Berkeley, UC San Diego, San Diego Supercomputer Center, Lawrence Berkeley Lab, Argonne, Fermilab, Brookhaven
Slide 15: GriPhyN: PetaScale Virtual Data Grids
[Architecture diagram: production teams, research groups, and individual investigators use interactive user tools built on Virtual Data Tools, Request Planning & Scheduling Tools, and Request Execution & Management Tools; these rely on Resource Management Services, Security and Policy Services, and other Grid services, and apply transforms to raw data sources and distributed resources (code, storage, computers, and networks)]
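The central idea behind the diagram above is virtual data: a requested product is returned directly if it has already been materialized, and otherwise its recorded transformation is planned and executed on grid resources, then cataloged for reuse. The following is a minimal sketch under that reading; the catalog, transform registry, and execution stub are hypothetical illustrations, not GriPhyN's actual tools.

```python
# Hypothetical in-memory stand-ins for a replica catalog and a transformation registry.
materialized = {"raw_run_42": b"...detector data..."}           # data that already exists
transforms = {"reco_run_42": ("reconstruct", ["raw_run_42"])}   # recipe: (program, inputs)

def request(product: str) -> bytes:
    """Return a data product, materializing it from its recorded transform if needed."""
    if product in materialized:                       # already produced: just return it
        return materialized[product]
    program, inputs = transforms[product]             # otherwise look up its derivation
    input_data = [request(name) for name in inputs]   # recursively materialize inputs
    result = execute_on_grid(program, input_data)     # plan/schedule/execute (stub below)
    materialized[product] = result                    # catalog the new product for reuse
    return result

def execute_on_grid(program: str, inputs: list) -> bytes:
    # Stand-in for request planning, scheduling, and execution on distributed resources.
    return f"{program} applied to {len(inputs)} input(s)".encode()

print(request("reco_run_42"))   # materializes reco_run_42 from raw_run_42
```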
Slide 16: GriPhyN Progress
- New hires: 3 physicists at Florida (1 PD, 2 scientists); 0.5 Tier2 support person at Caltech
- CMS requirements document: 33 pages, K. Holtman
- Major meetings held (http://www.griphyn.org/): Oct. 2000 all-hands meeting; Dec. 2000 architecture meeting; Apr. 2001 all-hands meeting; Aug. 2001 applications meeting
- CMS and CS groups will need more frequent meetings: further develop requirements and update the architecture; distributed databases; more discussion of integration
Slide 17: Common Grid Infrastructure
- GriPhyN + PPDG + EU DataGrid + national efforts (France, Italy, UK, Japan)
- Have agreed to collaborate and develop joint infrastructure: initial meeting March 4 in Amsterdam to discuss issues; next meeting June 23
- Preparing a management document: joint management and technical boards plus a steering committee; coordination of people and resources; an expectation that this will lead to real work
- Collaborative projects: Grid middleware; integration into applications; Grid "laboratory": iVDGL; network testbed: T3 = Transatlantic Terabit Testbed
Slide 18: iVDGL
- International Virtual-Data Grid Laboratory: a place to conduct Data Grid tests "at scale"; a mechanism to create common Grid infrastructure; national- and international-scale Data Grid tests and operations
- Components: Tier1 sites (laboratories); Tier2 sites (universities and others); selected Tier3 sites (universities); Distributed Terascale Facility (DTF); fast networks in the US, Europe, and across the Atlantic
- Who: initially US-UK-EU, Japan, Australia; other world regions later; discussions with Russia, China, Pakistan, India, South America
Slide 19: iVDGL Proposal to NSF
- Submitted to the NSF ITR2001 program on April 25: ITR2001 is more application oriented than ITR2000; $15M over 5 years @ $3M per year (a severe constraint); CMS + ATLAS + LIGO + SDSS/NVO + computer science
- Scope of proposal: Tier2 hardware and Tier2 support personnel; integration of Grid software into applications; CS support teams (+ 6 UK Fellows); Grid Operations Center (iGOC); education and outreach (3 minority institutions)
- Budget (next slide): falls short of US-CMS Tier2 needs (Tier2 support staff); need to address this problem with NSF (Lothar, Irwin talks)
Slide 20: iVDGL Budget
Slide 21: iVDGL Map Circa 2002-2003
[Map figure; legend: Tier0/1 facility, Tier2 facility, Tier3 facility, 10 Gbps link, 2.5 Gbps link, 622 Mbps link, other link]
Slide 22: iVDGL as a Laboratory
- Grid exercises: "easy" intra-experiment tests first (20-40% of resources, national and transatlantic); "harder" wide-scale tests later (50-100% of all resources); CMS is already conducting transcontinental productions
- Local control of resources is vitally important: experiments and politics demand it; resource hierarchy: (1) national + experiment, (2) inter-experiment
- Strong interest from other disciplines: HENP experiments; Virtual Observatory (VO) community in Europe/US; gravity wave community in Europe/US/Australia/Japan; earthquake engineering; bioinformatics; our CS colleagues (wide-scale tests)