
1 Opening and Overview
GriPhyN External Advisory Meeting, Marina del Rey, April 12, 2001
Paul Avery, University of Florida
http://www.phys.ufl.edu/~avery/ | avery@phys.ufl.edu

2 Who We Are
U Florida, U Chicago, Boston U, Caltech, U Wisconsin (Madison), USC/ISI, Harvard, Indiana, Johns Hopkins, Northwestern, Stanford, U Illinois at Chicago, U Penn, U Texas (Brownsville), U Wisconsin (Milwaukee), UC Berkeley, UC San Diego, San Diego Supercomputer Center, Lawrence Berkeley Lab, Argonne, Fermilab, Brookhaven

3 GriPhyN = App. Science + CS + Grids
- GriPhyN = Grid Physics Network
  - US-CMS: High Energy Physics
  - US-ATLAS: High Energy Physics
  - LIGO/LSC: Gravitational wave research
  - SDSS: Sloan Digital Sky Survey
  - Strong partnership with computer scientists
- Design and implement production-scale grids
  - Investigation of the "Virtual Data" concept (fig) (see the sketch after this slide)
  - Integration into 4 major science experiments
  - Develop common infrastructure, tools and services
  - Builds on existing foundations: Globus tools
- Multi-year project
  - Grid R&D
  - Development and deployment of "Tier 2" hardware and personnel (fig)
  - Education and outreach
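A minimal sketch of the "Virtual Data" idea referenced above, assuming a toy catalog interface (the class and function names here are hypothetical, not part of GriPhyN's actual toolkit): a derived data product is recorded as a transformation plus its inputs, and a request for the product either returns an existing replica or re-derives it on demand.

```python
# Illustrative sketch only: a toy "virtual data" catalog in which derived
# products are described by the recipe that makes them and materialized
# on demand.  Names and structure are hypothetical, not GriPhyN software.

class VirtualDataCatalog:
    def __init__(self):
        self.recipes = {}    # product name -> (transformation, input names)
        self.replicas = {}   # product name -> materialized data

    def register(self, name, transformation, inputs):
        """Record how a product can be derived, without computing it."""
        self.recipes[name] = (transformation, inputs)

    def add_replica(self, name, data):
        """Record an already-materialized copy (e.g. raw detector data)."""
        self.replicas[name] = data

    def materialize(self, name):
        """Return a replica if one exists, otherwise re-derive the product."""
        if name in self.replicas:
            return self.replicas[name]
        transformation, inputs = self.recipes[name]
        data = transformation(*[self.materialize(i) for i in inputs])
        self.replicas[name] = data      # cache for later requests
        return data

# Example: a "reconstructed" product derived from raw events on demand.
catalog = VirtualDataCatalog()
catalog.add_replica("raw_events", [1.0, 2.0, 3.0])
catalog.register("reco_events", lambda raw: [2 * x for x in raw], ["raw_events"])
print(catalog.materialize("reco_events"))   # [2.0, 4.0, 6.0]
```

The point of the abstraction is that a widely shared derived dataset need only be computed once and then replicated, while rarely used products can be regenerated from their recipes instead of being stored everywhere.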

4 GriPhyN Data Grid Challenge
"Global scientific communities, served by networks with bandwidths varying by orders of magnitude, need to perform computationally demanding analyses of geographically distributed datasets that will grow by at least 3 orders of magnitude over the next decade, from the 100 Terabyte to the 100 Petabyte scale."
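To make the quoted growth concrete: a factor of 1000 (3 orders of magnitude) spread over ten years corresponds to roughly a doubling of data volume every year. A two-line check of that arithmetic, for illustration only:

```python
# Quick check of the quoted growth: 100 TB -> 100 PB over ~10 years.
growth_factor = 100e15 / 100e12          # = 1000, i.e. 3 orders of magnitude
annual_factor = growth_factor ** (1 / 10)  # ~2.0, roughly doubling each year
print(growth_factor, round(annual_factor, 2))
```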

5 Data Grid Hierarchy
(Diagram: a tiered hierarchy of computing resources)
- Tier 0: CERN
- Tier 1: National Lab
- Tier 2: Regional Center at University
- Tier 3: University workgroup
- Tier 4: Workstation
GriPhyN: R&D; Tier 2 centers; unify all IT resources
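As a purely illustrative sketch (not GriPhyN software, and with made-up site and dataset names), the tier structure can be modeled as a tree in which a request from a Tier 4 workstation walks upward until it reaches a level that holds the needed dataset:

```python
# Illustrative only: the tiered model as a tree, where a request walks up
# from a Tier 4 workstation until it finds a level holding the dataset.

class Site:
    def __init__(self, name, tier, parent=None, datasets=()):
        self.name, self.tier, self.parent = name, tier, parent
        self.datasets = set(datasets)

    def locate(self, dataset):
        """Return the nearest site (self or an ancestor) holding the dataset."""
        site = self
        while site is not None:
            if dataset in site.datasets:
                return site
            site = site.parent
        return None

# A small slice of the hierarchy (site and dataset names are hypothetical).
cern      = Site("CERN", 0, datasets={"raw", "reco"})
lab       = Site("National Lab", 1, parent=cern, datasets={"reco"})
regional  = Site("Regional Tier 2", 2, parent=lab, datasets={"summary"})
workgroup = Site("University workgroup", 3, parent=regional)
desktop   = Site("Workstation", 4, parent=workgroup)

for d in ("summary", "reco", "raw"):
    print(d, "->", desktop.locate(d).name)
# summary -> Regional Tier 2, reco -> National Lab, raw -> CERN
```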

6 LHC Global Grid Hierarchy
(Diagram: the tiered LHC computing model, from the experiment at CERN down to physicists' workstations)
- Experiment / Online System: bunch crossings every 25 nsec; ~100 triggers per second; each event is ~1 MByte; physics data cache ~PBytes/sec
- Tier 0+1: CERN Computer Center (> 20 TIPS), fed at ~100 MBytes/sec
- Tier 1: national centers (USA, France, Italy, UK), connected at 2.5 Gbits/sec
- Tier 2: regional centers, connected at ~622 Mbits/sec
- Tier 3: institutes (~0.25 TIPS each), connected at 100-1000 Mbits/sec
- Tier 4: workstations and other portals
- Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels
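The headline rates in the diagram are mutually consistent, and a quick back-of-the-envelope check makes the scale concrete. The per-second figures below come straight from the slide; the 10^7 seconds of running time per year is an assumed, typical figure rather than something stated here.

```python
# Back-of-the-envelope check of the rates quoted in the diagram.
trigger_rate_hz = 100          # events accepted per second (from the slide)
event_size_bytes = 1e6         # ~1 MByte per event (from the slide)
raw_rate = trigger_rate_hz * event_size_bytes
print(raw_rate / 1e6, "MBytes/sec")                # ~100 MBytes/sec into Tier 0

seconds_per_year = 1e7         # assumed typical accelerator live time per year
print(raw_rate * seconds_per_year / 1e15, "PBytes/year")   # ~1 PByte/year stored
```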

7 GriPhyN I Funded (R&D)
- NSF results announced Sep. 13, 2000
  - $11.9M from the NSF Information Technology Research Program
  - $1.4M in matching funds from universities
  - Largest of all ITR awards
- Scope of ITR funding
  - Major costs are for people, especially students and postdocs
  - 2/3 CS + 1/3 application science
  - Industry partnerships needed to realize full scope (still being pursued)
- Education and outreach
  - Reach non-traditional students and other constituencies
  - University partnerships
  - Grids are "natural" for integrating intellectual resources from all locations
  - E/O led by UT Brownsville (Romano, Campanelli)

8 GriPhyN Management Needs
- GriPhyN is a complex project
  - 17 universities, SDSC, 3 labs, >40 active participants
  - 4 physics experiments providing frontier challenges
- GriPhyN I funded primarily as an IT research project
  - 2/3 CS + 1/3 physics
- Need to balance and coordinate
  - Research creativity with project goals and deliverables
  - The GriPhyN schedule with the 4 experiment schedules
  - GriPhyN design and architecture with those of other projects whose work will be used by LHC or other experiments (PPDG, EU DataGrid)
  - GriPhyN deliverables with those of other data grid projects

9 GriPhyN Management Organization
- Project Leadership
  - Project Directors: Paul Avery, Ian Foster
  - Project Coordinator (active search)
- Advisory Committees
  - Project Coordination Group (weekly meetings)
  - Collaboration Board (not met yet)
  - External Advisory Board (1-2 times per year)
- Coordinators
  - Industrial Programs
  - Outreach/Education
  - System Integration
- NSF Review Committee

10 (Organization chart)
- Project Directors: Paul Avery, Ian Foster
- Boards and external links: External Advisory Board; Collaboration Board (Chair: Paul Avery); NSF Review Committee; Major Physics Experiments; Other Grid Projects; Internet2, DOE Science, NSF PACIs
- Project Coordination Group: Project Coordinator; Industrial Programs (Alex Szalay); Outreach/Education (Joseph Romano); System Integration
- VD Toolkit Development (Coord.: M. Livny): Requirements Definition & Scheduling (Miron Livny); Integration & Testing (Carl Kesselman?); Documentation & Support (TBD)
- CS Research (Coord.: I. Foster): Execution Management (Miron Livny); Performance Analysis (Valerie Taylor); Request Planning & Scheduling (Carl Kesselman); Virtual Data (Reagan Moore)
- Applications (Coord.: H. Newman): ATLAS (Rob Gardner); CMS (Harvey Newman); LSC/LIGO (Bruce Allen); SDSS (Alexander Szalay)
- Technical Coordination Committee (Chair: J. Bunn): Networks (H. Newman, T. DeFanti); Databases (A. Szalay, M. Franklin); Visualization (T. DeFanti); Digital Libraries (R. Moore); Grids (C. Kesselman); Collaborative Systems (P. Galvez, R. Stevens)

11 GriPhyN Management Organization
- Technical Organization
  - Computer Science Research
  - Virtual Data Toolkit Development
  - Application (physics experiment) Projects
- Liaison with Experiments
  - Representatives on the Project Coordination Group
  - Subgroups in the Application Projects organization
  - Directors have direct contact with experiment computing leaders
- Liaison with Other Data Grid Projects
  - Common participants with PPDG
  - Cross-committee memberships with the EU DataGrid
  - Data Grid Coordination meetings: the first was March 4 in Amsterdam; the next is June 23 in Rome

12 A Common Infrastructure Opportunity
- Particle Physics Data Grid (US, DOE)
  - Data Grid applications for HENP
  - Funded 2000, 2001
  - http://www.ppdg.net/
- GriPhyN (US, NSF)
  - Petascale Virtual-Data Grids
  - Funded 9/2000 - 9/2005
  - http://www.griphyn.org/
- European Data Grid (EU)
  - Data Grid technologies, EU deployment
  - Funded 1/2001 - 1/2004
  - http://www.eu-datagrid.org/
- In common: HEP; focus on infrastructure development and deployment; international scope

13 Data Grid Project Collaboration
- GriPhyN + PPDG + EU DataGrid + national efforts (France, Italy, UK, Japan)
- Have agreed to collaborate and develop joint infrastructure
  - Initial meeting March 4 in Amsterdam to discuss issues
  - Future meetings in June and July
- Preparing a management document
  - Joint management and technical boards plus a steering committee
  - Coordination of people and resources
  - An expectation that this will lead to real work
- Collaborative projects
  - Grid middleware
  - Integration into applications
  - Grid testbed: iVDGL
  - Network testbed: T3 (Transatlantic Terabit Testbed)

14 iVDGL
- International Virtual-Data Grid Laboratory
  - A place to conduct Data Grid tests at scale
  - A concrete manifestation of worldwide grid activity
  - A continuing activity that will drive Grid awareness
  - A basis for further funding
- Scale of effort
  - National and international scale Data Grid tests and operations
  - Computationally and data intensive computing
  - Fast networks
- Who
  - Initially US, UK, EU
  - Other world regions later
  - Discussions with Russia, Japan, China, Pakistan, India, South America

15 Status of Data Grid Projects
- GriPhyN
  - $12M funded by the NSF ITR 2000 program (5-year R&D)
  - 2001 supplemental funds requested for initial deployments
  - Submitting a 5-year proposal ($15M) to NSF to deploy iVDGL
- Particle Physics Data Grid
  - Funded in 1999 and 2000 by DOE ($1.2M per year)
  - Submitting a 3-year proposal ($12M) to the DOE Office of Science
- EU DataGrid
  - €10M funded by the EU (3 years, 2001 - 2004)
  - Submitting a proposal in April for additional funds
- GridPP in the UK
  - Submitted proposal April 3 ($30M)
- Japan, others?

16 GriPhyN Activities Since Sept. 2000
- All-hands meeting Oct. 2-3, 2000
- Architecture meeting Dec. 20
- Smaller meetings between CS and experiment groups
- Preparation of requirements documents by the experiments
- Architecture document(s)
- Included in the architecture definition for the EU DataGrid
- Mar. 4 meeting to discuss collaboration of Grid projects
- All-hands meeting April 9, 2001
- Hiring still proceeding (2/3 finished)
- Submitting new proposal Apr. 25, 2001

17 Discussion Points
- Maintaining the right balance between research and development
- Maintaining focus vs. accepting broader scope
  - E.g., international collaboration
  - E.g., GriPhyN in the large (GriPhyN II)
  - E.g., Terascale
- Creating a national cyberinfrastructure
  - What is our appropriate role?

18 Discussion Points
- Outreach to other disciplines
  - Biology, NEES, ...
- Outreach to other constituencies
  - Small universities, K-12, the public, international, ...
- Virtual Data Toolkit
  - Inclusive or focused?
  - Resource issue, again
- Achieving critical mass of resources to deliver on the complete promise

