GriPhyN EAC Meeting (Apr. 12, 2001): Opening and Overview
Paul Avery, University of Florida

Slide 1: Opening and Overview
GriPhyN External Advisory Meeting
Marina del Rey, April 12, 2001
Paul Avery, University of Florida

Slide 2: Who We Are
U Florida • U Chicago • Boston U • Caltech • U Wisconsin, Madison • USC/ISI • Harvard • Indiana • Johns Hopkins • Northwestern • Stanford • U Illinois at Chicago • U Penn • U Texas, Brownsville • U Wisconsin, Milwaukee • UC Berkeley • UC San Diego • San Diego Supercomputer Center • Lawrence Berkeley Lab • Argonne • Fermilab • Brookhaven

Slide 3: GriPhyN = App. Science + CS + Grids
• GriPhyN = Grid Physics Network
  – US-CMS: High Energy Physics
  – US-ATLAS: High Energy Physics
  – LIGO/LSC: Gravity wave research
  – SDSS: Sloan Digital Sky Survey
  – Strong partnership with computer scientists
• Design and implement production-scale grids
  – Investigation of "Virtual Data" concept (fig)
  – Integration into 4 major science experiments
  – Develop common infrastructure, tools and services
  – Builds on existing foundations: Globus tools
• Multi-year project
  – Grid R&D
  – Development, deployment of "Tier 2" hardware, personnel (fig)
  – Education & outreach

Slide 4: GriPhyN Data Grid Challenge
"Global scientific communities, served by networks with bandwidths varying by orders of magnitude, need to perform computationally demanding analyses of geographically distributed datasets that will grow by at least 3 orders of magnitude over the next decade, from the 100 Terabyte to the 100 Petabyte scale."
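As a quick sanity check on the quoted growth (an illustrative calculation, not part of the original slides), three orders of magnitude over a decade works out to roughly a doubling of data volume every year:

```python
# Illustrative back-of-the-envelope check of the quoted dataset growth
# (not from the slides themselves).
start_tb = 100            # ~100 Terabytes today
end_tb = 100 * 1000       # ~100 Petabytes in a decade (1 PB = 1000 TB)
years = 10

growth_factor = end_tb / start_tb             # 1000x, i.e. 3 orders of magnitude
annual_factor = growth_factor ** (1 / years)  # compounded annual growth

print(f"Total growth: {growth_factor:.0f}x over {years} years")
print(f"Implied annual growth: ~{annual_factor:.2f}x per year")  # ~2.0x
```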

Slide 5: Data Grid Hierarchy
• Tier 0: CERN
• Tier 1: National Lab
• Tier 2: Regional Center at University
• Tier 3: University workgroup
• Tier 4: Workstation
GriPhyN:
• R&D
• Tier 2 centers
• Unify all IT resources
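As a minimal sketch of what the tiered model buys an analysis job, the snippet below encodes the tier roles from the slide together with a simple fall-back lookup. The lookup policy, holdings, and dataset names are assumptions for illustration only, not a GriPhyN design:

```python
# Illustrative sketch of the tiered model from the slide; the fallback-lookup
# policy and sample data below are assumptions, not a GriPhyN specification.
TIERS = {
    0: "CERN",
    1: "National Lab",
    2: "Regional Center at University",
    3: "University workgroup",
    4: "Workstation",
}

def find_dataset(dataset, local_tier, holdings):
    """Look for a dataset at the local tier, falling back toward Tier 0."""
    for tier in range(local_tier, -1, -1):   # walk up: local -> ... -> 0
        if dataset in holdings.get(tier, set()):
            return tier, TIERS[tier]
    raise LookupError(f"{dataset} not found anywhere in the hierarchy")

# Example: a workgroup (Tier 3) requests data held only at a Tier 1 national lab.
holdings = {0: {"raw-2001"}, 1: {"raw-2001", "reco-2001"}, 2: set(), 3: set()}
print(find_dataset("reco-2001", local_tier=3, holdings=holdings))  # (1, 'National Lab')
```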

Slide 6: LHC Global Grid Hierarchy
[Diagram of the LHC computing hierarchy: Experiment → Online System → CERN Computer Center (Tier 0+1, >20 TIPS) → national Tier 1 centers (USA, France, Italy, UK) → Tier 2 regional centers → Tier 3 institutes (~0.25 TIPS) → Tier 4 workstations and other portals. Indicated link rates: ~PBytes/sec off the detector, ~100 MBytes/sec into the CERN Computer Center, 2.5 Gbits/sec to Tier 1 and Tier 2 centers, ~622 Mbits/sec links at the Tier 2 level.]
• One bunch crossing per 25 nsec; ~100 triggers per second; each event is ~1 MByte in size
• Physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels, with a local physics data cache
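The slide's headline numbers are consistent with each other; the short calculation below (illustrative only, not part of the slides) shows how the ~100 MBytes/sec figure follows from the trigger rate and event size:

```python
# Illustrative rate arithmetic from the slide's numbers (not part of the slides).
crossing_interval_s = 25e-9   # one bunch crossing every 25 nsec
trigger_rate_hz = 100         # ~100 triggered (stored) events per second
event_size_bytes = 1e6        # each event is ~1 MByte

crossing_rate_hz = 1 / crossing_interval_s        # ~4e7 crossings per second
rejection = crossing_rate_hz / trigger_rate_hz    # trigger rejection factor
stored_mb_per_s = trigger_rate_hz * event_size_bytes / 1e6
stored_gbit_per_s = stored_mb_per_s * 8 / 1000

print(f"Crossing rate: {crossing_rate_hz:.0e} Hz")      # 4e+07 Hz
print(f"Trigger rejection: ~{rejection:.0e} to 1")      # ~4e+05 to 1
print(f"Stored data rate: ~{stored_mb_per_s:.0f} MBytes/sec "
      f"(~{stored_gbit_per_s:.1f} Gbits/sec)")          # ~100 MB/s, ~0.8 Gbit/s
```

At ~0.8 Gbits/sec, the stored-event stream fits comfortably within the 2.5 Gbits/sec links shown in the diagram.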

Slide 7: GriPhyN I Funded (R&D)
• NSF results announced Sep. 13, 2000
  – $11.9M from NSF Information Technology Research Program
  – $1.4M in matching from universities
  – Largest of all ITR awards
• Scope of ITR funding
  – Major costs for people, especially students and postdocs
  – 2/3 CS + 1/3 application science
  – Industry partnerships needed to realize scope (still being pursued)
• Education and outreach
  – Reach non-traditional students and other constituencies
  – University partnerships
  – Grids "natural" for integrating intellectual resources from all locations
  – E/O led by UT Brownsville (Romano, Campanelli)

Slide 8: GriPhyN Management Needs
• GriPhyN is a complex project
  – 17 universities, SDSC, 3 labs, >40 active participants
  – 4 physics experiments providing frontier challenges
• GriPhyN I funded primarily as an IT research project
  – 2/3 CS + 1/3 physics
• Need to balance and coordinate
  – Research creativity with project goals and deliverables
  – The GriPhyN schedule with the 4 experiment schedules
  – GriPhyN design and architecture with that of other projects whose work will be used by LHC or other experiments (PPDG, EU DataGrid)
  – GriPhyN deliverables with those of other data grid projects

Slide 9: GriPhyN Management Organization
• Project Leadership
  – Project Directors: Paul Avery, Ian Foster
  – Project Coordinator (active search)
• Advisory Committees
  – Project Coordination Group (weekly meetings)
  – Collaboration Board (not met yet)
  – External Advisory Board (1-2 times per year)
• Coordinators
  – Industrial Programs
  – Outreach/Education
  – System Integration
• NSF Review Committee

Slide 10: [GriPhyN organization chart; recoverable structure:]
• Project Directors: Paul Avery, Ian Foster
• Collaboration Board (Chair: Paul Avery); Project Coordination Group; Project Coordinator
• External Advisory Board; NSF Review Committee; Major Physics Experiments
• VD Toolkit Development (Coord.: M. Livny)
  – Requirements Definition & Scheduling (Miron Livny)
  – Integration & Testing (Carl Kesselman?)
  – Documentation & Support (TBD)
• CS Research (Coord.: I. Foster)
  – Execution Management (Miron Livny)
  – Performance Analysis (Valerie Taylor)
  – Request Planning & Scheduling (Carl Kesselman)
  – Virtual Data (Reagan Moore)
• Applications (Coord.: H. Newman)
  – ATLAS (Rob Gardner)
  – CMS (Harvey Newman)
  – LSC (LIGO) (Bruce Allen)
  – SDSS (Alexander Szalay)
• Technical Coordination Committee (Chair: J. Bunn)
  – Networks: H. Newman + T. DeFanti
  – Databases: A. Szalay + M. Franklin
  – Visualization: T. DeFanti
  – Digital Libraries: R. Moore
  – Grids: C. Kesselman
  – Collaborative Systems: P. Galvez + R. Stevens
• Other coordinators: Outreach/Education (Joseph Romano), Industrial Programs (Alex Szalay), System Integration
• External links: Internet2, DOE Science, NSF PACIs, Other Grid Projects

Slide 11: GriPhyN Management Organization
• Technical Organization
  – Computer Science Research
  – Virtual Data Toolkit Development
  – Application (physics experiment) Projects
• Liaison with Experiments
  – Representatives on the Project Coordination Group
  – Subgroups in the Application Projects organization
  – Directors have direct contact with experiment computing leaders
• Liaison with Other Data Grid Projects
  – Common participants with PPDG
  – Cross-committee memberships with EU DataGrid
  – Data Grid Coordination meetings: first was March 4 in Amsterdam; next is June 23 in Rome

Slide 12: A Common Infrastructure Opportunity
• Particle Physics Data Grid (US, DOE)
  – Data Grid applications for HENP
  – Funded 2000, 2001
• GriPhyN (US, NSF)
  – Petascale Virtual-Data Grids
  – Funded 9/2000 – 9/2005
• European Data Grid (EU)
  – Data Grid technologies, EU deployment
  – Funded 1/2001 – 1/2004
• HEP in common
• Focus: infrastructure development & deployment
• International scope

Slide 13: Data Grid Project Collaboration
• GriPhyN + PPDG + EU-DataGrid + national efforts
  – France, Italy, UK, Japan
• Have agreed to collaborate and develop joint infrastructure
  – Initial meeting March 4 in Amsterdam to discuss issues
  – Future meetings in June, July
• Preparing management document
  – Joint management, technical boards + steering committee
  – Coordination of people, resources
  – An expectation that this will lead to real work
• Collaborative projects
  – Grid middleware
  – Integration into applications
  – Grid testbed: iVDGL
  – Network testbed: T³ = Transatlantic Terabit Testbed

Slide 14: iVDGL
• International Virtual-Data Grid Laboratory
  – A place to conduct Data Grid tests at scale
  – A concrete manifestation of world-wide grid activity
  – A continuing activity that will drive Grid awareness
  – A basis for further funding
• Scale of effort
  – For national- and international-scale Data Grid tests and operations
  – Computationally and data-intensive computing
  – Fast networks
• Who
  – Initially US-UK-EU
  – Other world regions later
  – Discussions with Russia, Japan, China, Pakistan, India, South America

Slide 15: Status of Data Grid Projects
• GriPhyN
  – $12M funded by NSF/ITR 2000 program (5-year R&D)
  – 2001 supplemental funds requested for initial deployments
  – Submitting 5-year proposal ($15M) to NSF to deploy iVDGL
• Particle Physics Data Grid
  – Funded in 1999, 2000 by DOE ($1.2M per year)
  – Submitting 3-year proposal ($12M) to DOE Office of Science
• EU DataGrid
  – €10M funded by EU (3 years, 2001 – 2004)
  – Submitting proposal in April for additional funds
• GridPP in UK
  – Submitted proposal April 3 ($30M)
• Japan, others?

Slide 16: GriPhyN Activities Since Sept. 2000
• All-hands meeting Oct. 2-3, 2000
• Architecture meeting Dec. 20
• Smaller meetings between CS and the experiments
• Preparation of requirements documents by the experiments
• Architecture document(s)
• Included in architecture definition for EU DataGrid
• Mar. 4 meeting to discuss collaboration of Grid projects
• All-hands meeting April 9, 2001
• Hiring still proceeding (2/3 finished)
• Submitting new proposal Apr. 25, 2001

Slide 17: Discussion Points
• Maintaining the right balance between research and development
• Maintaining focus vs. accepting broader scope
  – E.g., international collaboration
  – E.g., GriPhyN in the large (GriPhyN II)
  – E.g., Terascale
• Creating a national cyberinfrastructure
  – What is our appropriate role?

Slide 18: Discussion Points
• Outreach to other disciplines
  – Biology, NEES, …
• Outreach to other constituencies
  – Small universities, K-12, public, international, …
• Virtual data toolkit
  – Inclusive or focused?
  – Resource issue, again
• Achieving critical mass of resources to deliver on the complete promise