GriPhyN EAC Meeting (Jan. 7, 2002). Paul Avery, University of Florida. GriPhyN External Advisory Committee.

GriPhyN Project Overview
GriPhyN External Advisory Committee Meeting
Gainesville, Florida, Jan. 7, 2002
Paul Avery, University of Florida

Topics
- Overview of project
- Some observations
- Progress to date
- Management issues
- Budget and hiring
- GriPhyN and iVDGL
- Upcoming meetings
- EAC issues

Overview of Project
- GriPhyN basics
  - $11.9M (NSF) + $1.6M (matching)
  - 17 universities, SDSC, 3 labs, ~80 participants
  - 4 physics experiments providing frontier challenges
- GriPhyN funded primarily as an IT research project
  - 2/3 CS + 1/3 physics
- Must balance and coordinate
  - Research creativity with project goals and deliverables
  - GriPhyN schedule, priorities and risks with those of the 4 experiments
  - Data Grid design and architecture with other Grid projects
  - GriPhyN deliverables with those of other Grid projects

GriPhyN Institutions
- U Florida
- U Chicago
- Boston U
- Caltech
- U Wisconsin, Madison
- USC/ISI
- Harvard
- Indiana
- Johns Hopkins
- Northwestern
- Stanford
- U Illinois at Chicago
- U Penn
- U Texas, Brownsville
- U Wisconsin, Milwaukee
- UC Berkeley
- UC San Diego
- San Diego Supercomputer Center
- Lawrence Berkeley Lab
- Argonne
- Fermilab
- Brookhaven

GriPhyN: PetaScale Virtual Data Grids
[Architecture diagram] Users (production teams, research groups, individual investigators) work through interactive user tools, virtual data tools, request planning & scheduling tools, and request execution & management tools. These are backed by resource management services, security and policy services, and other Grid services, which sit above transforms, raw data sources, and distributed resources (code, storage, computers, and networks).
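The diagram itself is not reproduced in the transcript. As a purely illustrative sketch of the request flow implied by the components above (every class and function name here is invented for illustration and is not a GriPhyN interface), a virtual data request might be planned against a derivation catalog and then executed on distributed resources:

```python
# Purely schematic sketch of the request flow in the diagram above; all
# names here are invented for illustration, not actual GriPhyN APIs.

def plan_request(product: str, catalog: dict) -> list[str]:
    """Request planning & scheduling: map a requested (virtual) data
    product to the chain of transforms that can produce it."""
    chain = []
    while product in catalog:                # walk the derivation history backwards
        transform, parent = catalog[product]
        chain.append(transform)
        product = parent
    return list(reversed(chain))

def execute(chain: list[str], resources: list[str]) -> None:
    """Request execution & management: run each transform on one of the
    distributed resources (code, storage, computers, network)."""
    for i, transform in enumerate(chain):
        site = resources[i % len(resources)]  # trivial round-robin "scheduler"
        print(f"running {transform} at {site}")

# A user (individual investigator, research group, or production team)
# asks for a derived product; it is re-created from the raw data source.
catalog = {"analysis.out": ("analyze", "reco.db"),
           "reco.db": ("reconstruct", "raw.data")}
execute(plan_request("analysis.out", catalog), ["site-A", "site-B"])
```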

Some Observations
- Progress since the April 2001 EAC meeting
  - Major architecture document draft (with PPDG): Foster talk
  - Major planning cycle completed: Wilde talk
  - First Virtual Data Toolkit release in Jan. 2002: Livny talk
  - Research progress at or ahead of schedule: Wilde talk
  - Integration with experiments (SC2001 demos): Wilde/Cavanaugh/Deelman talks
- Management is “challenging”: more later
- Much effort invested in coordination: Kesselman talk
  - PPDG: shared personnel, many joint projects
  - EU DataGrid + national projects (US, Europe, UK, …)
- Hiring almost complete, after a slow start
  - Funds being spent, adjustments being made
- Outreach effort is taking off: Campanelli talk
- iVDGL funded: more later

Progress to Date: Meetings
- Major GriPhyN meetings
  - Oct. 2-3, 2000: All-hands (Chicago)
  - Dec. 20, 2000: Architecture (Chicago)
  - Apr. 2001: All-hands, EAC (USC/ISI)
  - Aug. 1-2, 2001: Planning (Chicago)
  - Oct. 2001: All-hands, iVDGL (USC/ISI)
  - Jan. 7-9, 2002: EAC, Planning, iVDGL (Florida)
  - Feb./Mar. 2002: Outreach (Brownsville)
  - Apr. 2002: All-hands (Chicago)
  - May/Jun. 2002: Planning (location ??)
  - Sep. 2002: All-hands, EAC (UCSD/SDSC)
- Numerous smaller meetings
  - CS-experiment
  - CS research
  - Liaisons with PPDG and EU DataGrid

Progress to Date: Conferences
- International HEP computing conference (Sep. 3-7, 2001)
  - Foster plenary on Grid architecture, GriPhyN
  - Richard Mount plenary on PPDG, iVDGL
  - Avery parallel talk on iVDGL
  - Several talks on GriPhyN research & application work
  - Several PPDG talks
  - Grid coordination meeting with EU, CERN, Asia
- SC2001 (Nov. 2001)
  - Major demos demonstrating integration with experiments
  - LIGO demo, 2 CMS demos (GriPhyN) + CMS demo (PPDG)
  - Professionally made flyer for GriPhyN, PPDG & iVDGL
  - SC Global (Foster) + International Data Grid Panel (Avery)
- HPDC (July 2001)
  - Included GriPhyN-related talks + demos
- GGF now features GriPhyN updates

GriPhyN-CMS Demo: Production Pipeline
[Pipeline diagram] SC2001 demo version of the four-stage CMS simulation chain (1 run = 500 events), with approximate output size and CPU time per run:

  Stage        Output       Data      CPU time
  pythia       truth.ntpl   0.5 MB    2 min
  cmsim        hits.fz      175 MB    8 hours
  writeHits    hits.DB      275 MB    5 min
  writeDigis   digis.DB     105 MB    45 min
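As a rough sketch only (not the demo scripts or the Virtual Data Toolkit interface), the chain above can be written down as a small dependency list, reusing the stage names, file names, and approximate costs from the slide; the `Stage` class and `plan` function are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Stage:
    """One step of the CMS production chain (illustrative only)."""
    name: str       # transformation name from the slide
    output: str     # file it produces
    size_mb: float  # approximate output size per run (500 events)
    cpu: str        # approximate CPU time per run

# Four-stage pipeline from the SC2001 demo slide; each stage consumes
# the previous stage's output.
PIPELINE = [
    Stage("pythia",     "truth.ntpl", 0.5, "2 min"),
    Stage("cmsim",      "hits.fz",    175, "8 hours"),
    Stage("writeHits",  "hits.DB",    275, "5 min"),
    Stage("writeDigis", "digis.DB",   105, "45 min"),
]

def plan(target: str) -> list[Stage]:
    """Return the stages needed to (re)derive `target`, in execution order.

    This mimics the virtual-data idea: a requested file is either read
    from storage or re-derived by replaying the upstream transformations.
    """
    for i, stage in enumerate(PIPELINE):
        if stage.output == target:
            return PIPELINE[: i + 1]
    raise ValueError(f"unknown data product: {target}")

if __name__ == "__main__":
    for s in plan("digis.DB"):
        print(f"{s.name:11s} -> {s.output:10s} (~{s.size_mb} MB, ~{s.cpu})")
```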

GriPhyN-LIGO SC2001 Demo

GriPhyN CMS SC2001 Demo
[Diagram] Bandwidth-greedy, Grid-enabled object collection analysis for particle physics: requests run against a “tag” database of ~140,000 small objects, which then drives parallel, tuned GSI FTP transfers from full event databases of ~100,000 and ~40,000 large objects.
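A minimal sketch of the access pattern described above, assuming a toy in-memory tag database and a hypothetical `fetch_object` stand-in for the parallel GSI FTP transfer (the real demo used Grid-enabled transfer tools, not this function):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a parallel, tuned GSI FTP transfer of one
# large event object; the actual demo used Grid transfer tools instead.
def fetch_object(object_id: str) -> bytes:
    return b""  # placeholder payload

# Toy "tag" database: one small record per event, keyed by object id.
tag_db = {
    f"evt{i:06d}": {"n_jets": i % 7, "missing_et": (i * 13) % 100}
    for i in range(140_000)
}

# 1) Run the selection against the small tag objects only.
selected = [oid for oid, tags in tag_db.items()
            if tags["n_jets"] >= 5 and tags["missing_et"] > 80]

# 2) Pull only the matching large event objects, many streams at once,
#    to keep the wide-area link as full as possible ("bandwidth greedy").
with ThreadPoolExecutor(max_workers=8) as pool:
    payloads = list(pool.map(fetch_object, selected[:20]))  # capped for the sketch

print(f"{len(selected)} of {len(tag_db)} events selected, "
      f"{len(payloads)} objects transferred")
```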

Management Issues
- Basic challenges from a large, dispersed, diverse project
  - ~80 people
  - 12 funded institutions + 9 unfunded ones
  - “Multi-culturalism”: CS, 4 experiments
  - Several large software projects with vociferous salespeople
  - Different priorities and risk equations
- Co-Director concept really works
  - Share work, responsibilities, blame
  - CS (Foster) + Physics (Avery): early reality check
  - Good cop / bad cop useful sometimes
- Project Coordinators have helped tremendously
  - Starting in late July
  - Mike Wilde, Coordinator (Argonne)
  - Rick Cavanaugh, Deputy Coordinator (Florida)

Management Issues (cont.)
- Overcoming internal coordination challenges
  - Conflicting schedules for meetings
  - Experiments in different stages of software development
  - Joint milestones require negotiation
  - We have overcome these (mostly)
- Addressing external coordination challenges
  - National: PPDG, iVDGL, TeraGrid, Globus, NSF, SciDAC, …
  - International: EUDG, LCGP, GGF, HICB, GridPP, …
  - Networks: Internet2, ESNET, STAR-TAP, STARLIGHT, SURFNet, DataTAG
  - Industry trends: IBM announcement, SUN, …
  - Highly time dependent
  - Requires lots of travel, meetings, energy

Management Issues (cont.)
- GriPhyN + PPDG: excellent working relationship
  - GriPhyN: CS research, prototypes
  - PPDG: deployment
  - Overlapping personnel (particularly Ruth Pordes)
  - Overlapping testbeds
  - Jointly operate iVDGL
- Areas where we need improvement / advice
  - Reporting system: monthly reports not yet consistent; better carrot? bigger stick? better system? personal contact?
  - Information dissemination: need more information flow from top to bottom
  - Web page: already addressed, changes will be seen soon; using our web site for real collaboration

GriPhyN’s Place in the Grid Landscape
- GriPhyN
  - Virtual data research & infrastructure
  - Advanced planning & execution
  - Fault tolerance
  - Virtual Data Toolkit (VDT)
- Joint with PPDG
  - Architecture definition
  - Integrating with GDMP and MAGDA
  - Jointly developing replica catalog and reliable data transfer
  - Performance monitoring
- Joint with others
  - Testbeds, performance monitoring, job languages, …
- Needs from others
  - Databases, security, network QoS, … (Kesselman talk)
- Focus of meetings over the next two months

Management Issues (cont.)
- Reassessing our work organization (see the org chart on the next slide)
  - ~1 year of experience
  - Rethink breakdown of tasks and responsibilities in light of experience
  - Discussions during the next 2 months
- Exploit iVDGL resources and our close connection
  - Testbeds handled by iVDGL
  - Common assistant to the iVDGL and GriPhyN Coordinators
  - Common web site development for iVDGL and GriPhyN
  - Common Outreach effort for iVDGL and GriPhyN
  - Additional support from iVDGL for software integration

GriPhyN Management
[Organization chart]
- Project Directors: Paul Avery, Ian Foster
- Also on the chart: External Advisory Board, Physics Experiments, Collaboration Board, Internet2, DOE Science, NSF PACIs, Other Grid Projects
- Project Coordination Group
  - Project Coordinators: M. Wilde, R. Cavanaugh
  - Outreach/Education: Manuela Campanelli
  - Industrial Connections: Alex Szalay
  - System Integration: Carl Kesselman
- VD Toolkit Development (Coord.: M. Livny)
  - Requirements Definition & Scheduling (Miron Livny)
  - Integration & Testing (Carl Kesselman, NMI GRIDS Center)
  - Documentation & Support (TBD)
- CS Research (Coord.: I. Foster)
  - Execution Management (Miron Livny)
  - Performance Analysis (Valerie Taylor)
  - Request Planning & Scheduling (Carl Kesselman)
  - Virtual Data (Reagan Moore)
- Applications (Coord.: H. Newman)
  - ATLAS (Rob Gardner)
  - CMS (Harvey Newman)
  - LSC (LIGO) (Bruce Allen)
  - SDSS (Alexander Szalay)
- Technical Coordination Committee (Chair: J. Bunn)
  - Networks: H. Newman + T. DeFanti
  - Databases: A. Szalay + M. Franklin
  - Digital Libraries: R. Moore
  - Grids: C. Kesselman
  - Collaborative Systems: P. Galvez + R. Stevens

Budget and Hiring
- (Show figures for personnel)
- (Show spending graphs)
- Budget adjustments
  - Hiring delays give us some extra funds
  - Fund a part-time web site person
  - Fund a Deputy Project Coordinator
  - Jointly fund (with iVDGL) an assistant to the Coordinators

GriPhyN and iVDGL
- International Virtual Data Grid Laboratory
  - NSF 2001 ITR program: $13.65M + $2M (matching)
  - Vision: deploy a global Grid laboratory (US, EU, Asia, …)
- Activities
  - A place to conduct Data Grid tests “at scale”
  - A place to deploy a “concrete” common Grid infrastructure
  - A facility to perform tests and productions for LHC experiments
  - A laboratory for other disciplines to perform Data Grid tests
- Organization
  - GriPhyN + PPDG “joint project”
  - Avery + Foster co-Directors
  - Work teams aimed at deploying hardware/software
  - GriPhyN testbed activities handled by iVDGL

US iVDGL Proposal Participants
- T2 / Software
  - U Florida (CMS)
  - Caltech (CMS, LIGO)
  - UC San Diego (CMS, CS)
  - Indiana U (ATLAS, iGOC)
  - Boston U (ATLAS)
  - U Wisconsin, Milwaukee (LIGO)
  - Penn State (LIGO)
  - Johns Hopkins (SDSS, NVO)
- CS support
  - U Chicago (CS)
  - U Southern California (CS)
  - U Wisconsin, Madison (CS)
- T3 / Outreach
  - Salish Kootenai (Outreach, LIGO)
  - Hampton U (Outreach, ATLAS)
  - U Texas, Brownsville (Outreach, LIGO)
- T1 / Labs (not funded)
  - Fermilab (CMS, SDSS, NVO)
  - Brookhaven (ATLAS)
  - Argonne Lab (ATLAS, CS)

iVDGL Map (circa …)
[Map figure] Legend: Tier0/1 facility, Tier2 facility, Tier3 facility; 10 Gbps, 2.5 Gbps, 622 Mbps, and other links; SURFNet; DataTAG.

US-iVDGL Summary Information
- Principal components (as seen by the USA)
  - Tier1 + proto-Tier2 + selected Tier3 sites
  - Fast networks: US, Europe, transatlantic (DataTAG), transpacific?
  - Grid Operations Center (GOC)
  - Computer Science support teams
  - Coordination with other Data Grid projects
- Experiments
  - HEP: ATLAS, CMS + (ALICE, CMS Heavy Ion, BTeV)
  - Non-HEP: LIGO, SDSS, NVO, biology (small)
- Proposed international participants
  - 6 Fellows funded by the UK for 5 years, working in the US
  - US, UK, EU, Japan, Australia (discussions with others)

iVDGL Work Team Breakdown
- Work teams
  - Facilities
  - Software Integration
  - Laboratory Operations
  - Applications
  - Outreach

Initial iVDGL Org Chart
[Organization chart] Project Directors; Project Steering Group; Project Coordination Group; External Advisory Committee; Collaboration Board (?); work teams: Facilities, Software Integration, Operations, Applications, and E/O.

US iVDGL Budget
[Table of budget figures; units = $1K]

Upcoming Meetings
- February / March
  - Outreach meeting, date to be set
- April
  - All-hands meeting
  - Perhaps joint with iVDGL
- May or June
  - Smaller, focused meeting
  - Difficulty setting a date so far
- September
  - All-hands meeting
  - Sep. 13 as EAC meeting?

EAC Issues
- Can we use a common EAC for GriPhyN/iVDGL?
  - Some members in common
  - Some specific to GriPhyN
  - Some specific to iVDGL
- Interactions with EAC: more frequent advice
  - More frequent updates from GriPhyN / iVDGL?
  - Phone meetings with Directors and Coordinators?
  - Other ideas?
- Industry relations
  - Several discussions (IBM, Sun, HP, Dell, SGI, Microsoft, Intel)
  - Some progress (Intel?), but not much
  - We need help here
- Coordination
  - Hope to have lots of discussion about this