
Mardi Gras Conference (February 3, 2005)Paul Avery1 Grid3 and Open Science Grid
Paul Avery, University of Florida
Mardi Gras Conference, Louisiana State University, Baton Rouge, February 3, 2005

Mardi Gras Conference (February 3, 2005)Paul Avery2 Data Grids & Collaborative Research
- Scientific discovery increasingly dependent on collaboration
  - Computationally & data intensive analyses
  - Resources and collaborations distributed internationally
- Dominant factor: data growth (1 Petabyte = 1000 TB)
  - 2000: ~0.5 Petabyte
  - 2005: ~10 Petabytes
  - 2012: ~100 Petabytes
  - 2018: ~1000 Petabytes?
  - How to collect, manage, access and interpret this quantity of data?
- Drives need for powerful linked resources: "Data Grids"
  - Computation: massive, distributed CPU
  - Data storage and access: distributed high-speed disk and tape
  - Data movement: international optical networks
- Collaborative research and Data Grids
  - Data discovery, resource sharing, distributed analysis, etc.
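A quick back-of-the-envelope check of the growth rate implied by these projections; the volumes below are taken from the slide, and the doubling-time arithmetic is the only thing added (a sketch, not part of the original talk):

    import math

    # Projected data volumes from the slide (year -> petabytes)
    projections = {2000: 0.5, 2005: 10, 2012: 100, 2018: 1000}

    years = 2018 - 2000
    factor = projections[2018] / projections[2000]        # 2000x over 18 years
    annual_growth = factor ** (1 / years)                 # ~1.53x per year
    doubling_time = math.log(2) / math.log(annual_growth)

    print(f"{factor:.0f}x growth over {years} years")
    print(f"~{annual_growth:.2f}x per year, doubling roughly every {doubling_time:.1f} years")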

Mardi Gras Conference (February 3, 2005)Paul Avery3 Data Intensive Disciplines
- High energy & nuclear physics
  - Belle, BaBar, Tevatron, RHIC, JLAB
  - Large Hadron Collider (LHC)
- Astronomy
  - Digital sky surveys, "Virtual" Observatories
  - VLBI arrays: multiple-Gbps data streams
- Gravity wave searches
  - LIGO, GEO, VIRGO, TAMA, ACIGA, …
- Earth and climate systems
  - Earth observation, climate modeling, oceanography, …
- Biology, medicine, imaging
  - Genome databases
  - Proteomics (protein structure & interactions, drug delivery, …)
  - High-res brain scans (1–10 μm, time dependent)

Mardi Gras Conference (February 3, 2005)Paul Avery4 Background: Data Grid Projects
- U.S. projects
  - GriPhyN (NSF)
  - iVDGL (NSF)
  - Particle Physics Data Grid (DOE)
  - Open Science Grid
  - UltraLight
  - TeraGrid (NSF)
  - DOE Science Grid (DOE)
  - NEESgrid (NSF)
  - NSF Middleware Initiative (NSF)
- EU, Asia projects
  - EGEE (EU)
  - LCG (CERN)
  - DataGrid and EU national projects
  - DataTAG (EU)
  - CrossGrid (EU)
  - GridLab (EU)
  - Japanese, Korean projects
- Not exclusively HEP, but many driven/led by HEP
  - Many tens of $M brought into the field
  - Large impact on other sciences, education

Mardi Gras Conference (February 3, 2005)Paul Avery5 U.S. "Trillium" Grid Consortium
- Trillium = PPDG + GriPhyN + iVDGL
  - Particle Physics Data Grid: $12M (DOE) (1999 – 2004+)
  - GriPhyN: $12M (NSF) (2000 – 2005)
  - iVDGL: $14M (NSF) (2001 – 2006)
- Basic composition (~150 people)
  - PPDG: 4 universities, 6 labs
  - GriPhyN: 12 universities, SDSC, 3 labs
  - iVDGL: 18 universities, SDSC, 4 labs, foreign partners
  - Expts: BaBar, D0, STAR, JLab, CMS, ATLAS, LIGO, SDSS/NVO
- Complementarity of projects
  - GriPhyN: CS research, Virtual Data Toolkit (VDT) development
  - PPDG: "end to end" Grid services, monitoring, analysis
  - iVDGL: Grid laboratory deployment using VDT
  - Experiments provide frontier challenges
  - Unified entity when collaborating internationally

Mardi Gras Conference (February 3, 2005)Paul Avery6 Goal: Peta-scale Data Grids for Global Science
[Architecture diagram: interactive user tools for single researchers, workgroups, and production teams sit above virtual data tools, request planning & scheduling tools, and request execution & management tools, supported by resource management services, security and policy services, and other Grid services, all operating over distributed resources (code, storage, CPUs, networks), transforms, and raw data sources. Goals: PetaOps, Petabytes, performance.]

Mardi Gras Conference (February 3, 2005)Paul Avery7 Trillium Science Drivers
- Experiments at Large Hadron Collider: 100s of Petabytes, 2007 – ?
- High Energy & Nuclear Physics expts: ~1 Petabyte (1000 TB), 1997 – present
- LIGO (gravity wave search): 100s of Terabytes, 2002 – present
- Sloan Digital Sky Survey: 10s of Terabytes, 2001 – present
- Data growth and community growth drive future Grid resources:
  - Massive CPU (PetaOps)
  - Large distributed datasets (>100 PB)
  - Global communities (1000s)

Mardi Gras Conference (February 3, 2005)Paul Avery8 Sloan Digital Sky Survey (SDSS): Using Virtual Data in GriPhyN
[Figure: galaxy cluster size distribution derived from Sloan data.]

Mardi Gras Conference (February 3, 2005)Paul Avery9 The LIGO Scientific Collaboration (LSC) and the LIGO Grid
[Map: LIGO Grid = 6 US sites + 3 EU sites (Birmingham and Cardiff in the UK, AEI/Golm in Germany). LHO and LLO are observatory sites; the remaining sites are LSC (LIGO Scientific Collaboration) sites, iVDGL supported.]
iVDGL has enabled the LSC to establish a persistent production grid.

Mardi Gras Conference (February 3, 2005)Paul Avery10 Large Hadron Collider, CERN
- Search for Origin of Mass & Supersymmetry (2007 – ?)
- 27 km tunnel in Switzerland & France
- Experiments: ATLAS, CMS, ALICE, LHCb, TOTEM

Mardi Gras Conference (February 3, 2005)Paul Avery11 LHC Data Rates: Detector to Storage
- Detector output: 40 MHz (~TBytes/sec)
- Level 1 Trigger (special hardware) → 75 kHz (75 GB/sec)
- Level 2 Trigger (commodity CPUs) → 5 kHz (5 GB/sec)
- Level 3 Trigger (commodity CPUs, physics filtering) → 100 Hz (0.1 – 1.5 GB/sec)
- Raw data to storage (+ simulated data)
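As a sanity check on these figures, the bandwidth at each stage is just the accepted event rate times the event size. The sketch below uses only the rates and bandwidths from the slide; the per-event sizes it prints are inferred, not numbers from the talk:

    # Trigger stages from the slide: accepted rate (Hz) and output bandwidth (bytes/sec).
    # Implied event size = bandwidth / rate (an inference, not a slide figure).
    stages = [
        ("Level 1 output", 75e3, 75e9),              # 75 kHz, 75 GB/sec
        ("Level 2 output", 5e3, 5e9),                # 5 kHz, 5 GB/sec
        ("Level 3 output to storage", 100, 1.5e9),   # 100 Hz, upper end of 0.1-1.5 GB/sec
    ]

    for name, rate_hz, bandwidth in stages:
        event_size_mb = bandwidth / rate_hz / 1e6
        print(f"{name}: {rate_hz:,.0f} Hz x ~{event_size_mb:g} MB/event "
              f"= {bandwidth/1e9:g} GB/sec")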

Mardi Gras Conference (February 3, 2005)Paul Avery12 Complexity: Higgs Decay into 4 muons
10^9 collisions/sec, selectivity: 1 in 10^13

Mardi Gras Conference (February 3, 2005)Paul Avery13 LHC: Petascale Global Science
- Complexity: millions of individual detector channels
- Scale: PetaOps (CPU), 100s of Petabytes (data)
- Distribution: global distribution of people & resources
  - CMS example: physicists from 250+ institutes in 60+ countries
  - BaBar/D0 example: physicists from 100+ institutes in 35+ countries

Mardi Gras Conference (February 3, 2005)Paul Avery14 LHC Global Collaborations (ATLAS, CMS)
- 1000 – 4000 per experiment
- USA is 20 – 25% of total

Mardi Gras Conference (February 3, 2005)Paul Avery15 LHC Global Data Grid (2007+): CMS Experiment
[Tiered architecture diagram: the Online System feeds the CERN Computer Center (Tier 0); Tier 1 centers in the USA, Korea, Russia, and the UK are connected over >10 Gb/s links; lower tiers include sites such as Caltech, UCSD, U Florida, Iowa, Maryland, and FIU, down to Tier 4 physics caches and PCs.]
- 5000 physicists, 60 countries
- 10s of Petabytes/yr by 2008
- 1000 Petabytes in < 10 yrs?

Mardi Gras Conference (February 3, 2005)Paul Avery16 CMS: Grid Enabled Analysis (GAE) Architecture
[Architecture diagram: analysis clients speak HTTP, SOAP, and XML-RPC to a "Grid Services Web Server" (Clarens) that fronts grid services: scheduler, catalogs (virtual data, replica, metadata, applications), monitoring, execution priority manager, grid-wide execution service, data management, and fully/partially abstract and fully concrete planners. Component technologies include Chimera, Sphinx, MonALISA, ROOT, Python, Cojac (detector viz), IGUANA (CMS viz), Clarens, MCRunjob, BOSS, RefDB, POOL, ORCA, FAMOS, VDT-Server, and MOPDB, plus discovery, ACL management, and certificate-based access.]
- Clients talk standard protocols to the "Grid Services Web Server"
- Simple Web service API allows simple or complex analysis clients
- Typical clients: ROOT, Web browser, …
- Clarens portal hides complexity
- Key features: global scheduler, catalogs, monitoring, grid-wide execution service
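Since the slide's key point is that any client able to speak XML-RPC or SOAP can use these services, here is a minimal, illustrative Python XML-RPC client. The server URL and the catalog.find method are hypothetical placeholders, not the actual Clarens API:

    import xmlrpc.client

    # Hypothetical Clarens-style endpoint; the real URL and method names would
    # come from the deployed Grid Services Web Server, not from this sketch.
    SERVER_URL = "https://grid-services.example.org:8443/clarens"

    def list_datasets(pattern: str) -> list:
        """Query a (hypothetical) catalog method over XML-RPC and return matches."""
        proxy = xmlrpc.client.ServerProxy(SERVER_URL)
        # 'catalog.find' is a placeholder method name used for illustration only.
        return proxy.catalog.find(pattern)

    if __name__ == "__main__":
        for name in list_datasets("higgs*4mu*"):
            print(name)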

Mardi Gras Conference (February 3, 2005)Paul Avery17 Collaborative Research by Globally Distributed Teams
- Non-hierarchical: chaotic analyses + productions
- Superimpose significant random data flows

Mardi Gras Conference (February 3, 2005)Paul Avery18 Trillium Grid Tools: Virtual Data Toolkit
[Build pipeline diagram: sources from CVS and contributors (VDS, etc.) are patched into GPT source bundles, then built, tested, and packaged on the NMI Build & Test facility (a Condor pool of 37 computers); the resulting RPMs and binaries are published to a Pacman cache as the VDT build. Plan: use NMI processes later.]

Mardi Gras Conference (February 3, 2005)Paul Avery19 VDT Growth Over 3 Years
[Chart: growth of the VDT over three years, from VDT 1.0 (Globus 2.0b, Condor) and the pre-SC 2002 release (VDT 1.1.3), through the switch to Globus 2.2, the release used for Grid3, and the release first put to real use by LCG.]

Mardi Gras Conference (February 3, 2005)Paul Avery20 Packaging of Grid Software: Pacman
- Language: define software environments
- Interpreter: create, install, configure, update, verify environments
- Version released Jan.
- Combine and manage software from arbitrary sources and packaging formats: LCG/Scram, ATLAS/CMT, CMS DPE/tar/make, LIGO/tar/make, OpenSource/tar/make, Globus/GPT, NPACI/TeraGrid/tar/make, D0/UPS-UPD, Commercial/tar/make
- "1 button install" reduces the burden on administrators:
    % pacman -get iVDGL:Grid3
- Remote experts define installation/config/updating for everyone at once
[Diagram: Pacman pulls from caches such as VDT, ATLAS, NPACI, D-Zero, iVDGL, UCHEP, CMS/DPE, LIGO.]
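A minimal sketch of scripting that "1 button install": the pacman -get iVDGL:Grid3 command is taken from the slide, while the install directory and the wrapper itself are illustrative assumptions, not part of the VDT tooling:

    import os
    import subprocess
    import sys

    INSTALL_DIR = "/opt/grid3"   # hypothetical install location
    PACKAGE = "iVDGL:Grid3"      # cache:package name from the slide

    def install(package: str, install_dir: str) -> None:
        """Run pacman -get for a cache:package pair inside the install directory."""
        os.makedirs(install_dir, exist_ok=True)
        result = subprocess.run(["pacman", "-get", package], cwd=install_dir)
        if result.returncode != 0:
            sys.exit(f"pacman -get {package} failed with exit code {result.returncode}")

    if __name__ == "__main__":
        install(PACKAGE, INSTALL_DIR)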

Mardi Gras Conference (February 3, 2005)Paul Avery21 Collaborative Relationships: A CS Perspective
[Diagram: computer science research feeds techniques & software into the Virtual Data Toolkit, which is deployed to production for the larger science community; requirements, prototyping & experiments flow back. Partner science, networking, and outreach projects (Globus, Condor, NMI, iVDGL, PPDG, EU DataGrid, LHC experiments, QuarkNet, CHEPREO, Digital Divide) connect through tech transfer to U.S. Grids, international partners, and outreach. Other linkages: work force, CS researchers, industry.]

Mardi Gras Conference (February 3, 2005)Paul Avery22 Grid3: An Operational National Grid
- 35 sites, 3500 CPUs: universities + 4 national labs
- Part of LHC Grid
- Running since October 2003
- Applications in HEP, LIGO, SDSS, Genomics, CS

Mardi Gras Conference (February 3, 2005)Paul Avery23 Grid3 Applications
- High energy physics
  - US-ATLAS analysis (DIAL)
  - US-ATLAS GEANT3 simulation (GCE)
  - US-CMS GEANT4 simulation (MOP)
  - BTeV simulation
- Gravity waves
  - LIGO: blind search for continuous sources
- Digital astronomy
  - SDSS: cluster finding (maxBcg)
- Bioinformatics
  - Bio-molecular analysis (SnB)
  - Genome analysis (GADU/Gnare)
- CS demonstrators
  - Job Exerciser, GridFTP, NetLogger-grid2003

Mardi Gras Conference (February 3, 2005)Paul Avery24 Grid3 Shared Use Over 6 Months
[Chart: CPU usage on Grid3 over six months, dominated by the CMS DC04 and ATLAS DC2 data challenges.]

Mardi Gras Conference (February 3, 2005)Paul Avery25 Grid3 → Open Science Grid
- Iteratively build & extend Grid3 to a national infrastructure
  - Shared resources, benefiting broad set of disciplines
  - Realization of the critical need for operations
  - More formal organization needed because of scale
- Grid3 → Open Science Grid
  - Build OSG from laboratories, universities, campus grids, etc.
  - Argonne, Fermilab, SLAC, Brookhaven, Berkeley Lab, Jefferson Lab
  - UW Madison, U Florida, Purdue, Chicago, Caltech, Harvard, etc.
- Further develop OSG
  - Partnerships and contributions from other sciences, universities
  - Incorporation of advanced networking
  - Focus on general services, operations, end-to-end performance

Mardi Gras Conference (February 3, 2005)Paul Avery26

Mardi Gras Conference (February 3, 2005)Paul Avery27 Open Science Grid Basics
- OSG infrastructure
  - A large CPU & storage Grid infrastructure supporting science
  - Grid middleware based on the Virtual Data Toolkit (VDT)
  - Loosely coupled, consistent infrastructure: "Grid of Grids"
  - Emphasis on "end to end" services for applications
- OSG collaboration builds on Grid3
  - Computer and application scientists
  - Facility, technology and resource providers
  - Grid3 → OSG-0 → OSG-1 → OSG-2 → …
- Fundamental unit is the Virtual Organization (VO)
  - E.g., an experimental collaboration, a research group, a class
  - Simplifies organization and logistics
- Distributed ownership of resources
  - Local facility policies, priorities, and capabilities must be supported

Mardi Gras Conference (February 3, 2005)Paul Avery28 OSG Integration: Applications, Infrastructure, Facilities

Mardi Gras Conference (February 3, 2005)Paul Avery29 OSG Organization
[Organization chart: the OSG Council (all members above a certain threshold; Chair, officers) and an Advisory Committee oversee an Executive Board (8-15 representatives; Chair, officers) and a small core OSG staff (a few FTEs, manager). Technical Groups sponsor Activities. Participants span the enterprise: research grid projects, VOs, researchers, sites, service providers, universities, and labs.]

Mardi Gras Conference (February 3, 2005)Paul Avery30 OSG Technical Groups and Activities
- Technical Groups address and coordinate a technical area
  - Propose and carry out activities related to their given areas
  - Liaise & collaborate with other peer projects (U.S. & international)
  - Participate in relevant standards organizations
  - Chairs participate in Blueprint, Grid Integration and Deployment activities
- Activities are well-defined, scoped sets of tasks contributing to the OSG
  - Each Activity has deliverables and a plan
  - … is self-organized and operated
  - … is overseen & sponsored by one or more Technical Groups

Mardi Gras Conference (February 3, 2005)Paul Avery31 OSG Technical Groups (7 currently)
- Governance: charter, organization, by-laws, agreements, formal processes
- Policy: VO & site policy, authorization, priorities, privilege & access rights
- Security: common security principles, security infrastructure
- Monitoring and Information Services: resource monitoring, information services, auditing, troubleshooting
- Storage: storage services at remote sites, interfaces, interoperability
- Support Centers: infrastructure and services for user support, helpdesk, trouble tickets
- Education and Outreach
- Networking?

Mardi Gras Conference (February 3, 2005)Paul Avery32 OSG Activities (5 currently)
- Blueprint: defining principles and best practices for OSG
- Deployment: deployment of resources & services
- Incident Response: plans and procedures for responding to security incidents
- Integration: testing, validating & integrating new services and technologies
- Data Resource Management (DRM): deployment of specific Storage Resource Management technology

Mardi Gras Conference (February 3, 2005)Paul Avery33 OSG Short Term Plans
- Maintain Grid3 operations
  - In parallel with extending Grid3 to OSG
- OSG technology advances for Spring 2005 deployment
  - Add full Storage Elements
  - Extend Authorization services
  - Extend Data Management services
  - Interface to sub-Grids
  - Extend monitoring, testing, accounting
  - Add new VOs + OSG-wide VO Services
  - Add Discovery Service
  - Service challenges & collaboration with the LCG
- Make the switch to "Open Science Grid" in Spring 2005

Mardi Gras Conference (February 3, 2005)Paul Avery34 Open Science Grid Meetings
- Sep. 17, NSF: strong interest of NSF education people
- Jan. 12, Fermilab: initial stakeholders meeting, 1st discussion of governance
- May 20-21, Univ. of Chicago: joint Trillium Steering meeting to define OSG program
- July, Wisconsin: first attempt to define OSG Blueprint (document)
- Sep. 9-10, Harvard: major OSG workshop (Technical, Governance, Sciences)
- Dec., UCSD: major meeting for Technical Groups
- Feb., U Chicago: Integration meeting

Mardi Gras Conference (February 3, 2005)Paul Avery35 Networks

Mardi Gras Conference (February 3, 2005)Paul Avery36 Networks and Grids for Global Science
- Network backbones and major links are advancing rapidly
  - To the 10G range in < 3 years; faster than Moore's Law
  - New HENP and DOE roadmaps: a factor of ~1000 bandwidth growth per decade
- We are learning to use long-distance 10 Gbps networks effectively
  - 2004 developments: Gbps flows with TCP over thousands of km
- Transition to community-operated optical R&E networks
  - US, CA, NL, PL, CZ, SK, KR, JP …
  - Emergence of a new generation of "hybrid" optical networks
- We must work to close the digital divide
  - To allow scientists in all world regions to take part in discoveries
  - Regional, last mile, and local bottlenecks and compromises in network quality are now on the critical path
- Important examples on the road to closing the digital divide
  - CLARA, CHEPREO, and the Brazil HEPGrid in Latin America
  - Optical networking in Central and Southeast Europe
  - APAN links in the Asia Pacific: GLORIAD and TEIN
  - Leadership and outreach: HEP groups in Europe, US, Japan, & Korea
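To make the "faster than Moore's Law" comparison concrete, here is a small sketch comparing the slide's factor-of-1000-per-decade bandwidth roadmap against transistor counts doubling every 18 months (the 18-month figure is a common reading of Moore's Law and an assumption here, not taken from the slide):

    # Bandwidth roadmap from the slide: ~1000x growth per decade.
    bw_growth_per_year = 1000 ** (1 / 10)       # ~2.0x per year

    # Moore's Law as commonly quoted: doubling every 18 months (assumption).
    moore_growth_per_year = 2 ** (12 / 18)      # ~1.59x per year

    print(f"Bandwidth: ~{bw_growth_per_year:.2f}x per year "
          f"(~{bw_growth_per_year**10:.0f}x per decade)")
    print(f"Moore's Law: ~{moore_growth_per_year:.2f}x per year "
          f"(~{moore_growth_per_year**10:.0f}x per decade)")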

Mardi Gras Conference (February 3, 2005)Paul Avery37 HEP Bandwidth Roadmap (Gb/s)

Mardi Gras Conference (February 3, 2005)Paul Avery38 Evolving Science Requirements for Networks (DOE High Performance Network Workshop)
End-to-end throughput: today / in 5 years / in 5-10 years (remarks):
- High Energy Physics: 0.5 Gb/s / 100 Gb/s / 1000 Gb/s (high bulk throughput)
- Climate (data & computation): 0.5 Gb/s / Gb/s / N x 1000 Gb/s (high bulk throughput)
- SNS NanoScience: not yet started / 1 Gb/s / 1000 Gb/s + QoS for control channel (remote control and time-critical throughput)
- Fusion Energy: 0.066 Gb/s (500 MB/s burst) / 0.2 Gb/s (500 MB per 20 sec. burst) / N x 1000 Gb/s (time-critical throughput)
- Astrophysics: 0.013 Gb/s (1 TB/week) / N*N multicast / 1000 Gb/s (computational steering and collaborations)
- Genomics (data & computation): Gb/s (1 TB/day) / 100s of users / 1000 Gb/s + QoS for control channel (high throughput and steering)
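The rates quoted in parentheses can be cross-checked directly: a sustained data volume divided by the time window gives the average bit rate. The sketch below reproduces the 1 TB/week figure and computes the 1 TB/day rate only as a consistency check (the table itself does not give that Gb/s value):

    # Convert sustained data volumes into average bit rates.
    def gbps(terabytes: float, seconds: float) -> float:
        """Average rate in Gb/s for moving `terabytes` of data in `seconds`."""
        bits = terabytes * 1e12 * 8
        return bits / seconds / 1e9

    week = 7 * 24 * 3600
    day = 24 * 3600

    print(f"1 TB/week ~ {gbps(1, week):.3f} Gb/s")   # ~0.013 Gb/s (Astrophysics row)
    print(f"1 TB/day  ~ {gbps(1, day):.3f} Gb/s")    # ~0.093 Gb/s (consistency check)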

Mardi Gras Conference (February 3, 2005)Paul Avery39 UltraLight: Advanced Networking in Applications
- 10 Gb/s+ network
- Caltech, UF, FIU, UM, MIT, SLAC, FNAL
- Int'l partners
- Level(3), Cisco, NLR
- Funded by ITR (2004)

Mardi Gras Conference (February 3, 2005)Paul Avery40 Education and Outreach

Mardi Gras Conference (February 3, 2005)Paul Avery41 NEWS: Bulletin: ONE TWO WELCOME BULLETIN General Information Registration Travel Information Hotel Registration Participant List How to Get UERJ/Hotel Computer Accounts Useful Phone Numbers Program Contact us: Secretariat Chairmen Grids and the Digital Divide Rio de Janeiro, Feb , 2004 Background  World Summit on Information Society  HEP Standing Committee on Inter-regional Connectivity (SCIC) Themes  Global collaborations, Grids and addressing the Digital Divide Next meeting: May 2005 (Korea)

Mardi Gras Conference (February 3, 2005)Paul Avery42 Second Digital Divide Grid Meeting
International Workshop on HEP Networking, Grids and Digital Divide Issues for Global e-Science, May 23-27, 2005, Daegu, Korea
Prof. Dongchul Son, Center for High Energy Physics, Kyungpook National University

Mardi Gras Conference (February 3, 2005)Paul Avery43 iVDGL, GriPhyN Education / Outreach Basics
- $200K/yr
- Led by UT Brownsville
- Workshops, portals
- Partnerships with CHEPREO, QuarkNet, …

Mardi Gras Conference (February 3, 2005)Paul Avery44 June 2004 Grid Summer School
- First of its kind in the U.S. (South Padre Island, Texas)
  - 36 students, diverse origins and types (M, F, MSIs, etc.)
- Marks new direction for U.S. Grid efforts
  - First attempt to systematically train people in Grid technologies
  - First attempt to gather relevant materials in one place
  - Today: students in CS and Physics
  - Later: students, postdocs, junior & senior scientists
- Reaching a wider audience
  - Put lectures, exercises, video on the web
  - More tutorials, perhaps 2+/year
  - Dedicated resources for remote tutorials
  - Create "Grid book", e.g. Georgia Tech
- New funding opportunities
  - NSF: new training & education programs

CHEPREO: Center for High Energy Physics Research and Educational Outreach, Florida International University
- Physics Learning Center
- CMS Research
- iVDGL Grid Activities
- AMPATH network (S. America)
- Funded September 2003
  - $4M initially (3 years)
  - 4 NSF Directorates!

Mardi Gras Conference (February 3, 2005)Paul Avery46 QuarkNet/GriPhyN e-Lab Project

Mardi Gras Conference (February 3, 2005)Paul Avery47 Chiron/QuarkNet Architecture

Mardi Gras Conference (February 3, 2005)Paul Avery48 Muon Lifetime Analysis Workflow

Mardi Gras Conference (February 3, 2005)Paul Avery49 QuarkNet Portal Architecture
- Simpler interface for non-experts
- Builds on Chiron portal

Mardi Gras Conference (February 3, 2005)Paul Avery50 Summary
- Grids enable 21st century collaborative science
  - Linking research communities and resources for scientific discovery
  - Needed by LHC global collaborations pursuing "petascale" science
- Grid3 was an important first step in developing US Grids
  - Value of planning, coordination, testbeds, rapid feedback
  - Value of building & sustaining community relationships
  - Value of learning how to operate Grid as a facility
  - Value of delegation, services, documentation, packaging
- Grids drive need for advanced optical networks
- Grids impact education and outreach
  - Providing technologies & resources for training, education, outreach
  - Addressing the Digital Divide
- OSG: a scalable computing infrastructure for science?
  - Strategies needed to cope with increasingly large scale

Mardi Gras Conference (February 3, 2005)Paul Avery51 Grid Project References
- Grid3
- Open Science Grid
- GriPhyN
- iVDGL
- PPDG
- CHEPREO
- UltraLight: ultralight.cacr.caltech.edu
- Globus
- LCG
- EU DataGrid
- EGEE