Open Science Grid: Linking Universities and Laboratories in National Cyberinfrastructure. ICFA Digital Divide Workshop, Cracow, Poland, Oct. 9, 2006. Paul Avery.


Open Science Grid: Linking Universities and Laboratories in National Cyberinfrastructure
ICFA Digital Divide Workshop, Cracow, Poland, Oct. 9, 2006
Paul Avery, University of Florida, avery@phys.ufl.edu

The Open Science Grid Consortium
Open Science Grid brings together: LHC experiments, other science applications, large US grid projects, university facilities, laboratory centers, multi-disciplinary facilities, education communities, computer science, and technologists.

Open Science Grid: July 20, 2005
- Partnership of many organizations
- Production grid: 60+ sites, 20,000+ CPUs “present”
- Sites in the US, Korea, Brazil (São Paulo), and Taiwan

OSG Site Map (Sep. 2006)

General Comments About OSG
- OSG is a consortium of projects, facilities, and providers; its stakeholders represent important interests
- The OSG Project was recently funded for $30M (2006 – 2011); the OSG Consortium manages the OSG Project; the value of the constituent resources and operations is far greater
- OSG was formed by bottom-up activity, informed by history: Grid projects (GriPhyN, iVDGL, PPDG, UltraLight, CHEPREO, DISUN), Grid testbeds (2002 – 2004), Grid3 (2003 – 2005)
- OSG interfaces to Virtual Organizations (VOs); VOs are responsible for support and authentication of their members (scalability)
- OSG does not own resources: CPU and storage are owned and managed by projects or sites
- OSG integrates technologies & middleware: it relies on software and technology created by member projects or partners, and exploits NSF + DOE investments (NMI, Globus, Condor, …)

OSG Participating Projects
- Computer science: Condor, Globus, SRM, SRB, dCache
- Physics: LIGO, nuclear physics, Tevatron, LHC
- Astrophysics: Sloan Digital Sky Survey, future astronomy projects
- Nanoscience: nanoHUB @ Purdue
- Bioinformatics: GADU @ Argonne
- Dartmouth Psychological & Brain Sciences
- Computational chemistry: ChemGrid
- University, laboratory & regional grids: GRASE, Crimson Grid, GLOW, FermiGrid, TACC, GROW, MGRID, SURA, UFGRID, DOSAR

OSG Member Virtual Organizations
- CDF: HEP experiment at Fermilab
- CMS: HEP experiment at CERN
- DES: Dark Energy Survey (astronomy)
- DOSAR: regional grid in the southwest US
- DZero: HEP experiment at Fermilab
- fMRI: functional MRI (Dartmouth)
- GADU: bioinformatics effort at Argonne
- Geant4: simulation project
- GLOW: campus grid (University of Wisconsin, Madison)
- GRASE: regional grid in upstate NY
- GridChem: quantum chemistry grid
- GridEx: Grid Exerciser
- GROW: campus grid (University of Iowa)

OSG Member Virtual Organizations (2)
- I2U2: education/outreach effort (Interactions in Understanding the Universe)
- iVDGL: generic VO
- LIGO: gravitational wave experiment
- Mariachi: ultra-high-energy cosmic ray experiment
- MIS: OSG monitoring?
- nanoHUB: nanotechnology grid at Purdue
- NWICG: Northwest Indiana regional grid
- Ops: OSG operations
- OSG: generic VO?
- OSGEDU: OSG education/outreach
- SDSS: Sloan Digital Sky Survey (astronomy)
- STAR: nuclear physics experiment at Brookhaven
- US-ATLAS: HEP experiment at CERN
- LSU/CCT: Center for Computation and Technology

OSG Grid Partners
- TeraGrid: “DAC2005” ran LHC applications on TeraGrid resources; TG Science Portals for other applications; discussions on joint activities in security, accounting, operations, and portals
- EGEE: EGEE and OSG are both part of the Worldwide LHC Computing Grid; joint operations workshops are defining mechanisms to exchange support tickets; joint security working group; US contributions to the EGEE middleware, gLite
- caBIG (Cancer Bioinformatics Grid): still being developed; more in late fall 2006

OSG Sciences

Scale of OSG Resources & Services Set by LHC Experiments
- LHC @ CERN: 27 km tunnel in Switzerland & France; experiments ATLAS, CMS, ALICE, LHCb, TOTEM
- Physics goals: search for the origin of mass, new fundamental forces, supersymmetry, other new particles
- Running 2007 – ?

LHC Data and CPU Requirements (ATLAS, CMS, LHCb)
- Storage: raw recording rate 0.2 – 1.5 GB/s; large Monte Carlo data samples; 100 PB by ~2012, 1000 PB later in the decade?
- Processing: PetaOps (> 300,000 3 GHz PCs)
- Users: 100s of institutes, 1000s of researchers
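A rough back-of-the-envelope check of the "> 300,000 3 GHz PCs" figure; the per-machine rate is an assumption (roughly one operation per clock cycle), not a number from the slide:

# Rough check of the "PetaOps (> 300,000 3 GHz PCs)" figure quoted above.
# Assumption: one 3 GHz PC sustains about 3e9 operations per second.
ops_per_pc = 3e9        # operations/s per commodity 3 GHz PC (assumed)
target_ops = 1e15       # 1 PetaOp/s of aggregate processing
print(f"~{target_ops / ops_per_pc:,.0f} PCs per PetaOp/s")   # ~333,333, consistent with "> 300,000"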

LHC Global Data Grid (2007+)
- 5000 physicists, 60 countries; 10s of petabytes/yr by 2008; CERN / outside ratio = 10-20%
- Online system (e.g. the CMS experiment) to Tier 0 (CERN Computer Center): 200 - 1500 MB/s
- Tier 0 to Tier 1 centers (Korea, UK, Russia, USA): 10-40 Gb/s
- Tier 1 to Tier 2 centers (U Florida, Caltech, UCSD): >10 Gb/s
- Tier 2 to Tier 3 sites (FIU, Iowa, Maryland): 2.5-10 Gb/s
- Tier 4: physics caches, PCs
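To give a feel for these link speeds, a small illustrative calculation (the 10 TB dataset size is an assumption, not a number from the slide) of a bulk transfer over the Tier 1 to Tier 2 link quoted above, ignoring protocol overhead and contention:

# Illustrative transfer-time estimate for the ~10 Gb/s Tier 1 -> Tier 2 link quoted above.
dataset_bytes = 10e12          # assumed 10 TB dataset
link_bps = 10e9                # 10 gigabits per second
hours = dataset_bytes * 8 / link_bps / 3600
print(f"{hours:.1f} hours at full line rate")   # ~2.2 hours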

LHC Resource Requirements vs Time (the U.S. portion has a similar increase)
[Chart: CPU capacity (3 GHz P4 ~ kSI2000) vs. year for CERN, Tier-1, and Tier-2 centers; CERN ~ 10-20% of the total; 2008: ~100,000 4 GHz P4s]

LIGO: Search for Gravitational Waves
- LIGO Grid: 6 US sites + 3 EU sites (UK & Germany: Birmingham, Cardiff, AEI/Golm)
- LHO, LLO: LIGO observatory sites; LSC: LIGO Scientific Collaboration

Sloan Digital Sky Survey: Mapping the Sky

Bioinformatics: GADU / GNARE (Genome Analysis Research Environment)
GADU performs:
- Acquisition: acquire genome data from a variety of publicly available databases (e.g. NCBI, PIR, KEGG, EMP, InterPro) and store it temporarily on the file system
- Analysis: run publicly available and in-house tools on the Grid as workflows, using the acquired data and data from the integrated database; applications are executed on the Grid (TeraGrid, OSG, DOE Science Grid, with bidirectional data flow) and results are stored in the integrated database
- Storage: store the parsed data acquired from public databases and the parsed results of the tools and workflows used during analysis
The integrated database includes parsed sequence and annotation data from public web sources and the results of the analysis tools (BLAST, Blocks, TMHMM, …); it connects to SEED (data acquisition), the Shewanella Consortium (genome analysis), and others, and provides services to other groups.
Applications (web interfaces) based on the integrated database: Chisel (protein function analysis tool), TARGET (targets for structural analysis of proteins), PUMA2 (evolutionary analysis of metabolism), PATHOS (pathogenic DB for bio-defense research), PhyloBlocks (evolutionary analysis of protein families)
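The acquire / analyze / store pattern above is easy to picture as a small script. The following is a hypothetical Python sketch, not GADU code: the file, tool, and database names are made up, and the analysis step is only a stand-in for jobs that GADU would actually submit to OSG/TeraGrid as workflows.

# Hypothetical sketch of the acquire -> analyze -> store pattern described above.
# Names are illustrative; GADU's real pipeline, tools, and schemas differ.
import os, sqlite3, tempfile

def acquire(workdir):
    """Stage genome data on the local file system.
    In GADU this step would download from public databases (NCBI, PIR, KEGG, ...)."""
    path = os.path.join(workdir, "sample.fasta")
    with open(path, "w") as f:
        f.write(">seq1\nACGT\n>seq2\nGGCA\n")   # toy stand-in for downloaded data
    return path

def analyze(staged_file):
    """Analyze the staged data; in GADU this would be a tool such as BLAST
    submitted to the Grid as part of a workflow. Here we just count sequences."""
    with open(staged_file) as f:
        return sum(1 for line in f if line.startswith(">"))

def store(db_path, source, n_sequences):
    """Persist parsed results into an integrated database."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS results (source TEXT, n_sequences INTEGER)")
    con.execute("INSERT INTO results VALUES (?, ?)", (source, n_sequences))
    con.commit()
    con.close()

with tempfile.TemporaryDirectory() as wd:
    staged = acquire(wd)
    store(os.path.join(wd, "integrated.db"), os.path.basename(staged), analyze(staged))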

Nanoscience Simulations (nanoHUB.org)
- Real users and real usage: >10,100 users, 1881 simulation users, >53,000 simulations
- Online simulation plus learning modules, courses, tutorials, seminars, and collaboration

Strategy to Incorporate More Disciplines: the OSG Engagement Effort
- Purpose: bring non-physics applications to OSG; led by RENCI (UNC + NC State + Duke)
- Specific targeted opportunities: develop relationships; give direct assistance with the technical details of connecting to OSG
- Feedback and new requirements for the OSG infrastructure, to facilitate inclusion of new communities: more & better documentation, more automation
- Bio services and framework: RENCI Bioportal for biology applications, workflow nodes, coordination with other OSG bio activities

OSG and the Virtual Data Toolkit

Common Middleware: the Virtual Data Toolkit (VDT)
[Diagram: sources (CVS) from contributors such as Globus, Condor, and myProxy are built and tested on an NMI (NSF-supported) Condor pool of ~100 computers running > 20 operating systems; patched and packaged binaries and RPMs are published through a Pacman cache to users]
VDT: package, test, deploy, support, upgrade, troubleshoot

What the VDT Provides
An integrated process for middleware integration:
- Figures out dependencies between software components
- Works with providers for bug fixes
- Provides automatic configuration
- Packages the software
- Tests everything on multiple platforms
Far better than downloading individual components!
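The first item, working out dependencies between components, is essentially a topological sort of the package graph. A minimal sketch (not VDT's actual tooling; the package names and dependencies are made up):

# Toy illustration (not VDT code) of ordering middleware components so that each
# package is built/installed only after the packages it depends on.
from graphlib import TopologicalSorter   # Python 3.9+

# Hypothetical dependency graph: component -> set of components it depends on.
deps = {
    "vdt-client": {"globus", "condor", "myproxy"},
    "globus":     {"openssl"},
    "myproxy":    {"globus"},
    "condor":     set(),
    "openssl":    set(),
}

install_order = list(TopologicalSorter(deps).static_order())
print(install_order)   # e.g. ['condor', 'openssl', 'globus', 'myproxy', 'vdt-client']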

VDT Growth Over 4 Years (version 1.3.11 now)
[Chart: number of major components per VDT release] www.griphyn.org/vdt/

VDT Release Process (“subway map”, from Alain Roy)
From Day 0 to Day N: gather requirements, build the software, test, then move through the validation test bed to a VDT release, through the ITB release candidate and the integration test bed to an OSG release.
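The subway map is a gated pipeline: a candidate release is promoted to the next stage only if the previous one passes. A toy sketch of that control flow (stage names from the slide; everything else is illustrative, not the real tooling):

# Illustrative gated promotion through the release stages named above.
stages = ["build", "test", "validation test bed", "ITB release candidate",
          "integration test bed", "OSG release"]

def run_stage(name):
    """Stand-in for the real work of a stage; always succeeds in this toy example."""
    print(f"running: {name}")
    return True

for stage in stages:
    if not run_stage(stage):
        print(f"stopped: {stage} failed; the candidate is not promoted further")
        break
else:
    print("candidate promoted all the way to the OSG release")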

OSG Operations and Usage

OSG Operations
- Distributed model (VOs, sites, providers): scalability!
- Rigorous problem tracking & routing
- Security, provisioning, monitoring, reporting
- Partner with EGEE operations

Running Jobs/VO (Nov. 2005): 1300 running jobs

Running Jobs/VO (Aug. – Oct. 2006)
[Chart: running jobs per VO; CMS, ATLAS, and CDF labeled]

Running Jobs/Site (Aug. – Oct. 2006)

Completed Jobs/Week
[Chart: completed jobs per week, June – September 2006, reaching ~300K]

Collaborative Work with Optical Networks

Collaboration with Internet2 (www.internet2.edu)

Collaboration with National Lambda Rail (www.nlr.net)
- Optical, multi-wavelength, community owned or leased “dark fiber” (10 GbE) networks for R&E
- Spawning state-wide and regional networks (FLR, SURA, LONI, …)

UltraLight: Integrating Advanced Networking in Applications (http://www.ultralight.org)
- Funded by NSF
- 10 Gb/s+ network: Caltech, UF, FIU, UM, MIT, SLAC, FNAL
- International partners; Level(3), Cisco, NLR

REDDnet: National Networked Storage
- NSF-funded project (Vanderbilt); 8 initial sites; Brazil?
- Multiple disciplines: satellite imagery, HEP, Terascale Supernova Initiative, structural biology, bioinformatics
- Storage: 500 TB disk, 200 TB tape

LHCNet: Transatlantic Link to CERN (NSF/IRNC, DOE/ESnet, 2006/2007)
[Map of the ESnet/LHCNet topology; legend information:]
- LHCNet Data Network: 4 x 10 Gbps to the US; NSF/IRNC circuit; GVA-AMS connection via SURFnet or GEANT2
- ESnet IP core (≥10 Gbps): production IP, enterprise traffic
- ESnet Science Data Network core (2nd core, National Lambda Rail): 30-50 Gbps, 40-60 Gbps circuit-based transport
- Metropolitan area rings; major DOE Office of Science sites (e.g. BNL, FNAL); high-speed cross connects with Internet2/Abilene; international links (GEANT2, SURFnet, IN2P3, AsiaPac)

OSG Training, Outreach, Communications

Grid Summer Schools
- Sponsored by iVDGL + UT Brownsville (2004, 2005, 2006); 1 week at South Padre Island, Texas
- Lectures plus hands-on exercises for ~40 students of differing backgrounds (physics + CS), including minorities
- Aim to reach a wider audience: lectures, exercises, and video on the web for students, postdocs, and scientists; more tutorials, 3-4/year; experiment-specific and agency-specific tutorials

UUEO Initiative
- Federation of projects for secondary & “informal” education
- I2U2 funded at ~$1M (2005-2007)

QuarkNet/GriPhyN e-Lab Project: analysis of high school cosmic ray data; now part of the I2U2 program (www.i2u2.org)

CHEPREO: Center for High Energy Physics Research and Educational Outreach (Florida International University, www.chepreo.org)
- Physics Learning Center, CMS research, cyberinfrastructure, WHREN network (S. America)
- 2003 – 2008 + more? $4M from MPS, CISE, EHR, OISE; new faculty, postdocs
- Additional initiatives: CyberBridges, Global CyberBridges, networking initiatives, etc.

Science Grid Communications: Science Grid This Week (Katie Yurkewicz)
- 1.5 years: >1000 subscribers
- Going international in Jan. 2007 as “iSGTW”
- www.interactions.org/sgtw

OSG Newsletter: monthly newsletter (Katie Yurkewicz); 9 issues so far; www.opensciencegrid.org/osgnews

Grid Technology Cookbook: A guide to building and using grid resources
- iVDGL + TATRC funded (Mary Trauner, Mary Fran Yafchak)
- Outline: Acknowledgements; Preface; Introduction; What Grids Can Do For You; Grid Case Studies; Technology For Grids; Standards & Emerging Technologies; Programming Concepts & Challenges; Building Your Own Grid; Installation Procedure Examples; Typical Usage Examples; Practical Tips; Glossary; Appendices
- Current timetable (2005 – 06):
  - Outline development, vetting: September – October
  - Assemble writing teams: October – December
  - Develop web structure: November – December
  - Writing process underway: November – March
  - Material edited and entered: December – April
  - Review of first draft: May
  - Edits to first draft entered: early June
  - Review of final draft: late June
  - Release of version 1: July 2006

Alleviating the Digital Divide
- Background: ICFA/SCIC (Standing Committee on Inter-regional Connectivity)
- Themes: global collaborations, Grids, and addressing the Digital Divide; focus on poorly connected regions
- Workshops: Brazil (2004), Korea (2005), Poland (2006)

OSG Today

OSG: Funding History & Milestones
[Timeline, 2000 – 2007: first US-LHC Grid testbeds; PPDG, $9.5M; GriPhyN, $12M; iVDGL, $14M; CHEPREO, $4M; UltraLight, $2M; DISUN, $10M; I2U2, $1M; OSG, $30M (NSF, DOE); VDT 1.0; VDT 1.3; Grid3 start; OSG start; LIGO Grid; LHC start; Grid Summer Schools 2004, 2005, 2006; Digital Divide Workshops 04, 05, 06; Grid Communications]

OSG Project funding (Sep. 7): $30M = $6.1M x 5 yrs
- ~50% NSF: MPS, OISE, OCI
- ~50% DOE: SciDAC II (www.scidac.gov/physics/petascale.html)

OSG Organization
[Organization chart, listing: OSG Consortium; contributors (universities, laboratories, sites, service providers, VOs, researchers, computer science, grid projects, …); partners (campus grids, EGEE, TeraGrid); OSG Council and Council Chair, with Scientific Advisory Group and OSG Users Group (program oversight); Executive Board with Executive Director, Applications Coordinator, Education Coordinator, Facility Coordinator, Resources Manager, Engagement Coordinator, Operations Coordinator, Middleware Coordinator, Security Officer, liaison to EU Grid projects, and liaison to TeraGrid/US Grid projects; Finance Board (Resources Manager); Executive Team; OSG Facility (Facility Coordinator); projects with project managers and resource managers (line reporting); contributing & interfacing projects (MOUs, etc.); project and contributor technical managers; advisory relationships]

Project Execution Plan (PEP): FTEs
- Facility operations: 5.0
- Security and troubleshooting: 4.5
- Software release and support: 6.5
- Engagement: 2.0
- Education, outreach & training: 2.0
- Facility management: 1.0
- Extensions in capability and scale: 9.0
- Staff: 3.0
- Total: 33 FTEs

OSG Project Effort Distribution: Year 1
Developing procedures and structures for a coherent project:
- Each institution must sign a Statement of Work (taking place now)
- Each individual submits open monthly written reports (fall 2006)
- The Finance Board reviews the accounts and deliverables (FB exists)
- The Executive Board reviews plans and achievements (EB exists)
- Activities are covered by the Project Plan and WBS (PEP & WBS exist)
- Effort distribution is reviewed & potentially modified each year

OSG PEP - High-Level Milestones
- 2006Q3: Release OSG software stack version 0.6.0; project baseline review
- 2006Q4: Sign off on the OSG Security Plan; meet operational metrics for 2006
- 2007Q1: Accounting reports available for users and resource owners
- 2007Q2: Production use of OSG by one additional science community; OSG-TeraGrid software releases based on the same NMI software base; release OSG software version 0.8.0 (complete extensions for LHC data taking); support for ATLAS and CMS data taking
- 2007Q3: 1-year project review
- 2007Q4: Meet 2007 deliverables as defined by science stakeholders; meet operational metrics for 2007; release OSG software version 1.0
- 2008Q2: Production use of OSG by 2 additional science communities
- 2008Q3: OSG-TeraGrid production service interoperation; 2nd-year project review
- 2008Q4: Meet 2008 deliverables as defined by science stakeholders; meet operational metrics for 2008
- 2009Q2: Support for all STAR analysis (10,000 jobs/day)
- 2010Q1: Support for data taking with an order-of-magnitude increase in LIGO sensitivity

Security, Safety, Risk Management
- Assess, monitor & respond to security issues (Security Officer)
- Each site is responsible for local security and incident reporting
- The OSG security plan is modeled on the NIST process

Scaling of LHC & LIGO in 2008-2009
- Data distribution: routinely >1 GB/s at ~10-20 sites
- Workflows: >10,000 batch jobs per client
- Jobs/day: >20,000 per VO with >99% success rate
- Accessible storage: >10 PB
- Facility availability/uptime: >99.x% with no single points of failure
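For a sense of what the jobs/day target implies operationally, a one-line calculation using only the figures quoted above:

# ">20,000 jobs/day per VO at >99% success" leaves at most ~1% of jobs to chase and resubmit.
jobs_per_day, success_rate = 20_000, 0.99
print(f"~{jobs_per_day * (1 - success_rate):.0f} failed jobs/day per VO")   # ~200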

Continued Focus on OSG Core Competencies
- Integration: software, systems, virtual organizations
- Operations: common support & grid services
- Inter-operation with other grids (TeraGrid, EGEE, caBIG, …): bridging administrative & technical boundaries
- With validation, verification, and diagnosis at each step; with integrated security operations and management

Core Operations and Common Support
Join OSG in 1-2-3:
1. A VO registers with the Operations Center; users register with their VO
2. Sites register with the Operations Center
3. VOs and sites provide a Support Center contact and join Ops groups
The OSG VO: for individuals & small groups, managed by OSG; a good learning environment

END

Future Astronomy OSG Projects
The Fermilab Experimental Astrophysics Group (EAG) has 4 projects planned for the Open Science Grid:
- Fitting SDSS quasar spectra by genetic algorithm
- Simulation effort for the Dark Energy Survey (DES)
- Search for near-Earth asteroids (NEOs) in the SDSS imaging data
- Co-addition of the SDSS Southern Stripe (COADD)

Integral Role of Computing at the LHC: Technical Design Reports (TDRs), 100s of pages apiece, covering CPU, storage, and international optical networks

CPU & Storage Projections of Current HEP Experiments
- Primarily driven by increasing data-taking rates; similar increases in other disciplines
- 2008 data volume: 5.7 PB; 2008 CPU: ~8,000 3 GHz CPUs

Long-Term Trends in Network Traffic Volumes: 300-1000X per 10 Years
[Chart: ESnet accepted traffic 1990 – 2005, terabytes per month (W. Johnston); exponential growth, avg. +82%/yr for the last 15 years, with progress in steps of ~10X every 4 years (ESnet limit); 2005 SLAC traffic ~400 Mbps on 2 x 10 Gbit/s links (L. Cottrell)]
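The headline 300-1000X per decade is consistent with the quoted average growth rate; a one-line check:

# 82%/yr compounded over 10 years vs. the 300-1000X per decade headline.
print(f"{1.82 ** 10:.0f}x over 10 years")   # ~400x, inside the 300-1000x range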

LHC Tier0–Tier1–Tier2 Network Connections

OSG Roots: the “Trillium Consortium”
- Trillium Grid projects: PPDG + GriPhyN + iVDGL, i.e. PPDG $12M (DOE, 1999 – 2006), GriPhyN $12M (NSF, 2000 – 2005), iVDGL $14M (NSF, 2001 – 2006)
- Large science experiments: ATLAS, CMS, LIGO, SDSS
- Supplements + new projects: UltraLight, CHEPREO, DISUN ($17M)
- Total ~150 people, with many overlaps between projects: universities, labs, SDSC, foreign partners
- Historically, a strong driver for funding agency collaboration: inter-agency (NSF – DOE) + intra-agency (directorate – directorate); coordination vital for meeting broad goals
- CS research, developing/supporting the Virtual Data Toolkit (VDT); multiple Grid deployments using VDT-based middleware; a unified entity when collaborating internationally

GLORIAD: 10 Gbps Global Optical Ring (complete by March 2007)
Partnership: China, Russia, Korea, Japan, US, Netherlands; US funding: NSF IRNC program
GLORIAD circuits today:
- 10 Gbps Hong Kong - Daejon - Seattle
- 10 Gbps Seattle - Chicago - NYC
- 622 Mbps Moscow - AMS - NYC
- 2.5 Gbps Moscow - AMS
- 155 Mbps Beijing - Khabarovsk - Moscow
- 2.5 Gbps Beijing - Hong Kong
- 1 GbE NYC - Chicago (CANARIE)