Slide 1: Open Science Grid: Linking Universities and Laboratories in National Cyberinfrastructure
Paul Avery, University of Florida
GridChem Workshop, University of Texas at Austin
Austin, Texas, March 9, 2006

Slide 2: OSG Roots: The "Trillium Consortium"
- Trillium = PPDG + GriPhyN + iVDGL
  - PPDG: $12M (DOE), 1999-2006
  - GriPhyN: $12M (NSF), 2000-2005
  - iVDGL: $14M (NSF), 2001-2006
  - Large science experiments (HEP/LHC, LIGO, SDSS)
  - Total ~150 people, with many overlaps between projects
  - Universities, labs, foreign partners
- Historically, a strong driver for funding-agency collaboration
  - Inter-agency (NSF-DOE) and intra-agency (Directorate-Directorate)
  - Coordination vital for meeting broad goals
- CS research; developing and supporting the Virtual Data Toolkit (VDT)
- Multiple Grid deployments using VDT-based middleware
- Deployment of Grid3, a general-purpose national Grid
- Unified entity when collaborating internationally

Slide 3: Scale of OSG Resources & Services Set by the Large Hadron Collider (LHC) Experiments
- Physics goals: search for the origin of mass, new fundamental forces, supersymmetry, and other new particles
- Data taking from 2007 onward
- 27 km tunnel at CERN, straddling Switzerland and France
- Experiments: ATLAS, CMS, ALICE, LHCb, TOTEM

Slide 4: LHC: Beyond Moore's Law
[Chart comparing projected LHC CPU requirements against a Moore's Law extrapolation from 2000]

Slide 5: LHC: Petascale Global Science
- Complexity: millions of individual detector channels
- Scale: PetaOps of CPU, hundreds of Petabytes of data
- Distribution: global distribution of people and resources
- CMS example: physicists from 250+ institutes in 60+ countries
- BaBar/D0 example: physicists from 100+ institutes in 35+ countries

Slide 6: CMS Experiment: LHC Global Data Grid (2007+)
- Tiered architecture: the online system feeds the CERN computer center (Tier 0); Tier 1 centers in the USA, Korea, Russia, and the UK; Tier 2, Tier 3, and Tier 4 sites at universities (Caltech, UCSD, U Florida, Iowa, Maryland, FIU) with physics caches and PCs
- Tier 0 to Tier 1 links run at >10 Gb/s
- 5000 physicists, 60 countries
- 10s of Petabytes/yr by 2008 (a rough rate-to-volume check follows below)
- 1000 Petabytes in < 10 yrs?
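
As a back-of-the-envelope check (mine, not the slide's), the sketch below converts a sustained wide-area link rate into an annual data volume, which is how a >10 Gb/s Tier 0 to Tier 1 link lines up with "10s of Petabytes/yr":

    # Rough conversion from a sustained link rate to yearly data volume.
    # Assumes a perfectly utilized link; real transfers have much lower duty cycles.
    SECONDS_PER_YEAR = 365 * 24 * 3600

    def petabytes_per_year(gigabits_per_second: float) -> float:
        bytes_per_second = gigabits_per_second * 1e9 / 8
        return bytes_per_second * SECONDS_PER_YEAR / 1e15

    # A 10 Gb/s link running flat out moves roughly 39 PB in a year,
    # consistent with the "10s of Petabytes/yr" scale quoted on the slide.
    print(f"{petabytes_per_year(10):.0f} PB/yr at 10 Gb/s")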

Slide 7: LIGO Grid
- LIGO Grid: 6 US sites + 3 EU sites (UK & Germany)
- LHO, LLO: LIGO observatory sites
- LSC: LIGO Scientific Collaboration
- EU sites: Cardiff, AEI/Golm, Birmingham

Slide 8: Common Middleware: Virtual Data Toolkit
- Sources are checked out of CVS, patched, and built into GPT source bundles
- NMI build & test facility: a Condor pool spanning 22+ operating systems runs the build, test, and package steps
- The VDT build, with many contributors, publishes binaries and RPMs through a Pacman cache, followed by a further round of testing (a hedged install sketch follows below)
- VDT: package, test, deploy, support, upgrade, troubleshoot
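
For a sense of the deployment side, sites in this period pulled VDT components from a Pacman software cache. The snippet below is only a sketch wrapped in Python for illustration: the pacman invocation and the "VDT:Condor" cache:package label are assumptions recalled from VDT-era usage, and the real cache and package names should be taken from the VDT documentation.

    import subprocess

    # Hypothetical deployment step: fetch a VDT-packaged component from a Pacman
    # software cache. Both the command form and the "VDT:Condor" label are assumptions.
    def fetch_vdt_package(label: str = "VDT:Condor") -> None:
        subprocess.run(["pacman", "-get", label], check=True)

    fetch_vdt_package()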

Slide 9: VDT Growth Over 4 Years
[Chart: growth of the VDT across releases over four years]

Slide 10: Grid3: A National Grid Infrastructure
- October 2003 - July 2005
- 32 sites, 3,500 CPUs: universities + 4 national labs
- Sites in the US, Korea, Brazil, Taiwan
- Applications in HEP, LIGO, SDSS, genomics, fMRI, CS

Slide 11: Grid3 Lessons Learned
- How to operate a Grid as a facility
  - Security, services, error recovery, procedures, docs, organization
  - Delegation of responsibilities (project, VO, service, site, ...)
  - Crucial role of the Grid Operations Center (GOC)
- How to support people-to-people relations
  - Face-to-face meetings, phone conferences, one-on-one interactions, mail lists, etc.
- How to test and validate Grid tools and applications
  - Vital role of testbeds
- How to scale algorithms, software, and processes
  - Some successes, but "interesting" failure modes still occur
- How to apply distributed cyberinfrastructure
  - Successful production runs for several applications

Slide 12: Open Science Grid: July 20, 2005
- VO based: a partnership of many organizations
- Production Grid: 50+ sites, 19,000 CPUs "present" (available, but not all at one time)
- Sites in the US, Korea, Brazil (Sao Paulo), Taiwan
- Integration Grid: ~15 sites

Slide 13: OSG Operations Snapshot (24 Hr)

Slide 14: OSG Operations Snapshot (30 Day)
[Usage snapshot covering the 30 days ending November 7]

Slide 16: Registered VOs (incomplete)

Slide 17: Creating & Registering a VO with OSG
To form a Virtual Organization (VO) that participates in the Open Science Grid, one needs the following:
1. A charter statement describing the purpose of the VO. It should be short, yet concise enough to scope the intended usage of OSG resources.
2. At least one participating organization that is a member of, or partner with, the Open Science Grid Consortium.
3. A VO Membership Service that meets the requirements of an OSG release. This means being able to provide a full list of members' DNs to edg-mkgridmap; the currently recommended way is to deploy the VDT VOMS from the OSG software package (see the sketch after this list).
4. A support organization (called a Support Center in OSG parlance) that will support the VO in OSG Operations. The Support Center should provide at least: a written description of the registration process, instructions for VO members on how to complete the VO registration process, and instructions for VO members on how to report problems and/or obtain help.
5. Completion of the OSG VO registration form, following the posted registration instructions.
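
As an illustration of item 3, the sketch below writes grid-mapfile entries of the kind edg-mkgridmap builds from a VO membership service; the DN and the local account name are invented placeholders, not real OSG entries.

    # Minimal sketch: map VO member certificate DNs to a local account,
    # producing grid-mapfile entries like those edg-mkgridmap generates.
    # The DN and the "myvo" account below are made-up examples.
    members = [
        "/DC=org/DC=example/OU=People/CN=Jane Example 123456",
    ]

    with open("grid-mapfile", "w") as out:
        for dn in members:
            out.write(f'"{dn}" myvo\n')  # quoted DN, then the mapped local account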

Slide 18: VO Support Matrix
- Green: DNs are mapped to this VO and compute element
- Yellow: no DNs are supported under this VO and compute element
- Black: no information
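
The color coding amounts to a simple lookup per (VO, compute element) cell; a minimal sketch, where the function and argument names are mine rather than OSG's:

    def cell_color(has_info: bool, dns_mapped: bool) -> str:
        """Color of one (VO, compute element) cell in the support matrix."""
        if not has_info:
            return "black"                          # no information reported
        return "green" if dns_mapped else "yellow"  # green: DNs mapped; yellow: none supported

    # Example: a compute element that reports data but maps no DNs for the VO shows yellow.
    assert cell_color(True, False) == "yellow"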

Slide 19: OSG Integration Testbed
- Test and validate new middleware & services
- Test and validate new applications
- Meets weekly (highly dynamic membership)
- Sites include Brazil, Taiwan, and Korea

Slide 20: OSG Organization
- OSG Consortium
  - OSG Council (Council Chair): program oversight
  - Scientific Advisory Group (advisory)
  - Contributors: universities, laboratories, sites, service providers, VOs, researchers, computer science, Grid projects, ...
  - Partners: campus Grids, EGEE, TeraGrid
- Executive Board: Executive Director and Executive Team (Applications Coordinator, Education Coordinator, Facility Coordinator, Resources Manager)
- OSG Facility (Facility Coordinator): Engagement Coordinator, Operations Coordinator, Middleware Coordinator, Security Officer; liaisons to EU Grid projects and to TeraGrid/US Grid projects
- Projects: project managers, resource managers, project technical managers, contributor technical managers, ...
- Finance Board (Resources Manager)
- OSG Users Group (Applications Coordinator)
- Relationships shown: line reporting; contributing & interfacing (MOUs, etc.); advisory

Slide 21: Process for Deploying New OSG Service

Slide 22: OSG Participating Disciplines
- Computer Science: Condor, Globus, SRM, SRB
- Physics: LIGO, nuclear physics, Tevatron, LHC global Grids
- Astrophysics: Sloan Digital Sky Survey
- Nanoscience: Purdue
- Bioinformatics: Argonne GADU project; Dartmouth Psychological & Brain Sciences (BLAST, BLOCKS, gene sequences, functional MRI, etc.)
- Computational Chemistry: ChemGrid
- University campus resources, portals, and applications: CCR (U Buffalo), GLOW (U Wisconsin), TACC (Texas Advanced Computing Center), MGRID (U Michigan), UFGRID (U Florida), Crimson Grid (Harvard), FermiGrid (Fermilab)

Slide 23: OSG Grid Partners
- TeraGrid: "DAC2005" to run LHC applications on TeraGrid resources; TG Science Portals for other applications; discussions on joint activities in security, accounting, operations, and portals
- EGEE: joint operations workshops defining mechanisms to exchange support tickets; joint security working group; US middleware federation contributions to the gLite core middleware
- Worldwide LHC Computing Grid: OSG contributes to the LHC global data handling and analysis systems
- Other partners: SURA, GRASE, LONI, TACC; representatives of VOs provide portals and interfaces to their user groups

Slide 24: Example of Partnership: WLCG and EGEE

Slide 25: OSG Activities
- Blueprint: defining principles and best practices for OSG
- Deployment: deployment of resources & services
- Provisioning: connected to deployment
- Incident response: plans and procedures for responding to security incidents
- Integration: testing, validating, and integrating new services and technologies
- Data Resource Management (DRM): deployment of specific Storage Resource Management technology
- Documentation: organizing the documentation infrastructure
- Accounting: accounting and auditing use of OSG resources
- Interoperability: primarily interoperability between ...
- Operations: operating Grid-wide services

Slide 26: Networks

Slide 27: Evolving Science Requirements for Networks (DOE High Performance Network Workshop)
- High Energy Physics: today 0.5 Gb/s; 5 years 100 Gb/s; 5-10 years 1000 Gb/s. High bulk throughput.
- Climate (Data & Computation): today 0.5 Gb/s; 5 years (figure missing); 5-10 years N x 1000 Gb/s. High bulk throughput.
- SNS NanoScience: today not yet started; 5 years 1 Gb/s; 5-10 years 1000 Gb/s + QoS for control channel. Remote control and time-critical throughput.
- Fusion Energy: today 0.066 Gb/s (500 MB/s burst); 5 years 0.2 Gb/s (500 MB per 20 sec burst); 5-10 years N x 1000 Gb/s. Time-critical throughput.
- Astrophysics: today 0.013 Gb/s (1 TB/week); 5 years N*N multicast; 5-10 years 1000 Gb/s. Computational steering and collaborations.
- Genomics Data & Computation: today 1 TB/day (Gb/s figure missing); 5 years 100s of users; 5-10 years 1000 Gb/s + QoS for control channel. High throughput and steering.
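
The mixed units in the table convert directly; the small check below (mine, not the workshop's) shows how "1 TB/week" and "1 TB/day" translate into average Gb/s figures like those quoted.

    # Convert a data volume per period into an average end-to-end rate in Gb/s.
    def tb_per_period_to_gbps(terabytes: float, period_seconds: float) -> float:
        return terabytes * 8e12 / period_seconds / 1e9

    WEEK = 7 * 24 * 3600
    DAY = 24 * 3600

    print(f"1 TB/week ~ {tb_per_period_to_gbps(1, WEEK):.3f} Gb/s")  # ~0.013, matching the astrophysics row
    print(f"1 TB/day  ~ {tb_per_period_to_gbps(1, DAY):.3f} Gb/s")   # ~0.093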

Slide 28: Integrating Advanced Networking in Applications
- UltraLight: a 10 Gb/s+ network
- Partners: Caltech, UF, FIU, UM, MIT, SLAC, FNAL, and international partners
- Network and industry partners: Level(3), Cisco, NLR

Slide 29: Training, Outreach, Communications

Slide 30: Grid Summer Schools
- June 2004: first US Grid tutorial (South Padre Island, TX); 36 students of diverse origins and types
- July 2005: second Grid tutorial (South Padre Island, TX); 42 students, simpler physical setup (laptops)
- June 23-27, 2006: third Grid tutorial (South Padre Island, TX); reaching a wider audience
  - Lectures, exercises, and video on the web
  - Students, postdocs, scientists
- Coordination of training activities
  - More tutorials, 3-4 per year
  - Agency-specific tutorials

Slide 31: Grid Technology Cookbook
A guide to building and using grid resources.
Current timetable (2005-06):
- Outline development and vetting: September-October
- Assemble writing teams: October-December
- Develop web structure: November-December
- Writing process underway: November-March
- Material edited and entered: December-April
- Review of first draft: May
- Edits to first draft entered: early June
- Review of final draft: late June
- Release of Version 1: July 2006
Planned contents: Acknowledgements; Preface; Introduction; What Grids Can Do For You; Grid Case Studies; Technology For Grids; Standards & Emerging Technologies; Programming Concepts & Challenges; Building Your Own Grid; Installation Procedure Examples; Typical Usage Examples; Practical Tips; Glossary; Appendices.

Slide 32: QuarkNet/GriPhyN e-Lab Project

Slide 33: CHEPREO: Center for High Energy Physics Research and Educational Outreach (Florida International University)
- Physics Learning Center
- CMS research
- Cyberinfrastructure
- WHREN network (South America)
- Funded September 2003: $4M from NSF (MPS, CISE, EHR, INT)

Slide 34: Grids and the Digital Divide
Background:
- World Summit on the Information Society
- HEP Standing Committee on Inter-regional Connectivity (SCIC)
Themes:
- Global collaborations, Grids, and addressing the Digital Divide
- Focus on poorly connected regions
- Workshops: Brazil (2004), Korea (2005)

Slide 35: Science Grid Communications
Broad set of activities (Katie Yurkewicz):
- News releases, PR, etc.
- Science Grid This Week
- OSG Monthly Newsletter

Slide 36: OSG Newsletter
- Monthly newsletter (Katie Yurkewicz)
- 4 issues so far

Slide 37: Grid Timeline
- Projects: GriPhyN ($12M), PPDG ($9.5M), iVDGL ($14M), UltraLight ($2M), CHEPREO ($4M), DISUN ($10M)
- Funding agencies: NSF, DOE SciDAC
- Milestones: VDT 1.0; first US-LHC Grid testbeds; Grid3 operations; OSG operations; Grid Communications; Grid Summer Schools '04, '05, '06; Digital Divide workshops; LIGO Grid; start of LHC; OSG funded?

Slide 38: OSG Consortium Meetings
- July 20, 2005: University of Wisconsin, Milwaukee; kickoff meeting, ~100 attendees; focus on getting off the ground with running jobs
- January 23, 2006: University of Florida (Gainesville); ~110 people; partnerships, organization, funding, operations, software infrastructure
- August 21-24, 2006: University of Washington (Seattle)
- January 2007: TACC

Slide 39: Jan. OSG Meeting

Slide 40: END

Slide 41: Grid Project References
- Open Science Grid
- Grid3
- Virtual Data Toolkit
- GriPhyN
- iVDGL
- PPDG
- CHEPREO
- UltraLight
- Globus
- Condor
- WLCG
- EGEE

Slide 42: Sloan Digital Sky Survey (SDSS): Using Virtual Data in GriPhyN
[Figure: galaxy cluster size distribution from Sloan data]

Slide 43: The LIGO Scientific Collaboration (LSC) and the LIGO Grid
- LIGO Grid: 6 US sites + 3 EU sites (Cardiff/UK, AEI/Germany)
- LHO, LLO: LIGO observatory sites
- LSC: LIGO Scientific Collaboration
- EU sites: Cardiff, AEI/Golm, Birmingham