The Open Science Grid (OSG). Ruth Pordes, Fermilab.

2 What is OSG? A consortium of people working together:
- resource providers interfacing farms and storage to a grid,
- researchers using these resources by adapting their applications to run on the grid,
- software developers providing middleware,
and a project that provides the operations, support, training and help to make it effective.

3 Who is OSG?
- Large global physics collaborations: US ATLAS, US CMS, LIGO, CDF, D0, STAR.
- Research collaborations such as Mariachi and GROW.
- Grid technology groups: Condor, Globus, SRM, NMI.
- Many DOE labs and DOE/NSF-sponsored university IT facilities.
- Partnerships with TeraGrid and EGEE, and campus grids such as TACC, GLOW, etc.

4 OSG Consortium

5 When is OSG? Grown from a grass-roots collaboration of GriPhyN, iVDGL and PPDG participants, with funding starting ~9/2006 from DOE SciDAC-II and NSF MPS and OCI. Must deliver at US LHC and LIGO scales in 2008 and 2009:
- routinely distribute data at 1-5 Gbps between sites,
- routinely exceed 10,000 running jobs per client,
- reach a 99% success rate for 10,000 jobs per day submitted under heavy load.
An active engagement effort, centered at RENCI, brings in new sciences.
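A quick back-of-envelope check (in Python) of what these targets imply; the numbers are taken directly from the bullets above:

    # Rough arithmetic behind the 2008/2009 scale targets above.
    GBPS_BYTES = 1e9 / 8          # bytes per second for 1 Gbps

    for gbps in (1, 5):
        tb_per_day = gbps * GBPS_BYTES * 86400 / 1e12
        print(f"{gbps} Gbps sustained is about {tb_per_day:.1f} TB moved per day")

    jobs, success = 10_000, 0.99
    print(f"{jobs} jobs/day at {success:.0%} success allows only "
          f"{int(jobs * (1 - success))} failures per day")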

6 The OSG Map Aug-2006

7 OSG's world is flat: a Grid of Grids, from local to global. [Diagram: local campus and regional grids (e.g. FermiGrid, NWICG) feed global science community systems (e.g. CMS, D0) and national cyberinfrastructures for science (e.g. OSG, TeraGrid).]

8 From the Local (Campus) Grids to the Wide Area Grid. Within an organization, a local grid provides simplicity in sharing and efficiency in purchasing and administration. However, researchers collaborate beyond the bounds of a single campus. With a uniform environment, acting globally is just an extension of acting locally. Open Science Grid focuses on interoperation of the local and the wide area.

9 A resource can be accessed by a user via the campus, community or national grid. A user can access a resource with a campus, community or national grid identity.

10 Example Uses
- High Energy Physics: ATLAS has simulated >15 million proton collision events at ~10 minutes each; CMS has simulated, reconstructed and analyzed >70 million events.
- Biology (GADU): populates databases from search and analysis of similarities and differences among thousands of publicly available genome and protein sequences and metabolic pathways.
- Gravitational Wave Physics (LIGO Data Grid): uses grid tools to ensure that 9 computing sites have a copy of the interesting data; researchers at 36 LSC institutions use the LDG to find the data they need.
- Math research.
- Education: the Grid Summer Workshop teaches students to run jobs on OSG and TeraGrid.

11 Running (and monitored) “OSG jobs” in 06/06.

12 Example GADU run in 04/06

13 Integration Testing of the System. The multi-site Integration Grid tests new OSG releases and configurations. Software readiness and validation occur before deployment on the Integration Grid. [Map: Integration Grid sites.]

14 CMS: the US part of a Global Community Grid. Data & jobs move locally, regionally & globally within the CMS grid, transparently across grid boundaries from campus to global. [Map: CMS sites spanning OSG and EGEE, including CERN, Caltech, Florida, Wisconsin, UCSD, Purdue, MIT and UNL in the USA, plus Germany, Taiwan, the UK, Italy and France.]

15 How do People and Organizations Participate?
- A VO registers with the Operations Center and signs the VO Agreement.
- A user registers with a VO:
  - the user is added to the VOMS of one or more VOs,
  - the VO is responsible for having its users sign the AUP,
  - the VO is responsible for VOMS service support.
- A site registers with the Operations Center, signs the Service Agreement, and agrees on which VOs to support (striving for default admit).
- VOs and sites provide a Support Center contact and join operations groups.
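Once registered in a VO's VOMS, a user typically authenticates by generating a VOMS proxy before doing anything on the grid. A minimal sketch driving the standard VOMS command-line tools from Python; the VO name "myvo" is a placeholder:

    import subprocess

    vo = "myvo"  # placeholder for the VO the user registered with

    # Create a short-lived proxy certificate carrying the VO's VOMS attributes.
    subprocess.run(["voms-proxy-init", "-voms", vo], check=True)

    # Inspect the proxy: subject, issuer, attributes, remaining lifetime.
    subprocess.run(["voms-proxy-info", "-all"], check=True)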

16 Community Documentation.

18 OSG Principles
- Systems are Virtual Organization scoped: groups of users working together with a shared, common environment.
- Sites maintain control, authority and management of the use of their grid-accessible resources.
- Opportunistically available resources are beneficial.
- Priorities are governed by policies (role-based within a VO).
- The distributed system is heterogeneous: information must be provided to allow applications to know which resources they can successfully use (see the query sketch below).
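To make the last principle concrete: on OSG of this era, resource information was published in the GLUE schema via Globus MDS (see slide 22), so an application could discover compute elements with an LDAP query. A sketch under those assumptions; the host name is hypothetical and the port and base DN follow common MDS conventions:

    import subprocess

    host = "gatekeeper.example.edu"  # hypothetical site information endpoint

    # Query the site's GLUE-schema compute-element entries over LDAP
    # (2135 was the conventional Globus MDS/GRIS port).
    subprocess.run([
        "ldapsearch", "-x",
        "-H", f"ldap://{host}:2135",
        "-b", "mds-vo-name=local,o=grid",
        "(objectClass=GlueCE)",
        "GlueCEName", "GlueCEInfoTotalCPUs",
    ], check=True)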

19 VO software stacks rely on the Virtual Data Toolkit. [Diagram: layered stack, bottom to top:]
- NSF Middleware Initiative (NMI): Condor, Globus, MyProxy.
- Virtual Data Toolkit (VDT): NMI + common services: VOMS, CEMon (common EGEE components), MonALISA, Clarens, AuthZ.
- OSG Release Cache: VDT + configuration, validation, VO management.
- On top, the infrastructure applications and VO frameworks: LHC services & framework, LIGO Data Grid, bio services & framework, OSG VO framework, ...

20 What is the VDT?
- A collection of software:
  - grid software (Condor, Globus and lots more),
  - the Virtual Data System (origin of the name "VDT"),
  - utilities.
- An easy installation. Goal: push a button, everything just works. Two methods:
  - Pacman: installs and configures it all,
  - RPM: installs some of the software, no configuration.
- A support infrastructure.
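The Pacman method ("push a button") amounts to pointing pacman at a VDT software cache and naming a package; pacman then fetches, installs and configures it along with its dependencies. An illustrative sketch; the cache version in the URL and the package choice are placeholders, not a specific release:

    import subprocess

    # Illustrative only: real installs name a specific versioned VDT cache.
    cache = "http://vdt.cs.wisc.edu/vdt_151_cache"   # placeholder version
    package = "Condor"

    # pacman fetches, installs and configures the package and its dependencies.
    subprocess.run(["pacman", "-get", f"{cache}:{package}"], check=True)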

21 Who uses the VDT?
- Open Science Grid
- LIGO Data Grid
- LCG (the LHC Computing Grid, from CERN)
- EGEE (Enabling Grids for E-sciencE)

22 What software is in the VDT?
- Security: VOMS (VO membership), GUMS (local authorization), mkgridmap (local authorization), MyProxy (proxy management), GSI SSH, CA CRL updater.
- Monitoring: MonALISA, gLite CEMon.
- Accounting: OSG Gratia.
- Job management: Condor (including Condor-G & Condor-C), Globus GRAM.
- Data management: GridFTP (data transfer), RLS (replica location), DRM (storage management), Globus RFT.
- Information services: Globus MDS, GLUE schema & providers.
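As a flavor of the job-management layer: with Condor-G, a grid job is described in a submit file and handed to condor_submit, which forwards it to a remote site through Globus GRAM. A minimal sketch; the gatekeeper host and job files are hypothetical:

    import subprocess
    import textwrap

    # Minimal Condor-G submit description targeting a hypothetical
    # GT2 gatekeeper via Globus GRAM.
    submit = textwrap.dedent("""\
        universe      = grid
        grid_resource = gt2 gatekeeper.example.edu/jobmanager-condor
        executable    = analyze.sh
        output        = job.out
        error         = job.err
        log           = job.log
        queue
        """)

    with open("job.sub", "w") as f:
        f.write(submit)

    subprocess.run(["condor_submit", "job.sub"], check=True)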

23 What software is in the VDT? (continued)
- Client tools: Virtual Data System, SRM clients (V1 and V2), UberFTP (GridFTP client).
- Developer tools: PyGlobus, PyGridWare.
- Testing: NMI Build & Test, VDT tests.
- Support: Apache, Tomcat, MySQL (with MyODBC), non-standard Perl modules, Wget, Squid, Logrotate, configuration scripts.
- And more!
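And a taste of the client tools: a GridFTP copy with globus-url-copy, assuming a valid grid proxy already exists (e.g. from voms-proxy-init on slide 15). The source and destination here are placeholders:

    import subprocess

    # Placeholder endpoints; a valid proxy certificate must already exist.
    src = "gsiftp://se.example.edu/data/run123/events.root"
    dst = "file:///tmp/events.root"

    # "-p 4" asks for four parallel TCP streams, a common tuning knob.
    subprocess.run(["globus-url-copy", "-p", "4", src, dst], check=True)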

24 Due Diligence for Security
- Risk assessment and planning; service auditing and checking; incident response; awareness and training; configuration management; user access authentication and revocation; auditing and analysis.
- End-to-end trust in the quality of code executed on a remote CPU: signatures?
- Identity and authorization: extended X.509 certificates.
  - OSG is a founding member of the US TAGPMA.
  - DOEGrids provides script utilities for bulk requests of host certs, CRL checking, etc.
  - VOMS extended attributes and infrastructure for role-based access control (see the sketch below).
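Role-based access control in practice: a user requests a proxy carrying a specific VOMS group/role (an FQAN), which a site's authorization layer (e.g. GUMS) can map to different local privileges. A sketch; the VO and role names are placeholders:

    import subprocess

    # Request VOMS attributes for a specific role; "myvo" and "production"
    # are placeholders. The argument uses VOMS FQAN syntax.
    subprocess.run(
        ["voms-proxy-init", "-voms", "myvo:/myvo/Role=production"],
        check=True,
    )

    # List the FQANs the proxy actually carries.
    subprocess.run(["voms-proxy-info", "-fqan"], check=True)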

25 Operations Model. Real support organizations often play multiple roles. [Diagram: lines represent communication paths and, in our model, agreements; we have not progressed very far with agreements yet. Gray shading indicates that OSG Operations is composed of effort from all the support centers.]

26 The OSG VO A VO for individual researchers and users. Managed by the OSG itself. Learn how to use the Grid!

27 In Summary… A production grid is the product of a complex interplay of many forces:
- resource providers,
- users,
- software providers,
- hardware trends,
- commercial offerings,
- funding agencies,
- the culture of all parties involved,
- ...

28 Where do you learn more?