Defining the Technical Roadmap for the NWICG – OSG
Ruth Pordes, Fermilab

2 Open Science Grid in a nutshell
A set of collaborating computing farms - Compute Elements: commodity Linux farms with mass storage (MSS) and disk, ranging from ~20 CPUs in department computers to a 10,000-CPU supercomputer. [Diagram: any NWICG university's local batch system connects through an OSG CE gateway to OSG and the wide-area network.]
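To make the CE gateway concrete, here is a minimal sketch, in Python with a hypothetical gatekeeper host and file names, of submitting a job to an OSG Compute Element through Condor-G; the gatekeeper hands the job to the site's local batch system.

    # Minimal sketch: submit one job to an OSG Compute Element via
    # Condor-G. The gatekeeper host "ce.example.edu" is hypothetical.
    import subprocess
    import textwrap

    submit = textwrap.dedent("""\
        universe      = grid
        grid_resource = gt2 ce.example.edu/jobmanager-condor
        executable    = analyze.sh
        output        = job.out
        error         = job.err
        log           = job.log
        queue
    """)

    # Write the submit description, then hand it to condor_submit,
    # which contacts the CE's Globus gatekeeper on our behalf.
    with open("job.sub", "w") as f:
        f.write(submit)
    subprocess.run(["condor_submit", "job.sub"], check=True)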

3 A set of collaborating storage sites - Storage Elements: mass storage systems and disk caches, ranging from a 20-GByte disk cache to 4-Petabyte robotic tape systems. [Diagram: any NWICG shared storage connects through an OSG SE gateway to OSG and the wide-area network.]
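Access to a Storage Element typically goes through its SRM interface. A minimal sketch, again in Python with a hypothetical host and paths, using the srmcp client:

    # Minimal sketch: copy a file out of an OSG Storage Element with
    # the srmcp SRM client. Host and paths are hypothetical.
    import subprocess

    src = "srm://se.example.edu:8443/pnfs/example.edu/data/events.root"
    dst = "file:////tmp/events.root"   # srmcp's local-file URL form

    # Requires a valid grid proxy; the SRM service negotiates the
    # transfer and the data then moves over GridFTP.
    subprocess.run(["srmcp", src, dst], check=True)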

4 Supported Software Stacks
Integrated, supported reference software services:
— Most services run on Linux PC "gateways", with minimal impact on compute nodes.
— Loose coupling between services allows heterogeneity in releases and functionality.
— Independent collections for client, server, and administrator.

5 Grid-of-Grids
Inter-operating and co-operating grids: campus, regional, community, national, international. An open consortium of Virtual Organizations doing research & education.

6 What is OSG? A consortium of people working together: sites interfacing farms and storage to a grid; researchers using these resources by adapting their applications to run on the grid; software developers providing middleware; and a project that provides the operations, support, training, and help to make it all effective.

7 NWICG in OSG terminology: 4 Compute Elements (CEs) and/or Storage Elements (SEs); a regional grid -- a shared common cyberinfrastructure; a Virtual Organization with which to partner.

8 Who is OSG? Large global physics collaborations: US ATLAS, US CMS, LIGO, CDF, D0, STAR. Education projects, e.g. MARIACHI and I2U2. Grid technology groups: Condor, Globus, SRM, NMI. Many DOE labs and DOE/NSF-sponsored university IT facilities. Partnerships, e.g. TeraGrid, European grids, and regional/campus grids such as Texas and Wisconsin.

9 OSG Consortium. [Diagram: the Consortium comprises the Project, Contributors, and Partners.]

10 The OSG Map, Aug 2006. Some OSG sites are also on TeraGrid or EGEE. 10 SEs & 50 CEs (about 30 very active).

11 Use - Daily Monitoring, 04/23/2006. [Monitoring snapshot: genome analysis (GADU) and "bridged" GLOW jobs; roughly 2000 running jobs and 500 waiting jobs.]

12 Network Connectivity. OSG uses commodity networks - ESnet and campus LANs. Sites range from well-provisioned, e.g. connected to StarLight, to low-bandwidth connections, e.g. Taiwan. Connectivity ranges from full duplex, to outgoing only, to fully behind firewalls.

13 Bridging Campus Grid Jobs - GLOW. Jobs are dispatched from the local security, job, and storage infrastructure and "uploaded" to the wide-area infrastructure.

14 FermiGrid? Interfacing all Fermilab resources to a common campus infrastructure; a gateway to Open Science Grid. A unified and reliable common interface and services - security, job scheduling, user management, and storage - through one FermiGrid gateway. Sharing resources: policies and agreements enable fast response to changes in resource needs by Fermilab users. More information is available at

15 Access to FermiGrid. [Diagram: OSG general users, OSG "agreed" users, and Fermilab users reach the CDF farm, D0 farm, CMS farms, and common farms through the FermiGrid gateway.]

16 OSG History and Goals. Grown from the grass-roots collaboration of GriPhyN, iVDGL, and PPDG participants, with funding starting ~9/2006 from DOE SciDAC-II and NSF MPS and OCI. Core goal: deliver at US LHC & LIGO scale within the next 2 years:
— Need to routinely distribute data at 1-5 Gbps across sites (of order 1 GigaByte/sec overall).
— Need to routinely exceed 10,000 running jobs per client.
— Need to reach a 99% success rate for submission of 10,000 jobs per day under heavy load.

17 OSG Core Competencies. Integration: software and systems. Operations: common support and procedures. Inter-operation: across administrative and technical boundaries. Each is open to technical work with NWICG.

18 Integration Testing of the System - a core part of OSG. The multi-site Integration Grid tests new OSG releases and configurations as a community activity. Software readiness and validation occur before deployment on the Integration Grid. Integration Grid sites form a parallel grid to the production system.

20 Operations Model. Real support organizations often play multiple roles. [Diagram: lines represent communication paths and, in our model, agreements; we have not progressed very far with agreements yet. Gray shading indicates that OSG Operations is composed of effort from all the support centers: a core support group plus community contributions.]

21 Documentation.

23 Training - e.g. the Grid Summer Workshop, now in year 4. Hands-on, with technical trainers and a nice setting (Padre Island). Students got their own applications to run on OSG!

24 What is the VDT? A collection of software:
- Grid software (Condor, Globus, and lots more)
- Virtual Data System (the origin of the name "VDT")
- Utilities
An easy installation:
- Goal: push a button, everything just works
- Two methods: Pacman, which installs and configures it all (see the sketch below); RPM, which installs some of the software, with no configuration
A support infrastructure.
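As an illustration of the Pacman route, a minimal sketch of driving an install from Python; the cache URL and package name are illustrative, not a specific VDT release:

    # Minimal sketch: install a VDT package with Pacman. The cache URL
    # and package name below are hypothetical.
    import subprocess

    cache = "http://vdt.cs.wisc.edu/vdt_cache"   # hypothetical cache URL
    package = "Condor"                           # hypothetical package name

    # Pacman fetches the package description from the cache, then
    # downloads, installs, and configures the software it names.
    # Equivalent to typing:  pacman -get <cache>:<package>
    subprocess.run(["pacman", "-get", f"{cache}:{package}"], check=True)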

25 What software is in the VDT?
Security
- VOMS (VO membership)
- GUMS (local authorization)
- mkgridmap (local authorization)
- MyProxy (proxy management)
- GSI SSH
- CA CRL updater
Monitoring
- MonALISA
- gLite CEMon
Accounting
- OSG Gratia
Job Management
- Condor (including Condor-G & Condor-C)
- Globus GRAM
Data Management
- GridFTP (data transfer)
- RLS (replica location)
- DRM (storage management)
- Globus RFT
Information Services
- Globus MDS
- GLUE schema & providers

26 What software is in the VDT? (continued)
Client Tools
- Virtual Data System
- SRM clients (V1 and V2)
- UberFTP (GridFTP client; see the transfer sketch below)
Developer Tools
- PyGlobus
- PyGridWare
Testing
- NMI Build & Test
- VDT Tests
Support
- Apache
- Tomcat
- MySQL (with MyODBC)
- Non-standard Perl modules
- Wget
- Squid
- Logrotate
- Configuration scripts
And more!
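As one example of these client tools in use, a minimal sketch, with a hypothetical host and paths, of a GridFTP transfer driven from Python via the globus-url-copy client:

    # Minimal sketch: fetch a file over GridFTP with globus-url-copy.
    # The Storage Element host and paths are hypothetical.
    import subprocess

    src = "gsiftp://se.example.edu/data/run1/events.root"
    dst = "file:///scratch/events.root"

    # Requires a valid grid proxy (e.g. from voms-proxy-init or
    # grid-proxy-init) before the transfer will authenticate.
    subprocess.run(["globus-url-copy", src, dst], check=True)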

27 Due Diligence on Security. Risk assessment and planning; service auditing and checking; incident response; awareness and training; configuration management; user access, authentication, and revocation; auditing and analysis. End-to-end trust in the quality of code executed on a remote CPU - signatures? Identity and authorization: extended X.509 certificates.
- OSG is a founding member of the US TAGPMA.
- DOEGrids provides script utilities for bulk requests of host certs, CRL checking, etc.
- VOMS extended attributes and infrastructure for Role-Based Access Control (see the sketch below).
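To show what VOMS-based Role-Based Access Control looks like in practice, a minimal sketch, with a hypothetical VO name and role, of obtaining a proxy that carries a signed VOMS attribute:

    # Minimal sketch: obtain an extended X.509 proxy carrying a VOMS
    # role attribute. The VO name and role below are hypothetical.
    import subprocess

    vo = "myvo"
    fqan = f"{vo}:/{vo}/Role=production"   # request the 'production' role

    # voms-proxy-init contacts the VO's VOMS server and embeds the
    # signed attribute in the proxy; sites map the role to local rights.
    subprocess.run(["voms-proxy-init", "-voms", fqan], check=True)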

28 How do People and Organizations Participate?

29 A VO registers with the Operations Center; a user registers with their VO. Sites register with the Operations Center. VOs and sites provide a Support Center contact and join operations groups. We're all fun people!

30 The OSG VO. A VO for individual researchers and users, managed by the OSG itself. Where one can learn how to use the Grid!

31 Where do you learn more?