1 Open Science Grid Progress & Vision Keith Chadwick, Fermilab

2 Outline
Cover status/happenings on OSG:
- Current deployments & statistics
- OSG Funding
- Project Management
- Community Engagement
- Middleware Enhancements
- Trust Relationships
- Outreach

3 Current OSG deployment
96 resources across the production & integration infrastructures - an increase of about 10 in the last 6 months. Note the first UK site: STAR in Birmingham.
27 Virtual Organizations, including operations and monitoring groups - an increase of about 5 in the last 6 months.
>15,000 CPUs
~6 PB MSS
~4 PB disk

4 Some Statistics
OSG Production: 70 sites (59 CE, 11 SE)
- > CPUs
- 111 TBytes of rotating disk
OSG Integration: 21 sites (20 CE, 1 SE)
- ~5000 CPUs
- 20 TBytes of rotating disk

5 OSG Funding
OSG will be co-funded by DOE and NSF for 5 years at $6M/year, starting in Sept '06.
Includes major deliverables for US LHC, LIGO and STAR; well used by CDF and D0; potentially major deliverables for other experiments in the future.
Commitment to engage communities beyond physics.
Commitment to collaborate with EGEE and TeraGrid.
Reminder:
- Consortium policies are to be open to participation by all researchers.
- Project responsibilities are to operate, protect, extend and support the Distributed Facility for the Consortium.

6 Project Management
CS/technology providers are part of the management teams.
The LHC is well represented in the management structure.
External project technical leads (Globus, CDIGS, SOS, LIGO-PIF) are part of the Executive Board.
Partners (WLCG, TeraGrid, EGEE, etc.) are represented through liaisons on the Council.

7 Effort Distribution in Year 1
Activity                                 FTEs
Facility operations                       5.8
Security and troubleshooting              4.5
Software release and support              6.5
Engagement (Year 1 only)                  2.5
Education, outreach & training            2.5
Extensions in capability and scale        8.5
Management & Staff                        4.0
Total FTEs                               34.3
Will review the distribution each year.
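The per-activity figures above should add up to the stated total. A minimal Python sketch that checks the arithmetic; the numbers come from the table, the dictionary layout is mine:

```python
# Year-1 OSG effort distribution (FTEs), as listed in the table above.
effort_fte = {
    "Facility operations": 5.8,
    "Security and troubleshooting": 4.5,
    "Software release and support": 6.5,
    "Engagement (Year 1 only)": 2.5,
    "Education, outreach & training": 2.5,
    "Extensions in capability and scale": 8.5,
    "Management & Staff": 4.0,
}

total = sum(effort_fte.values())
print(f"Total FTEs: {total:.1f}")   # 34.3, matching the stated total
assert abs(total - 34.3) < 1e-9
```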

8 OSG support for non-physics communities - update
Seeing DOSAR VO jobs at OU submitted through OSG services and accounted in MonALISA.
Alan Blatecky's group at RENCI is porting the "award winning Bioportal" to OSG.
>100 nanotechnology jobs -- that run from days -- are being executed on LIGO, ATLAS and CMS sites.
OSG is discussing partnership with the Northwest Indiana Computing Grid (NWICG). Unfortunately their main focus is computational chemistry, which quickly runs into Gaussian licensing issues. Yes, we also say it is the responsibility of the project/VO... but there are more than 50 sites on OSG.
The P-GRADE portal has been interfaced to a version of the CHARMM molecular dynamics simulation package. Some versions of this also have licensing issues.
Work on Campus Grids enabling Crimson Grid, NWICG, New York State Grid (NYSG), and GPN (Nebraska Education/Training grid) partnerships. (Note: partners do not have to contribute resources; collaboration can equally be in software, procedures, education, training, security, etc.)

9 OSG Middleware
The OSG middleware stack, from top to bottom:
- Applications: user science codes and interfaces.
- VO Middleware: ATLAS Panda, DQ, etc.; CMS cmssw, LFC, etc.; Bio BLAST, CHARMM, etc.; LIGO LDR, catalogs, etc.
- OSG Release Cache: VDT + OSG-specific configuration + utilities.
- Virtual Data Toolkit (VDT): core technologies + software needed by stakeholders, e.g. VOMS, CEMon, VDS, MonALISA, VO-Privilege.
- Core grid technology distributions: Condor, Globus, MyProxy.
- Infrastructure: existing operating systems, batch systems and utilities.
OSG Middleware is deployed on existing farms and storage systems, and interfaces to the existing installations of OS, utilities and batch systems.
VOs have VO-scoped environments in which they deploy applications (and other files), execute code and store data. VOs are responsible for, and have control over, their end-to-end distributed system using the OSG infrastructure.
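To make the "VO-scoped environment" idea above concrete, here is a minimal, hypothetical Python sketch of what a VO job wrapper might do on an OSG worker node. It assumes the conventional OSG_APP and OSG_DATA environment variables advertised by sites; the function names, directory layout and the example binary are illustrative, not part of any OSG release.

```python
import os
import subprocess

def vo_scoped_paths(vo_name: str) -> dict:
    """Resolve the VO-scoped application and data areas on this worker node.

    Assumes the site advertises the conventional OSG_APP / OSG_DATA
    environment variables; falls back to the current directory otherwise.
    """
    app_root = os.environ.get("OSG_APP", os.getcwd())
    data_root = os.environ.get("OSG_DATA", os.getcwd())
    return {
        "app": os.path.join(app_root, vo_name),    # VO-installed software
        "data": os.path.join(data_root, vo_name),  # VO-managed data
    }

def run_vo_job(vo_name: str, executable: str, args: list) -> int:
    """Hypothetical wrapper: run a VO application from its VO-scoped area."""
    paths = vo_scoped_paths(vo_name)
    exe = os.path.join(paths["app"], executable)
    env = dict(os.environ, VO_DATA_DIR=paths["data"])
    return subprocess.call([exe, *args], env=env)

if __name__ == "__main__":
    # Print where a (hypothetical) LIGO job would look for its software and data.
    print(vo_scoped_paths("ligo"))
```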

10 OSG Software Releases
OSG released in May. Security patch for Globus released in September.
Still supporting some sites at OSG
ITB (VDT 1.4.0) released earlier this week.
Future releases:
- OSG release schedule and function priorities driven by the needs of US LHC.
- Balanced by the need to prepare for medium- to longer-term sustainability and evolution.
- Goal of 2 major releases a year, with easier minor updates in between.

11 Upcoming VDT Work
Provide smooth updates to a running production service:
- In-place vs. on-the-side updates.
- Preserve the old configuration while making big changes.
- Make updates an hour's work, not a day's work.
Support for more platforms?
- More Linux variants and the needs of EGEE.
- AIX, needed by NERSC.
- Mac OS X, being tested by STAR, TIGRE, …
- Solaris?
Add dCache/SRM packaged in RPMs and tested on the existing storage/data management testbed.
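As an illustration of the "on-the-side" update pattern mentioned above, here is a minimal, hypothetical Python sketch: install the new release next to the running one, carry the old configuration over, and only switch a "current" symlink once the new install has been validated. The directory layout, file names and function names are assumptions for illustration, not VDT tooling.

```python
import os
import shutil

def install_on_the_side(install_root: str, new_version: str,
                        config_file: str = "osg-config.ini") -> str:
    """Install a new release next to the running one instead of overwriting it.

    install_root/
        vdt-<old>/                    running production install (untouched)
        vdt-<new>/                    new install, prepared here
        current -> vdt-<old or new>   symlink that services actually use
    """
    current_link = os.path.join(install_root, "current")
    new_dir = os.path.join(install_root, f"vdt-{new_version}")
    os.makedirs(new_dir, exist_ok=True)

    # Preserve the old configuration while making big changes.
    old_config = os.path.join(os.path.realpath(current_link), config_file)
    if os.path.exists(old_config):
        shutil.copy2(old_config, os.path.join(new_dir, config_file))

    return new_dir  # validate the new install here before switching

def switch_current(install_root: str, new_version: str) -> None:
    """Repoint 'current' at the new version; the old install stays for rollback."""
    current_link = os.path.join(install_root, "current")
    tmp_link = current_link + ".tmp"
    os.symlink(f"vdt-{new_version}", tmp_link)
    os.replace(tmp_link, current_link)   # atomic rename on POSIX filesystems
```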

12
OSG Task                                                   Priority
Gratia                                                            1
Improve ease of upgrade of one VDT version to another             1
BDII                                                              2
About 50 small bug fixes and enhancements                         2
New GIPs                                                          2
gLite client tools for interoperability purposes                  2
New VDS                                                           2
dCache/SRM as "RPM"                                               3
tclGlobus                                                         3
Update platforms we build on                                      3
New Globus - WSGRAM and GT4                                       3
CEMon                                                             4
SQUID                                                             4
Globus Virtual Workspaces, for Edge Services Framework            5
Job managers/GIPs for Sun Grid Engine                             5
Switch from MySQL to MySQL-Max                                    5
Upgrade MonALISA to the latest version                            5
MDS 4                                                             5
VOMRS                                                             5
glexec                                                            6
Grid Exerciser                                                    6
Production release of the VDT (1.4.0)                             6
Expect to rely on VDT Release early
Some components being tested for earlier installation:
- Gratia accounting
- CEMon for the Resource Selection Service

13 Risk-based view of the world
Organizations implement controls over their activities so as to obtain acceptable residual risk.
Organizations: Sites, VOs and Grids.
- Each has a security process lifecycle.
- Satisfaction jointly and severally.
Each organization is captain of its own ship.
- However, constrained to interoperate.
Standards (e.g. User AUP, Service AUP, VO AUP) aid interoperation.

14 Site-VO Interoperation
[Diagram: two parallel risk-management loops, one for the Site and one for the VO. Each loop: assess perceived risk, implement controls, operate, then ask whether the actual risk is acceptable; if not (n), implement further controls; if yes (y), continue operating.]
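A minimal sketch of the control loop the diagram describes, for either a Site or a VO: keep implementing controls until the residual risk is judged acceptable, then operate. The risk scoring, the numbers and the control names are invented for illustration; they do not come from OSG.

```python
def manage_risk(perceived_risk: float, acceptable: float, controls: list) -> float:
    """Iterate the risk loop from the diagram: implement controls until the
    residual (actual) risk is acceptable, then operate.

    perceived_risk : starting risk estimate (arbitrary 0-1 scale, illustrative)
    acceptable     : residual risk the organization is willing to live with
    controls       : (name, risk_reduction_factor) pairs, applied in order
    """
    risk = perceived_risk
    for name, reduction in controls:
        if risk <= acceptable:          # actual risk acceptable? -> y: operate
            break
        risk *= (1.0 - reduction)       # n: implement another control
        print(f"implemented {name!r}: residual risk now {risk:.2f}")
    print("operating" if risk <= acceptable else "risk still above target")
    return risk

# Illustrative run only.
manage_risk(
    perceived_risk=0.9,
    acceptable=0.2,
    controls=[("AUP enforcement", 0.4), ("monitoring", 0.5), ("incident response", 0.3)],
)
```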

15 Scaling :-(
Every organization has to reduce its risk to an acceptable residual risk, and the security has to interoperate.
- Identified trust: something specific is relied on and is not checked each time.
- Applies to the Managerial, Operational and Technical Controls for trust items.
Scaling is a problem; work needs to be done:
- Common standards, i.e. the OSG AUP.
- Fewer entities: aggregate small organizations into larger organizations.

16 Science Grid Communications
Broad set of activities (Katie Yurkewicz):
- News releases, PR, etc.
- Science Grid This Week, running for 1.5 years (>1000 subscribers).

17 OSG Newsletter
Monthly newsletter (Katie Yurkewicz):
- 7 issues so far.
- osgnews

18 OSG Summary
The Open Science Grid infrastructure is part of the WLCG Collaboration.
OSG supports and evolves the VDT core middleware in support of EGEE and the WLCG, as well as other users.
The OSG software stack is deployed to interoperate with EGEE and to provide effective systems for the US LHC experiments, among others.