Open Science Grid in Adolescence: 2012-2016. Ruth Pordes, Fermilab.

Presentation transcript:

Open Science Grid in Adolescence: 2012-2016
Ruth Pordes, Fermilab

The OSG Eco-System
The Open Science Grid (OSG) advances science through open distributed computing. The OSG is a multi-disciplinary partnership to federate local, regional, community and national cyberinfrastructures to meet the needs of research and academic communities at all scales.
✦ Consortium
  ★ Sites/Resource Providers, Science Communities, Software Providers/Projects, other eco-systems
✦ Project
  ★ Staff, deliverables, operations
✦ Satellites
  ★ Research, development, extensions, loosely coupled

Maturing
✦ OSG is being supported for another 5 years.
  ★ Strong support from DOE and NSF.
✦ Endorsement not only to continue the focus on physics but also to continue broad engagement of the other sciences making use of OSG.
  ★ Sustain services for the LHC and make significant contributions to the LHC upgrade.
  ★ Extend, simplify, and adapt services and technologies for other sciences.
  ★ Continue community partnerships and facilitation of peer infrastructures in Europe, South America, Asia, and Africa.

Accepted in the National Landscape
✦ Regarded as an effective partner by the funding agencies
  ★ A Service Provider in NSF XD/XSEDE.
  ★ Strong NSF OCI funding for growth into the campuses.
  ★ Extend support for collaborative extreme-scale science in DOE.
  ★ Accept responsibility for "orphaned" software and services.

Growing in Size
~100 sites in the US and abroad

Growing in Use
~50M hours/month (~70K hours/hour)
(Explanation of the chart colors to be added.)

Usage by non-HEP*: e.g. Protein Structure, a Key Application
* Frank Wuerthwein, OSG AHM, Mar 2012

Diverse Communities*
U of Wisconsin Madison, SDSC, RENCI, Harvard Medical School, U of Washington, U of Chicago
* Frank Wuerthwein, OSG AHM, Mar 2012

OSG Leadership Strengthened
✦ The Leadership Team remains in place and extends:
  ★ Miron, Michael Ernst, Frank.
  ★ Lothar as Executive Director.
  ★ Ruth as Council Chair.
  ★ Alain, Brian, Chander, Dan, Mine with significant responsibilities and authorities as Area/Team Leads.
✦ Added new/old faces:
  ★ Von Welch as lead of the OSG PKI / Identity Management project (more later).
  ★ Computer scientists at ISI (Ewa, Mats) and Notre Dame (Doug Thain) increasingly in the conversations.

An Anchor for Distributed High Throughput Computing (DHTC)
[Diagram: OSG at the center, connecting Science Needs, Computer Science Research, Technology and Software, and Satellite Projects through Blueprints, Outreach, Technology Investigation, Software Integration, Deploy/Commission/Use, Production, and Operations.]

Job Overlay Infrastructure in Place - a Key Usability Improvement
✦ Approaching 100% job overlays for all VOs
  ★ >50% of OSG cycles through glideinWMS
  ★ ~25% through the ATLAS PanDA WMS
✦ Tremendous boost to usability, usage and effectiveness
✦ Easier for VOs to access cycles; a global queue to implement priorities
✦ Site failures are not seen by the end user
✦ OSG provides distributed operations
  ★ "Glidein Factories" at IU, CERN and UCSD
  ★ Perform routine maintenance on the Factory and monitor glideins to ensure they are running on sites without error
✦ Provides a future path to flexibly provision compute resources
✦ Supports access to clouds, including Amazon EC2
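To make the overlay model concrete: from a VO user's point of view, a job goes into a single local HTCondor queue, and glideinWMS provisions the worker slots across OSG sites behind the scenes. A minimal sketch using the HTCondor Python bindings; the payload script, input file and resource numbers are hypothetical, and older bindings submit via a transaction rather than Schedd.submit:

```python
# Sketch: a user job submitted to a glideinWMS-fed HTCondor pool.
# Assumes the 'htcondor' Python bindings and a schedd operated by the VO;
# the executable and input file are placeholders.
import htcondor

job = htcondor.Submit({
    "universe": "vanilla",
    "executable": "analyze.sh",          # hypothetical user payload
    "arguments": "run_001.dat",
    "output": "job.out",
    "error": "job.err",
    "log": "job.log",
    "request_cpus": "1",
    "request_memory": "2GB",
    "should_transfer_files": "YES",
    "when_to_transfer_output": "ON_EXIT",
})

schedd = htcondor.Schedd()
result = schedd.submit(job, count=1)     # recent bindings; older ones use a transaction
print("Submitted cluster", result.cluster())
```

Which site the glidein slot runs on, and any retries after a site failure, are handled by the frontend and factory rather than by the user.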

Sustain Services & Expand Science Communities
✦ LHC
  ★ Continued focus on the LHC: support for ATLAS, CMS, and ALICE USA distributed computing in the US.
  ★ Active/proactive contributions on behalf of US LHC to WLCG, to the TEG reports and their implementation follow-ons.
  ★ Prepare for the LHC shutdown and upgrade.
✦ Embrace future physics, nuclear physics, and astrophysics experiments: Belle II, DES, EIC, LSST, SuperB, … (will explain these).

Transition of Some Support from External Projects
✦ BeStMan support now under the OSG Software Area.
✦ ESnet DOEGrids PKI infrastructure / Certificate Authority -> OSG
  ★ Receiving additional support from ESnet to implement.
  ★ Need to have no disruption in the production service.
  ★ Deploy OSG PKI infrastructure for user and host/service certificates, based on a DigiCert commercial backend, by Oct.
  ★ US ATLAS and US CMS contributing to the first 2-year license up to a defined threshold. It is not yet known if this is sufficient.
  ★ Feb: DOEGrids ceases to issue new certificates.
  ★ Feb: DOEGrids ceases to publish CRL lists.
✦ Understanding of support needed for the Globus Toolkit
  ★ GRAM, GridFTP
✦ The funding agencies see benefit in OSG anchoring these needs.
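For sites following the CA transition, one practical check is to print the issuer and expiry of a host or service certificate, since certificates from DOEGrids show a different issuing CA than those from the DigiCert-backed OSG PKI. A small sketch, assuming the Python cryptography package; the certificate path is the conventional grid location and is illustrative only:

```python
# Sketch: print the issuer and expiry of a host/service certificate.
# Assumes the 'cryptography' package; the certificate path is illustrative.
from cryptography import x509
from cryptography.hazmat.backends import default_backend

with open("/etc/grid-security/hostcert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read(), default_backend())

print("Issuer: ", cert.issuer.rfc4514_string())   # DOEGrids vs. DigiCert-backed OSG CA
print("Subject:", cert.subject.rfc4514_string())
print("Expires:", cert.not_valid_after)
```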

Extending Support in the Campuses
✦ Intra- and inter-campus sharing of resources.
✦ US LHC Tier-3s as "beachheads" for their campuses.
✦ Support the revolution in biology: a sequencer on every desktop.
✦ Integration of popular application frameworks, e.g. Galaxy.

Interface into New Resource Types
✦ High-core-count nodes
  ★ Condor moving to support: resource reservation, access to partial nodes, automated node draining.
  ★ Applications being adapted.
  ★ Operations/administrative services being extended.
✦ Support for applications using GPUs
✦ Science clouds
✦ Commercial clouds: Amazon EC2
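At the job level, most of these new resource types show up as extra attributes in the submit description. A hedged sketch in the same HTCondor Python style as above; request_gpus is standard submit syntax, while the whole-node attribute name varies by site and is shown purely as an illustration:

```python
# Sketch: requesting GPU and whole-node resources via submit attributes.
# The payload scripts are placeholders; '+RequiresWholeNode' is an
# illustrative, site-dependent custom ClassAd, not a universal OSG knob.
import htcondor

gpu_job = htcondor.Submit({
    "executable": "gpu_app.sh",        # placeholder payload
    "request_cpus": "8",
    "request_memory": "16GB",
    "request_gpus": "1",               # matched where sites advertise GPU slots
    "output": "gpu.out", "error": "gpu.err", "log": "gpu.log",
})

whole_node_job = htcondor.Submit({
    "executable": "multicore_app.sh",  # placeholder payload
    "+RequiresWholeNode": "True",      # illustrative, site-dependent attribute
    "output": "node.out", "error": "node.err", "log": "node.log",
})
```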

Data Services for Non-LHC Experiments
✦ Large communities typically have their own implementations and deployments of data services.
✦ XRootD deployment (Any Data, Anytime, Anywhere project); its scope includes generalizable data access / remote I/O services for use by other OSG communities.
✦ Globus Online being evaluated by several VOs.
✦ iRODS being readied for deployment to manage shared storage/data spaces.
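The remote-I/O model that the XRootD work generalizes lets an application read data over the WAN through a root:// URL instead of staging whole files locally. A heavily hedged sketch using the XRootD Python bindings; the server name and file path are placeholders:

```python
# Sketch: remote read of a small byte range over XRootD (root:// protocol).
# Assumes the XRootD Python bindings; server and path are placeholders.
from XRootD import client
from XRootD.client.flags import OpenFlags

f = client.File()
status, _ = f.open("root://xrootd.example.org//store/user/demo/events.dat", OpenFlags.READ)
if status.ok:
    status, data = f.read(offset=0, size=1024)   # read the first 1 kB remotely
    print("Read", len(data), "bytes")
f.close()
```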

Part of the eXtreme Digital (XD) Generation
✦ As an XSEDE Service Provider, support users given allocations through the (NSF-supported) XSEDE request and allocation process.
✦ First users in summer 2012 through a dedicated submit interface.
✦ By 2015, expect this to be in full production and to support multi-stage workflows across OSG and other XSEDE Service Providers.

Looking Towards and Beyond 2015: Computer Science Research
✦ OSG's existing capabilities are effective but basic and primitive.
  ★ Improvements will rely on external research, development and contributions.
✦ Integrate static resources with dynamically allocated resources (like clouds).
✦ New globally capable, usable, and integrated frameworks for collaborative environments: data, security, workflows, tools for transparency, diverse resources.
✦ See CSresearchNeeds.pdf

Summary
Offers to participate in this unique adventure are welcome!