OSG Area Report: Production – Operations – Campus Grids. June 19, 2012. Dan Fraser, Rob Quick.


Current Production Focus

Transition to RPMs:
- 31 sites using RPM-based installs
- 70 sites still using Pacman installs
- All the main RPMs have been released (incl. GUMS and Gratia)
- Still working on a few issues / bugs
- ABCD gradually winding down
- Documentation for RPMs is usable: ease3/WebHome
- Doc Fest Mon-Tues this week

Overall Production

Some Production Issues… (effort from the entire team)
- Recent compromise of HyperNews and a webserver security breach
- Transition to BDII v5 – basically abandoned
- Adding lots of new services to the GOC …
- SE-only publishing issue
- Ongoing GRAM5 production issues (memory leaks, scaling, etc.)
- CA updates no longer automated by OSG (RPM)
- GUMS-VOMS compatibility with CERN (progress ongoing)
- …

Campus Grid Status

Campus grids working in production:
- GLOW, DiaGrid (all-Condor campus grids)
- Nebraska
- University of Florida (not used much)

Pre-production / testing:
- RENCI
- Virginia Tech (Bio-tech): r:dhcpseven237.bioinformatics.vt.edu
- Virginia Tech (Belle, DayaBay)
- FIU

Campus Grid Direction

Make the factory easier to install / maintain:
- Current factory = flocking + glide-in + BLAH
- Fewer "assumptions" about the existing environment

Parallel path of using SSH + BLAH:
- Have been working with the Condor & CG teams
- BOSCO (BLAH over SSH Condor Overlay)
- New documentation, with a focus on the end user
- Manage multiple sites from one submit host (SSH)

The hard part is not "technology" but finding / keeping users.
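The SSH + BLAH path means a user's local HTCondor submit host can route jobs to a remote batch cluster with no grid middleware installed on the remote side. As a hedged sketch, a grid-universe submit description for a BOSCO-managed PBS cluster might look like the following; the username, hostname, and executable are illustrative assumptions, not taken from the report:

```
# Hypothetical HTCondor submit file for a BOSCO-managed PBS cluster
# reached over SSH (batch grid type via BLAH). Names are placeholders.
universe      = grid
grid_resource = batch pbs jdoe@hcc-cluster.example.edu

executable    = run_analysis.sh
output        = job.$(Cluster).$(Process).out
error         = job.$(Cluster).$(Process).err
log           = job.$(Cluster).log

queue 1
```

The design point is that only the `grid_resource` line names the remote site, so a single submit host can target multiple clusters by varying that one line.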

Campus Grids Draft Project Plan

pusInfrastructureProjectPlan

Release of BOSCO V0:
- Completed on time in March
- Included documentation and usability testing

New checkbox for Campus Grids:
- Procedure is to file a ticket, and we add them: https://ticket.grid.iu.edu/goc/submit

Still working on Gratia accounting

Campus Grids Draft Project Plan

Release of BOSCO V1 – Q2:
- Slipped to July, primarily due to the Condor File Transfer component
- Includes the Centralized Campus Factory
- Still has a dependency on the same OS between submit host and cluster
- To include "traceroute" functionality

Upgrade existing CG sites – Q3

Working on the next-gen WBS:
- Including an "engage with researchers" plan
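The file-transfer dependency mentioned above presumably refers to staging job input and output between the submit host and the remote cluster without a shared filesystem. In HTCondor submit-file terms that is the built-in file-transfer mechanism; a minimal sketch, with illustrative filenames:

```
# Illustrative snippet: use HTCondor's file transfer instead of assuming
# a shared filesystem between the submit host and the remote cluster.
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
transfer_input_files    = input_data.tar.gz, config.ini
```

Outputs created by the job in its scratch directory are brought back to the submit host automatically when the job exits.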