Preliminary Project Plan for GridPP3. David Britton, 15/May/06.



D. Britton, 15/May/2006. GridPP3 Boundary Conditions

Timeframe:
- GridPP2+: Sep 07 to Mar 08
- GridPP3: Apr 08 to Mar 11

Budget line: not known exactly; the scale is set by the exploitation review input (both from GridPP and from the experiments).

Exploitation review input from GridPP:
- FY07: £7,343k
- FY08-FY10: £29,302k
- Total: £36,643k

LHC Hardware Requirements

The GridPP exploitation review input took the global hardware requirements and multiplied them by the UK authorship fraction: ALICE 1%, ATLAS 10%, CMS 5%, LHCb 15%.

It is problematic to use "authors" in the denominator when not all authors (globally) have an associated Tier-1: such an algorithm applied globally would not result in sufficient hardware. GridPP has asked the experiments for their requirements, and their input (relative to the global requirements) is: ALICE ~1.3%, ATLAS ~13.7%, CMS ~10.5%, LHCb ~16.8%.

Candidate scalings (from the slide):
- (Global requirements) × (global Tier-1 author fraction)
- (Global requirements) / (Number of Tier-1s)
- ~50% × (Global requirements) / (Number of Tier-1s) ~ UK authorship fraction
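As a rough illustration of how the two sets of fractions change the requested resources, the sketch below applies each set to a hypothetical global requirement. The percentages are from the slide; the global CPU figures and the `uk_share` helper are invented for illustration only.

```python
# Apply per-experiment UK fractions to a global hardware requirement.
# The percentages below are from the slide; the global CPU figures are
# HYPOTHETICAL placeholders in arbitrary units.

REVIEW_FRACTIONS = {"ALICE": 0.010, "ATLAS": 0.100, "CMS": 0.050, "LHCb": 0.150}
EXPT_FRACTIONS   = {"ALICE": 0.013, "ATLAS": 0.137, "CMS": 0.105, "LHCb": 0.168}

def uk_share(global_req, fractions):
    """UK requirement per experiment: global requirement times UK fraction."""
    return {exp: global_req[exp] * frac for exp, frac in fractions.items()}

global_cpu = {"ALICE": 10.0, "ATLAS": 50.0, "CMS": 40.0, "LHCb": 15.0}  # hypothetical

for name, fracs in [("review input", REVIEW_FRACTIONS), ("experiments", EXPT_FRACTIONS)]:
    total = sum(uk_share(global_cpu, fracs).values())
    print(f"{name}: UK total = {total:.2f}")
```

With these placeholder inputs the experiments' own fractions request noticeably more than the authorship-fraction rule, which is the point the slide is making.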

Proposed Hardware

The proposal from the User Board is that the hardware requirements in the GridPP3 proposal are:
- those defined by the LHC experiments;
- plus those defined by BaBar (historically well understood);
- plus a 5% provision for "Other" experiments, at the Tier-2s only.

Proposal

We propose to use the UB input to define the hardware request (and not to include alternative scenarios). We will note that these hardware requirements are not very elastic: strategic decisions on the UK's obligations, roles, and priorities will need to be made if the hardware is to be significantly reduced. (Internally, we should continue to discuss how to respond to lower funding scenarios.)

Hardware Costs

Hardware costs are rather uncertain. We have previously quantified this uncertainty as 10% per year of extrapolation (10% in 2007, 20% in 2008, etc.). This translates to an uncertainty of about £3.8m in the proposal. (The actual numbers here will be updated; these are a few months old.)
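A minimal sketch of the uncertainty rule, assuming it is applied as (spend) × 10% × (years of extrapolation) and summed over years. The yearly spend profile below is a hypothetical placeholder, not the proposal's numbers, and `cost_uncertainty` is an invented helper name.

```python
# Cost-uncertainty rule from the slide: 10% per year of extrapolation
# (10% in 2007, 20% in 2008, ...), applied to each year's hardware spend.
# The spend profile below is a HYPOTHETICAL placeholder (k£).

def cost_uncertainty(spend_by_year, base_year=2006, rate_per_year=0.10):
    """Sum over years of spend x rate x (years extrapolated beyond base_year)."""
    return sum(spend * rate_per_year * (year - base_year)
               for year, spend in spend_by_year.items())

spend = {2007: 2000.0, 2008: 3000.0, 2009: 3500.0, 2010: 3500.0}  # hypothetical k£
print(f"uncertainty: ~£{cost_uncertainty(spend):.0f}k")
```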

Tier-1 Hardware (work in progress: numbers are still evolving!)

Running Costs (work in progress)

Running Costs

Running costs have traditionally been charged indirectly (at the institutes and at CCLRC), normally averaged over larger communities, which tends to work to the advantage of particle physics. We hope this continues as long as possible. The exploitation review input contained ~£1.8m of running costs, split between the Tier-1 and Tier-2s, which is only 50% of the current estimate. Should we avoid explicitly including running costs in the GridPP3 proposal (on the basis that it is not known how they will be charged)? Instead, we could include a footnote stating the assumption that running costs are funded by other mechanisms (SLA, FEC).

Tier-2 Resources

In GridPP2 we paid for staff in return for the provision of hardware, which is not a sustainable model. We need a transition to a sustainable model, one that generates sufficient (but not excessive) hardware and that institutes will buy into. Such a model should acknowledge:
- that we are building a Grid (not a computer centre);
- that historically the Tier-2s have allowed us to lever resources/funding;
- that the Tier-2s are designed to provide different functions and different levels of service from the Tier-1;
- that dual funding opportunities may continue for a while;
- that institutes may gain strategically by continuing to be part of the "World's largest Grid".

Tier-2 Resources

A possible model:
- GridPP funds ~15 FTE at the Tier-2s (the same as at the Tier-1).
- Tier-2 hardware requirements are defined by the UB request.
- GridPP pays the cost of purchasing the hardware that satisfies the following year's requirements, at the current year's price, divided by the nominal hardware lifetime (4 years for disk; 5 years for CPU).

E.g. suppose 2253 TB of disk is required in the following year. In January 2007 this would cost ~£1.0k/TB; with a lifetime of 4 years, the 1-year "value" is 2253/4 = £563k.

Note: this does not necessarily reimburse the full cost of the hardware, because in subsequent years the money GridPP pays depreciates with the falling cost of hardware, whereas the Tier-2s, which actually made a purchase, are locked into a cost determined by the purchase date. However, GridPP pays the cost up to one year before the actual purchase date, and institutes which already own resources can delay the spend further.
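The payment rule above can be sketched as follows. The `yearly_payment` helper name is invented; the disk numbers are the slide's own example.

```python
# Tier-2 payment model from the slide: GridPP pays (next year's requirement)
# x (current-year unit price) / (nominal hardware lifetime).

def yearly_payment(required_units, unit_price_k, lifetime_years):
    """One year's GridPP payment (k£) for a hardware class."""
    return required_units * unit_price_k / lifetime_years

# Disk example from the slide: 2253 TB at ~1.0 k£/TB with a 4-year lifetime.
print(f"1-year disk value: ~£{yearly_payment(2253, 1.0, 4):.0f}k")  # ~£563k
```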

Tier-2 Resources: Sanity Checks

1) Apply the model and compare the cost of hardware at the Tier-1 and the Tier-2s, integrated over the lifetime of the project:

                          Tier-1   Tier-2
   CPU (k£/KSI2K-year):
   Disk (k£/TB-year):
   Tape (k£/TB-year):     0.05

2) Total cost of ownership: compare the total cost of the Tier-2 facilities with the cost of placing the same hardware at the Tier-1 (we estimate that doubling the Tier-1 hardware would require a 35% increase in staff). Including staff and hardware, the cost of the Tier-2 facilities is 80% of the cost of an enlarged Tier-1.

Question: would institutes be prepared to participate at this level?
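Sanity check 2 can be illustrated with a small sketch. All monetary figures here are hypothetical placeholders (chosen so that the ratio comes out at the slide's 80%); only the 35% staff factor and the 80% conclusion come from the slide, and the function name is invented.

```python
# Total-cost-of-ownership comparison: Tier-2 facilities vs hosting the same
# hardware at an enlarged Tier-1 (doubling Tier-1 hardware is estimated to
# need 35% more staff). ALL monetary figures below are HYPOTHETICAL (k£).

def enlarged_tier1_extra_cost(t1_staff_cost, extra_hw_cost, staff_factor=0.35):
    """Extra cost of absorbing the Tier-2 hardware into the Tier-1."""
    return staff_factor * t1_staff_cost + extra_hw_cost

t1_staff_cost = 5000.0   # hypothetical Tier-1 staff cost
extra_hw_cost = 8000.0   # hypothetical cost of the Tier-2 hardware
tier2_total   = 7800.0   # hypothetical total cost of the Tier-2 facilities

ratio = tier2_total / enlarged_tier1_extra_cost(t1_staff_cost, extra_hw_cost)
print(f"Tier-2 facilities cost {ratio:.0%} of the enlarged Tier-1")
```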

Staff Effort

Currently using the GridPP input to the exploitation review as the baseline (with the addition of Dissemination and Industrial Liaison). (Chart: staff effort under GridPP2+ and GridPP3.)

Staff Costs

Tier-1 Staff (exploitation review input)

The staff required will be 15 FTE to run and operate the CPU, disk, tape, networking and core services, as well as to provide Tier-1 operations, deployment and experiment support, managed in an effective manner. Support will be during daytime working hours (08:30-17:00, Monday to Friday), with on-call cover outside this period. CCLRC may provide additional effort to underpin the service. To provide staff present on-site for 24x7 (weekend) cover, a further 5 FTE (2 FTE) would be needed. (9 FTE in GridPP1; 13.5 FTE in GridPP2.)

Services: Management, Disk, Tape, File System, CPU, Deployment, Experiment Support, Middleware Support, Core Services, Operations, Security, Network, Other.

Tier-2 Staff (exploitation review input)

Currently GridPP provides 9.0 FTE of effort for hardware support at the Tier-2s (London 2.5, NorthGrid 4.5, ScotGrid 1.0, SouthGrid 1.0). This is acknowledged to be too low, and operating the Tier-2s is a significant drain on rolling-grant-funded system managers and physicist programmers. Large facilities require at least one FTE per site, whereas smaller sites need at least half an FTE. On the basis of currently available hardware, an allocation for HEP computing would be 4 FTE to London (5 sites), 6 FTE to NorthGrid (4 sites), 2 FTE to ScotGrid (3 sites) and 3 FTE to SouthGrid (5 sites), making a total of 15 FTE.

Grid Support Staff (exploitation review input)

On the middleware support side, at least one FTE is required for each of the following areas: security support; storage systems; workload management; networking; underlying file transfer and data management systems; and information systems and monitoring (where an additional FTE of effort is anticipated, ensuring that our main contribution to EGEE is supported in the longer term). It would be inappropriate to reduce to this level of effort abruptly at precisely the time that the LHC is expected to start producing data; rather, it is advised to phase the reduction over FY08 and FY09, thereby sustaining a necessary and appropriate level of support at this critical time.

…there will remain core Grid application interfaces supporting the experiment applications that will continue into the LHC running period. These stand to some extent independent of the experiment-specific programmes, although they serve them. A total of 7 FTEs is required for these common application interface support tasks. It should be noted that the proposed effort in this combined area is a significant reduction from the current effort of more than 30 FTEs in these Grid developments.

Grid Operations (exploitation review input)

In order to operate and monitor the deployment of such a Grid, a further 8 FTEs of effort is needed: the Production Manager, 4 Tier-2 Regional Coordinators and 3 members of the Grid Operations Centre.

Future Management (from Steve's slide at the last CB)

GridPP2:
- Project Leader: Tony Doyle (0.7)
- Project Manager: Dave Britton (0.9)
- CB/Tier-2 Chair: Steve Lloyd (0.5)
- Deployment Board Chair: Dave Kelsey (0.3)
- Applications Coordinator: Roger Jones (0.5)
- Middleware Coordinator: Robin Middleton (0.5)

Beyond GridPP:
- Project Leader (~0.7)
- Project Manager (~0.9)
- "Deployment Supervisor" (~0.4)
- "Technical Supervisor" (~0.4)

Project Leader appointed by a CB search committee; the others by the Project Leader? Total: 2.5 → 1.5 over time? What about the CB itself? What about dissemination?

Dissemination

"4. The bid(s) should: a) show how developments build upon PPARC's existing investment in e-Science and IT investment, leverage investment by the e-Science Core Programme, and demonstrate close collaboration with other science and industry and with key international partners such as CERN. It is expected that a plan for collaboration with industry will be presented, or justification if such a plan is not appropriate."

For the exploitation review it was assumed that dissemination would be absorbed by PPARC. That seems unlikely at this point! Presently we have effectively 1.5 FTE working on dissemination alone (Sarah Pearce plus an events officer). We want to maintain a significant dissemination activity (an insurance policy), so adding in industrial liaison suggests maintaining the level at 1.5 FTE.

Full Proposal (work in progress)

This compares with the exploitation review input of £36,643k, which included the £1,800k of running costs excluded above.

Proposed Balance (work in progress)