User Board
Glenn Patrick, GridPP20, 11 March 2008
Tier 1: Non-Grid Access
Classical PBS/qsub access to the Tier 1 was restricted on 21 Feb. Access to the UI was also restricted. A list of exclusions was agreed through the UB (includes some AFS accounts):

Experiment   Identifiers
ATLAS        4
BaBar        9
CALICE       2
CMS          7
Dteam        1
LHCb         4
MINOS        24 (reduces to 8 three months after a working Castor)
TOTAL        51
Tier 1 Squeeze – 2008/Q1 Headroom

CPU:
- March CPU capacity = 1439 KSI2K
- March requests = 2640 KSI2K
- CPU over-allocated for 2008/Q1; pain spread by the fairshare system.

Disk:
- March disk capacity = 922 TB
- March disk requests = 920.8 TB
- March headroom = 1.2 TB
- ATLAS/CMS/LHCb got 70% of their requests. BaBar reduced from 100 TB to ~41 TB. All other experiments frozen until March.
- "Special measures" taken through the quarter. Living dangerously!
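The squeeze figures above can be sanity-checked with a few lines of arithmetic (a sketch only; the capacity and request numbers are the March figures quoted on this slide):

```python
# Recompute the 2008/Q1 Tier 1 squeeze figures quoted above.
cpu_capacity_ksi2k = 1439   # March CPU capacity
cpu_requests_ksi2k = 2640   # March CPU requests
overallocation = cpu_requests_ksi2k / cpu_capacity_ksi2k
print(f"CPU over-allocation factor: {overallocation:.2f}x")  # requests exceed capacity

disk_capacity_tb = 922.0    # March disk capacity
disk_requests_tb = 920.8    # March disk requests
headroom_tb = disk_capacity_tb - disk_requests_tb
print(f"Disk headroom: {headroom_tb:.1f} TB")  # the 1.2 TB "living dangerously" margin
```

The CPU requests come out at roughly 1.8 times the installed capacity, which is why the fairshare system had to spread the pain.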
Tier 1 Disk Squeeze
[Chart: ATLAS 291 TB, CMS 242 TB, LHCb 116 TB, BaBar 49 TB, ALICE 5.9 TB]
Tier 1 Disk Use (7 March)

Experiment   Disk Used
ALICE        0%
ATLAS        72%
CMS          34%
LHCb         57%
BaBar        71%
MINOS        65%
Tier 1 CPU Fairshares
[Charts: Allocated shares (CMS, ATLAS, BaBar, LHCb) vs. reality (ATLAS, BaBar, CMS, LHCb, ALICE)]
LHC Approaches!
Friday 7 March 2008.
CMS Plenary, 25 Feb: machine cold by 1 June? Protons could be injected by mid-June.
Not Only LHC Experiments
HEALTH WARNING!
Tier 1 – 2008 Ramp-Up
[Chart: current allocated/total as a fraction of the Dec 2008 request: CPU 67%, Disk 39%, Tape 54%]
The latest procurement should satisfy all experiments' 2008 requests if they don't change. Need to worry about 2009 now.
dCache – Castor2 Migration

Castor data timeline:
- 20 June: at the UB meeting it was agreed that 6 months' notice be given for dCache termination.
- 26 November: proposal to terminate dCache by the end of May. Going to be tight…

Status by experiment:
- LHCb: all disk data migrated and 60% of tape data.
- ATLAS: disk and tape migration ongoing (?). On 20 Feb, 12 TB was trimmed from the CMS allocation to help the ATLAS migration (270 TB allocation + 20 TB).
- ALICE: updated request received 25 Jan; allocated one server on 6 February. Needs the xrootd plug-in.
- MINOS: agreed a 3-month period from the date of a working Castor instance.
User Support Posts – GridPP3

Janusz Martyniak (Imperial), 50% FTE: ex-portal post. Technical assistance with Grid-related software, interfacing experiments to middleware, development of tools, etc. First priority is helping smaller non-LHC experiments get established on the Grid. LHC projects, generic PP tools (e.g. Ganga) and KE with non-HEP VOs are also possible work areas. Experiments bid through the UB chair for support. MICE (Ganga and LFC) is already approved, plus some SuperNEMO work (LFC).

Stephen Burke (RAL), 50% FTE: documentation post. Focused on immediate and short-term issues; for example, helping answer technical enquiries (outside the ticket system), troubleshooting user/VO problems, locating suitable documentation, etc.
The End (and The Start)
GridPP2 → GridPP3