LHC Computing Review Recommendations
John Harvey, CERN/EP
March 28th 2001, 7th LHCb Software Week
Slide 2: Major conclusions and recommendations (1)
- Scale of resource requirements assessed and accepted
- Multi-tier hierarchical model + Grid endorsed; ~1/3 of resources expected at CERN
- Need affordable research networking: 1.5 Gbps for each experiment by 2006
- Joint software efforts encouraged between experiments and IT
- Data challenges encouraged to test infrastructure and software
- Areas of concern in software (support of simulation & analysis)
- Missing manpower for core software teams and CERN/IT
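As a back-of-the-envelope check on the 1.5 Gbps recommendation, the sketch below converts the link rate into a yearly transfer volume; only the 1.5 Gbps figure comes from the review, the derived numbers are simple arithmetic added here:

```python
# Rough yearly capacity of a 1.5 Gbps research link at 100% utilisation.
link_gbps = 1.5
bytes_per_second = link_gbps * 1e9 / 8         # bits -> bytes
seconds_per_year = 365 * 24 * 3600
tb_per_year = bytes_per_second * seconds_per_year / 1e12
print(round(tb_per_year))  # 5913, i.e. ~5.9 PB/year
```

In practice sustained utilisation is well below 100%, so the usable volume per experiment would be a fraction of this.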
Slide 3: Major conclusions and recommendations (2)
- Total hardware costs 240 MCHF (LHCb ~27 MCHF, i.e. ~11%)
  - Investment spread over '05, '06, '07 in approximately equal portions
  - M&O: rolling replacement every 3 years
- Joint prototype reaching ~50% of one facility for '03/'04
- LHC Software & Computing Steering Committee (SC2) + TAGs to oversee deployment of the entire computing structure
- MoU describing funding of, and responsibility for, hardware and software
- Interim MoU to be signed prior to the MoU (software, prototype)
Slide 4: Multi-Tier Hierarchical Model
[Diagram: CERN as Tier-0; national Tier-1 centres (Region I, Region F, Region UK, Region D); institute servers as Tier-2 (production), e.g. MAP; institute desktops/servers as Tier-3]
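The hierarchy on this slide can be encoded as a simple lookup table; a minimal sketch, using only the example names from the diagram (not a complete site list):

```python
# Illustrative encoding of the multi-tier model; the region/institute
# names are the examples from the slide's diagram, not a full list.
TIERS = {
    "Tier-0": ["CERN"],
    "Tier-1 (national)": ["Region I", "Region F", "Region UK", "Region D"],
    "Tier-2 (production)": ["Institute server (e.g. MAP)"],
    "Tier-3": ["Institute desktop"],
}

def tier_of(site):
    """Return the tier a given site belongs to, or None if unknown."""
    for tier, sites in TIERS.items():
        if site in sites:
            return tier
    return None

print(tier_of("Region UK"))  # Tier-1 (national)
```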
Slide 5: Rates and Installed Capacities
Tabulated per experiment (ALICE, ATLAS, CMS, LHCb) and in total:
- Event size (MB)
- Raw data/year (PB)
- MC data/year (PB)
- Tape at CERN (TB)
- Disk at CERN (TB)
- CPU at CERN (kSI95)
- Tape worldwide (TB)
- Disk worldwide (TB)
- CPU worldwide (kSI95)
- WAN Tier-0/Tier-1 (Mbps)
Slide 6: Manpower (FTEs) for CORE Software, 2000

  Experiment   Have   (Missing)
  ALICE         12       (5)
  ATLAS         23       (8)
  CMS           15      (10)
  LHCb          14       (5)
  Total         64      (28)

- CERN/IT: current staff complement, minimum required to run the centre, predicted complement
- Only computing professionals counted
Slide 7: Hardware Costs of CERN Computing, '05-'07 (units: kCHF)
Cost lines tabulated per experiment (ALICE, ATLAS, CMS, LHCb): CPU, disk, robotic tape, shelf tape, total cost.
- Costs spread over '05 (30%), '06 (30%), '07 (40%)
- LHCb Tier-1s: kSFr (74%)
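The cost figures quoted on slides 3 and 7 can be cross-checked with a few lines of arithmetic; the 27 MCHF, 240 MCHF, and 30/30/40% inputs are from the slides, the derived per-year amounts are computed here:

```python
# Cross-check of the cost-sharing figures on slides 3 and 7.
lhcb_mchf, total_mchf = 27.0, 240.0
print(round(lhcb_mchf / total_mchf * 100))  # 11  (the "~11%" on slide 3)

# LHCb's ~27 MCHF spread over '05 (30%), '06 (30%), '07 (40%):
shares = {"2005": 0.30, "2006": 0.30, "2007": 0.40}
per_year = {year: round(lhcb_mchf * frac, 1) for year, frac in shares.items()}
print(per_year)  # {'2005': 8.1, '2006': 8.1, '2007': 10.8}
```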
Slide 8: Joint Prototype
- Use testbed to test at realistic scales:
  - Fabric management
  - Data challenges with realistic rates
  - Scalability tests of CPU and I/O performance
  - New technologies: copper gigabit, new tapes, IA-64
  - Data Grid functionality
- LHCb data challenges:
  - July '02: functional OO software
  - July '02: DC events in ~2 weeks
  - Dec '02: Computing TDR
  - July '03: DC events in ~2 weeks (DataGrid milestone)
  - Dec '04: Software Production Readiness Review
  - July '05: DC events (full test of software & infrastructure)
Slide 9: CERN Testbed Plans
Capacities planned at 4Q '00, 4Q '01, and 4Q '02:
- WAN links (Mbps)
- Tape I/O rate (GB/s)
- Disk I/O rate (GB/s)
- Tape capacity (PB)
- Disk capacity (TB)
- Number of systems (dual-CPU): 140 (April), 200, 300
Slide 10: Observations and Conclusions
- Waiting for response from CERN management:
  - Guidelines on construction and cost sharing of the prototype
  - Timescale for Computing TDR and MoU
  - Allocation of additional new effort to IT and the experiments
  - Role and composition of SC2, and timescale for its launch
  - Data management project already in preparation
- Communication with funding agencies:
  - Discussions at LHCC and the RRBs; preparation of the Interim MoU
  - Responsibilities for core software (sharing policy)
  - Advance notice of long-term computing plan (cost sharing)
  - Policy of access to centres outside CERN
- Preparation of distributed computing infrastructure:
  - Development of the analysis model (physics use cases)
  - Development of grid services and their integration in GAUDI
  - Preparation of data challenges
Slide 11: Projects
- Event Filter Farm (Computing)
  - Control and management of the farm, installation, scalability
  - Specialisation of GAUDI to the filter-farm environment
- Software Framework (GAUDI)
  - Event model: development and optimisation
  - Detector description: development and optimisation of geometry
  - Scripting component to allow interactive analysis based on Python
  - Grid services
  - Data management (event data, conditions data, bookkeeping)
- Physics frameworks
  - Simulation framework using GEANT4: coordination
  - Analysis framework: coordination
  - High-level trigger framework: coordination
- Tools/utilities
  - Software and data quality monitoring
  - Documentation, workbooks
- ...
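The planned Python scripting component can be illustrated with a toy sketch of script-steered event processing. Everything below (AppMgr, Event, the selection algorithm) is an invented stub for illustration only, not the real GAUDI API:

```python
# Toy illustration of Python-steered event processing, in the spirit of
# the planned GAUDI scripting component. AppMgr and Event are invented
# stand-ins, NOT the real GAUDI interfaces.
class Event:
    def __init__(self, number):
        self.number = number

class AppMgr:
    """Stand-in application manager: owns an event loop, runs algorithms."""
    def __init__(self, n_events):
        self.events = [Event(i) for i in range(n_events)]

    def run(self, algorithm):
        for event in self.events:
            algorithm(event)

selected = []

def keep_even(event):
    """Placeholder 'analysis algorithm': select even-numbered events."""
    if event.number % 2 == 0:
        selected.append(event.number)

app = AppMgr(n_events=6)
app.run(keep_even)
print(selected)  # [0, 2, 4]
```

The point of such a layer is that the selection logic can be edited and re-run interactively from the Python prompt without recompiling the C++ framework.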