LHC Computing Review Recommendations. John Harvey, CERN/EP. 28 March 2001, 7th LHCb Software Week.

Slide 2: Major conclusions and recommendations (1)
- Scale of resource requirements assessed and accepted
- Multi-tier hierarchical model + Grid endorsed
  - Expected ~1/3 of resources at CERN
- Need affordable research networking: 1.5 Gbps for each experiment by 2006 (see the estimate below)
- Joint software efforts encouraged between experiments and IT
- Data challenges encouraged to test infrastructure and software
- Areas of concern in software (support of simulation & analysis)
- Missing manpower for Core Software teams and CERN/IT
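For scale, a 1.5 Gbps link corresponds to roughly 13 TB of data per day at 80% sustained utilisation. The short Python sketch below is an illustration added to this transcript, not part of the original slides; the utilisation factor is an assumption.

    # Illustrative estimate of daily transfer volume over a 1.5 Gbps research link.
    # The 80% sustained-utilisation factor is an assumption for illustration only.
    LINK_GBPS = 1.5          # recommended per-experiment bandwidth by 2006
    UTILISATION = 0.8        # assumed average sustained utilisation
    SECONDS_PER_DAY = 86_400

    bits_per_day = LINK_GBPS * 1e9 * UTILISATION * SECONDS_PER_DAY
    terabytes_per_day = bits_per_day / 8 / 1e12

    print(f"~{terabytes_per_day:.0f} TB/day sustained")   # ~13 TB/day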

Slide 3: Major conclusions and recommendations (2)
- Total hardware costs 240 MCHF (LHCb ~27 MCHF, i.e. ~11%)
  - Investment spread over '05, '06, '07 in approximately equal portions
  - M&O: rolling replacement every 3 years
- Joint prototype reaching ~50% of one experiment's facility for '03/'04
- LHC Software & Computing Steering Committee (SC2) + TAGs to oversee deployment of the entire computing structure
- MoU describing funding of, and responsibility for, hardware and software
- Interim MoU to be signed prior to the MoU (software, prototype)

Slide 4: Multi-Tier Hierarchical Model
[Diagram: Tier-0 at CERN; national Tier-1 regional centres (regions D, F, I, UK, plus CERN); Tier-2 production centres at institutes (e.g. MAP); Tier-3 institute servers and desktops]
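The hierarchy can be pictured as a simple tree of centres. The sketch below is purely illustrative and not from the original slides; the site names other than CERN are placeholders, not the actual regional-centre assignments.

    # Illustrative representation of the multi-tier computing model as a tree of centres.
    from dataclasses import dataclass, field

    @dataclass
    class Centre:
        name: str
        tier: int
        children: list["Centre"] = field(default_factory=list)

    # Hypothetical example hierarchy; names other than CERN are placeholders.
    cern = Centre("CERN", tier=0, children=[
        Centre("Regional centre UK", tier=1, children=[
            Centre("Institute production farm", tier=2, children=[
                Centre("Institute desktop/server", tier=3),
            ]),
        ]),
        Centre("Regional centre D", tier=1),
    ])

    def walk(centre: Centre, indent: int = 0) -> None:
        """Print the hierarchy, one centre per line."""
        print("  " * indent + f"Tier-{centre.tier}: {centre.name}")
        for child in centre.children:
            walk(child, indent + 1)

    walk(cern)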

Slide 5: Rates and Installed Capacities
[Table, per experiment (ALICE, ATLAS, CMS, LHCb) and total: event size (MB), raw data/year (PB), MC data/year (PB), tape/disk/CPU at CERN (TB, TB, kSI95), tape/disk/CPU worldwide (TB, TB, kSI95), WAN Tier-0/Tier-1 bandwidth (Mb); numeric values not preserved]

Slide 6: Manpower (FTEs) for CORE Software in 2000

  Experiment   Have   (Missing)
  ALICE        12     (5)
  ATLAS        23     (8)
  CMS          15     (10)
  LHCb         14     (5)
  Total        64     (28)

- CERN/IT: current staff complement, minimum required to run the centre, predicted complement (figures not preserved)
- Only computing professionals counted

Slide 7: Hardware Costs of CERN Computing, '05-'07
[Table, units kCHF, per experiment (ALICE, ATLAS, CMS, LHCb): CPU, disk, robotic tape, shelf tape, total cost; numeric values not preserved]
- Costs spread over '05 (30%), '06 (30%), '07 (40%) (see the cross-check below)
- LHCb Tier-1s: ... kSFr (74%)
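As a quick cross-check of the figures quoted on slides 3 and 7, the short sketch below reproduces the LHCb share of the 240 MCHF total and spreads the LHCb cost over the 30/30/40 profile. This is illustrative arithmetic added to the transcript, not part of the slides.

    # Cross-check of cost figures quoted in the review summary (illustrative).
    TOTAL_MCHF = 240.0        # total LHC hardware cost (slide 3)
    LHCB_MCHF = 27.0          # LHCb share (slide 3)
    PROFILE = {"2005": 0.30, "2006": 0.30, "2007": 0.40}   # spending profile (slide 7)

    share = LHCB_MCHF / TOTAL_MCHF
    print(f"LHCb share: {share:.1%}")            # ~11.2%, consistent with '~11%'

    for year, fraction in PROFILE.items():
        print(f"{year}: {LHCB_MCHF * fraction:.1f} MCHF")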

Slide 8: Joint Prototype
- Use testbed to test at realistic scales:
  - Fabric management
  - Data challenges with realistic rates
  - Scalability tests of CPU and I/O performance
  - New technologies: copper gigabit, new tapes, IA-64
  - Data Grid functionality
- LHCb data challenges:
  - July '02: functional OO software
  - July '02: DC events in ~2 weeks
  - Dec '02: Computing TDR
  - July '03: DC events in ~2 weeks (DataGrid milestone)
  - Dec '04: Software Production Readiness Review
  - July '05: DC events (full test of software & infrastructure)

Slide 9: CERN Testbed Plans
[Table, milestones at 4Q '00, 4Q '01, 4Q '02: WAN links (Mbps), tape I/O rate (GB/s), disk I/O rate (GB/s), tape capacity (PB), disk capacity (TB), number of dual-CPU systems; most numeric values not preserved]

Slide 10: Observations and conclusions
- Waiting for response from CERN management:
  - Guidelines on construction and cost sharing of the prototype
  - Timescale for Computing TDR and MoU
  - Allocation of additional new effort to IT and the experiments
  - Role and composition of SC2 and timescale for launch (a data management project is already in preparation)
- Communication with funding agencies:
  - Discussions at LHCC and RRBs; preparation of the Interim MoU
  - Responsibilities for core software (sharing policy)
  - Advance notice of the long-term computing plan (cost sharing)
  - Policy of access to centres outside CERN
- Preparation of the distributed computing infrastructure:
  - Development of the analysis model (physics use cases)
  - Development of grid services and their integration in GAUDI
  - Preparation of data challenges

Slide 11: Projects
- Event Filter Farm (Computing)
  - Control and management of the farm, installation, scalability
  - Specialisation of GAUDI to the filter-farm environment
- Software Framework (GAUDI)
  - Event model: development and optimisation
  - Detector description: development and optimisation of the geometry
  - Scripting component to allow interactive analysis based on Python (see the sketch after this list)
  - Grid services
  - Data management (event data, conditions data, bookkeeping)
- Physics frameworks
  - Simulation framework using GEANT4 (coordination)
  - Analysis framework (coordination)
  - High-level trigger framework (coordination)
- Tools/utilities
  - Software and data quality monitoring
  - Documentation, workbooks
- ...
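As an indication of what the interactive scripting component could look like from the user's side, here is a minimal, purely hypothetical sketch; the class and method names (ApplicationShell, add_algorithm, run) are invented for illustration and do not reflect the actual GAUDI scripting interface.

    # Purely illustrative sketch of an interactive scripting layer on top of an
    # event-processing framework. All names are hypothetical; they do not
    # reflect the real GAUDI API.

    class ApplicationShell:
        """Toy stand-in for a framework application steered from Python."""

        def __init__(self) -> None:
            self.algorithms: list[str] = []   # names of configured algorithms
            self.events_processed = 0

        def add_algorithm(self, name: str) -> None:
            """Register an algorithm to run in the event loop."""
            self.algorithms.append(name)

        def run(self, n_events: int) -> None:
            """Run the event loop for n_events events."""
            for _ in range(n_events):
                # In a real framework, each configured algorithm would be
                # executed here on the current event.
                self.events_processed += 1
            print(f"Processed {self.events_processed} events "
                  f"with algorithms: {', '.join(self.algorithms)}")

    # Interactive-style usage, e.g. from a Python prompt:
    app = ApplicationShell()
    app.add_algorithm("TrackFit")
    app.add_algorithm("SelectSignalCandidates")
    app.run(100)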