UK Status & Planning for LHC(b) Computing
Andrew Halley, CERN
LHCb Software Week, 26th November 1999

Outline:
- Changes to sources of funding for computing in the UK
- Past and present computing resources
- Future plans for computing developments
26/11/99, LHCb Software Week at CERN, Andrew Halley (CERN)

Funding Sources for PP Computing in the UK

Until recently (the last two years), UK funding for particle physics computing had two components:
- direct funding to the individual University groups;
- central funding to the IT Group of the CCLRC at Rutherford Appleton Laboratory.

Outcomes:
- installed equipment at the Universities is small-scale but well tailored;
- a large facility exists at RAL, but it needs the experiments to motivate changes.

Enter the new concept from the UK Government: get the individual experiments and/or University groups to bid for (big) money.
New external sources of computing funding

Joint Research Equipment Initiative (JREI):
- The aim of JREI is to contribute to the physical research infrastructure and to enable high-quality research to be undertaken, particularly in areas of basic and strategic priority for science and technology, such as those identified by Foresight.
- £99M in 1999, the 4th round.

Joint Infrastructure Fund (JIF):
- £700M over three years.
- The money will enable universities to finance essential building, refurbishment and equipment projects, to ensure that they remain at the forefront of international scientific research.
Personal summary: PPARC computing JREI bids

The following is a summary of the information available from various sources, including PPARC.
Personal summary: PPARC computing JIF bids

The following is a summary of the information available from various sources, excluding PPARC.
Particle Physics Bids (1)

BaBar JREI 98 - awarded:
- £800K for disk and servers at 10 UK sites;
- 12.5 TB RAID, ~10 TB usable;
- Sun won the tender; installation soon.

LHCb JREI 98 - awarded:
- MAP, the Monte Carlo Array Processor: 300 Linux PCs in a custom-built chassis.

CDF JIF 98 - submitted December 98, postponed until the next round:
- at FNAL: T-Quarc, 10 TB disk, 4 SMP workstations;
- at RAL: 5 TB disk, 5 TB tape, an SMP, and a line to FNAL;
- at 4 universities: a single-CPU machine and 1.7 TB of disk.
Particle Physics Bids (2)

BaBar JIF 99 - submitted April 99:
- line to SLAC, computers for analysis, a big SMP at RAL, smaller SMPs at each site, Linux farm(s) for simulation.

LHCb JREI 99 - submitted May 99:
- 40 PCs with 1 TB of disk each, to store and analyse the data generated by MAP.
Timescales and deadlines for bid procedures

- The 1999 bids had to be submitted by Spring 1999, with decisions expected around November 1999.
- The most recent round had to be submitted by May 1999, with decisions not expected before January 2000.
- The next round's submission date is 11th October 1999, with the "decision point" expected in March 2000.
Current Computing Resources in the UK

Considering the central facilities currently available:
- a large central datastore, combining a large disk pool with backing store and tape robots;
- central CPU farms and servers, currently comprising:
  - the CSF facility, based on Hewlett-Packard processors;
  - a Windows NT facility, based on Pentium processors;
  - an upgraded Linux-CSF facility, based on Pentium processors.

In addition, the home Universities have considerable power in workstation clusters and dedicated farms, often "harvested" by software robots which serve out tasks remotely.
Current usage statistics of the RAL datastore

Typically ~10-15 TB is accessible from the datastore, but only ~5 TB has been in active use at any given time recently.
Usage of the HP CSF facility at RAL

As an example snapshot of the use of the service: from April '99 to September '99, average use was ~80%.
Linux CSF farm and its usage at RAL

The Linux farm now consists of forty Pentium II 450 MHz CPUs, each with 256 MB of memory, a 10 GB fast local disk, and 100 Mb/s fast Ethernet.

[Usage plot: load shown against maximum capacity.]

The farm is currently well used by the active experiments, and has excellent potential for upgrades.
Windows NT Farm and usage at RAL

- Ten dual-processor machines with 450 MHz CPUs have been added to the farm.
- The upgrade increases the capacity of the farm by a factor of ~5.
- The service is used heavily by both ALEPH and LHCb for MC production.
- It will be used as part of LHCb plans to generate large numbers (10^6) of inclusive bbar events in the near future.
- Automatic job submission software has been set up for LHCb.
- System software replication has been set up, so it is now very easy to extend the system as appropriate.
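The slides do not describe how the LHCb automatic job submission works internally, but the core idea can be sketched as follows: split a large production request (e.g. 10^6 inclusive bbar events) into fixed-size jobs, each with a unique random seed, and hand each one to a farm-specific submission hook. All names here (`make_jobs`, `submit_all`) are illustrative, not the actual LHCb tooling.

```python
# Hypothetical sketch of automatic MC job submission: split a production
# request into per-node jobs with unique seeds, then submit each one.

def make_jobs(total_events, events_per_job, base_seed=1000):
    """Split a production request into per-job descriptions."""
    jobs = []
    n_jobs = (total_events + events_per_job - 1) // events_per_job  # round up
    for i in range(n_jobs):
        jobs.append({
            "name": f"bbar_incl_{i:04d}",
            "events": min(events_per_job, total_events - i * events_per_job),
            "seed": base_seed + i,   # unique seed so jobs are statistically independent
        })
    return jobs

def submit_all(jobs, submit):
    """Hand every job to the farm-specific submission hook `submit`."""
    for job in jobs:
        submit(job)

if __name__ == "__main__":
    submitted = []
    jobs = make_jobs(total_events=1_000_000, events_per_job=50_000)
    submit_all(jobs, submitted.append)   # stand-in for a real batch-system submit
    print(len(submitted), sum(j["events"] for j in submitted))
```

The point of the "system software replication" bullet is visible here too: because jobs are uniform, adding nodes only changes how many jobs run concurrently, not how they are generated.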
New computing resources outside of RAL

On the basis of the new funding arrangements in the UK, the University of Liverpool was given funds to build MAP, a large MC processor based on cut-down Linux nodes:
- 300 processors: 400 MHz PII, 128 MB memory, 3 GB disk;
- D-Link 100BaseT Ethernet + hubs;
- commercial units, BUT custom boxes for packing and cooling.

The nodes are rack-mounted and run a stripped-down version of Red Hat Linux 5.2. The system is tailored for production, using 1 TB of locally mounted disk, but needs a corresponding solution for analysing the data locally.
Computing resources outside of RAL: MAP

[Diagrams, "the idea" and "in reality": the MAP Master sits on the external Ethernet and connects through 100BaseT hubs to the MAP Slaves.]

The system is scalable: it can be grown by adding more slaves and/or network hubs. It benefits from the bulk purchase of uniform hardware.
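The master/slave layout above can be sketched as a simple task farm. This is my own minimal illustration of the pattern, not MAP's actual control software: the master queues independent MC tasks, each slave repeatedly takes one and processes it, and scaling up simply means adding more slaves.

```python
# Minimal master/slave task farm in the spirit of MAP (illustrative only).
import queue
import threading

def run_farm(tasks, n_slaves, work):
    """Master queues tasks; each slave pulls and processes until none remain."""
    todo = queue.Queue()
    for t in tasks:
        todo.put(t)
    done = queue.Queue()

    def slave():
        while True:
            try:
                t = todo.get_nowait()  # take the next independent task
            except queue.Empty:
                return                 # no work left: this slave exits
            done.put(work(t))

    threads = [threading.Thread(target=slave) for _ in range(n_slaves)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()

    results = []
    while not done.empty():
        results.append(done.get())
    return sorted(results)

if __name__ == "__main__":
    # 30 independent "MC batches" farmed out to 10 slaves.
    print(len(run_farm(range(30), n_slaves=10, work=lambda t: t * t)))
```

Because the tasks share no state, the throughput grows with the number of slaves until the master (or the network hub) saturates, which is exactly the scalability claim on the slide.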
Future plans for CPU upgrades etc.

The intention is to develop the Linux farm at RAL:
- order 30 new dual-processor 600 MHz nodes to be added to the existing cluster;
- add more hardware around April/May next financial year to keep up with demand.

There are also plans to augment MAP at Liverpool with subsystems at additional LHCb UK sites, and to:
- develop COMPASS, a model for LHC analyses;
- use a fast Linux server to check a large disk pool: read/write speeds of 50/20 Mb/s, with over 1 TB of data space attached.
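The slides do not say how the 50/20 Mb/s disk-pool figures were measured; a check of this kind usually amounts to timing a large sequential write (flushed to the device) and a sequential read back. A small probe of my own devising, assuming a POSIX-like filesystem:

```python
# Hypothetical sequential-throughput probe for a mounted disk pool.
import os
import tempfile
import time

def throughput_mb_s(path, size_mb=8, block_kb=256):
    """Return (write_MB_s, read_MB_s) for a sequential test file at `path`."""
    block = b"\0" * (block_kb * 1024)
    n_blocks = size_mb * 1024 // block_kb

    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(n_blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())           # force the data out to the device
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):  # sequential read back
            pass
    read_s = time.perf_counter() - t0

    os.remove(path)                     # clean up the test file
    return size_mb / write_s, size_mb / read_s

if __name__ == "__main__":
    path = os.path.join(tempfile.gettempdir(), "pool_probe.bin")
    w, r = throughput_mb_s(path)
    print(w > 0 and r > 0)
```

In practice the test file must be much larger than the machine's RAM (or caches must be dropped) for the read figure to reflect the disks rather than the page cache; the small default here is only to keep the sketch quick to run.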
Future "plans" for LHC computing in the UK

Given the new funding arrangements in the UK, and the challenges facing us with the LHC computing needs:

[Diagram: CERN feeds a Tier-1 Regional Centre, which feeds Tier-2 Regional Centres and the Institutes, supported by a Service Centre.]

- The UK plans to operate a Tier-1 Regional Centre based at RAL, with several Tier-2 centres (such as MAP/COMPASS) at the Universities.
- An LHC-wide UK JIF bid will be submitted for capital funding for the years through to LHC start-up.
Ramping up the UK resources for the LHC

The resources needed depend somewhat on the computing models adopted by the experiments, but are currently:
- an additional tape robot will be purchased in 2003, allowing the datastore to be extended to 320 TB;
- network bandwidth to CERN is assumed to be 50 Mb/s in 2003, with similar performance achieved to the Tier-2 centres, increasing thereafter to 500 Mb/s.
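To give the bandwidth assumptions some scale, here is a back-of-envelope calculation (my own arithmetic, not from the slides) of how long a sustained transfer of a given data volume would take over the assumed CERN link. For example, moving 10 TB over a fully utilised 50 Mb/s link takes about 18.5 days; at 500 Mb/s, under 2 days.

```python
# Transfer-time arithmetic for the assumed CERN links (illustrative).

def transfer_days(volume_tb, link_mbps, efficiency=1.0):
    """Days to move volume_tb terabytes over a link_mbps megabit/s link.

    `efficiency` is the fraction of nominal bandwidth actually sustained.
    Decimal units throughout: 1 TB = 1e12 bytes, 1 Mb/s = 1e6 bit/s.
    """
    bits = volume_tb * 1e12 * 8
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400.0

if __name__ == "__main__":
    for mbps in (50, 500):   # the 2003 and later bandwidth assumptions
        print(mbps, round(transfer_days(10, mbps), 1))
```

This makes the motivation for the 500 Mb/s upgrade concrete: against a 320 TB datastore, the 50 Mb/s figure is only workable if most analysis happens at the Tier-1/Tier-2 centres rather than by shipping bulk data across the link.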
Tentative conclusions and summary

Clearly, the field is evolving quickly. The status can be broken down into:
- upgraded Linux (NT?) farms, roughly doubling in capacity every year or so, and increases in datastore size;
- new massive simulation facilities like MAP coming online, with analysis engines being developed to cope with the generated data rates;
- development of Tier-1 and Tier-2 data centres, with two orders of magnitude increases in stored data and CPU power, and 2-3 orders of magnitude improvements in network-access bandwidth.