The Particle Physics Computational Grid
Paul Jeffreys / CCLRC
System Managers Meeting, 21 March 2000


Slide 1: The Particle Physics Computational Grid, Paul Jeffreys/CCLRC

Slide 2: Financial Times, 7 March 2000

Slide 3: Front page of the FT, 7 March 2000

Slide 4: LHC Computing: Different from Previous Experiment Generations
- Geographical dispersion: of people and resources
- Complexity: the detector and the LHC environment
- Scale: petabytes per year of data
- (NB: for the purposes of this talk, mostly LHC-specific)
~5000 physicists, 250 institutes, ~50 countries
Major challenges associated with:
- Coordinated use of distributed computing resources
- Remote software development and physics analysis
- Communication and collaboration at a distance
R&D: a new form of distributed system, the Data-Grid

Slide 5: The LHC Computing Challenge, by example
Consider a UK group searching for the Higgs particle in an LHC experiment:
- Data flow off the detectors at 40 TB/s (30 million floppies per second)! A rejection factor of c. 5x10^5 is applied online before writing to media.
  - But we have to be sure we are not throwing away the physics with the background.
  - We need to simulate samples to exercise the rejection algorithms. Simulation samples will be created around the world; common access is required.
- After 1 year, a 1 PB sample of experimental events is stored on media. The initial analysed sample will be at CERN, in due course elsewhere.
- The UK has particular detector expertise (CMS: e-, e+, γ).
- Apply our expertise to: access the 1 PB of experimental data (located where?), re-analyse the e.m. signatures (where?) to select c. 1 in 10^4 Higgs candidates, though the S/N will be c. 1 to 20 (continuum background), and store the results (where?).
- Also: access some simulated samples (located where?), generate (where?) additional samples, store them (where?) -- PHYSICS (where?)
- In addition: strong competition.
- Desire to implement the infrastructure in a generic way.
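
A minimal sketch of the arithmetic behind these numbers. Only the 40 TB/s raw rate, the c. 5x10^5 online rejection and the ~1 PB/year figure come from the slide; the 10^7 seconds of effective running per year is an assumption used purely for illustration.

```python
# Back-of-the-envelope check of the rates quoted on this slide.
raw_rate_tb_per_s = 40            # TB/s off the detectors (from the slide)
online_rejection = 5e5            # rejection factor before writing to media (from the slide)
live_seconds_per_year = 1e7       # ASSUMPTION: effective running time per year

write_rate_gb_per_s = raw_rate_tb_per_s * 1e3 / online_rejection
stored_per_year_pb = write_rate_gb_per_s * live_seconds_per_year / 1e6

print(f"rate to media: ~{write_rate_gb_per_s * 1000:.0f} MB/s")
print(f"stored per year: ~{stored_per_year_pb:.1f} PB")   # consistent with the ~1 PB/year on the slide
```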

Slide 6: Proposed Solution to the LHC Computing Challenge (?)
A data-analysis 'Grid' for High Energy Physics.
[Diagram: tiered hierarchy with CERN at the centre, Tier-1 and Tier-2 centres, and Tier-3/Tier-4 sites fanning out below them.]

Slide 7: Access Patterns
Typical particle physics experiment in 2000-2005: one year of acquisition and analysis of data.
Data products:
- Raw data: ~1000 TB
- Reco-V1: ~1000 TB; Reco-V2: ~1000 TB
- ESD-V1.1, ESD-V1.2, ESD-V2.1, ESD-V2.2: ~100 TB each
- AOD: ~10 TB
Access rates (aggregate, average):
- 100 MB/s (2-5 physicists)
- 500 MB/s (5-10 physicists)
- 1000 MB/s (~50 physicists)
- 2000 MB/s (~150 physicists)
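
A rough sketch of what these aggregate rates mean in practice: how long one full pass over a data product would take. The pairing of each rate with a dataset below is an assumption made purely for illustration; the slide lists sizes and rates separately.

```python
# Time for a full pass over each data product at an assumed aggregate access rate.
datasets_tb = {"Raw data": 1000, "Reco-V1": 1000, "ESD-V1.1": 100, "AOD": 10}
rates_mb_per_s = [100, 500, 1000, 2000]   # ASSUMED pairing, for illustration only

for (name, size_tb), rate in zip(datasets_tb.items(), rates_mb_per_s):
    seconds = size_tb * 1e6 / rate        # TB -> MB, then divide by MB/s
    print(f"{name}: {size_tb} TB at {rate} MB/s -> {seconds / 86400:.1f} days per full pass")
```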

Slide 8: Hierarchical Data Grid
Physical:
- Efficient network/resource use: local > regional > national > oceanic
Human:
- University/regional computing complements the national labs, which in turn complement the accelerator site; it is easier to leverage resources, maintain control and assert priorities at the regional/local level
- Effective involvement of scientists and students independently of location
The 'challenge for UK particle physics': how do we:
- Go from today's maximum of a 200-box PC99 farm to a 10000-box PC99 centre?
- Connect to and participate in the European and worldwide particle physics grid?
- Write the applications needed to operate within this hierarchical grid?
AND
- Ensure other disciplines are able to work with us, that our developments and applications are made available to others, that expertise is exchanged, and that we enjoy fruitful collaboration with computer scientists and industry?

Slide 9: Quantitative Requirements
- Start with a typical experiment's computing model and the UK Tier-1 Regional Centre specification
- Then consider the implications for the UK Particle Physics Computational Grid
  - Over the years 2000, 2001, 2002, 2003
  - Joint Infrastructure Bid made for resources to cover this
  - Estimates of costs
- Look further ahead

Slides 10-17: [no text transcribed]

Slide 18: Steering Committee
'Help establish the Particle Physics Grid activities in the UK'
a. An interim committee will be put in place.
b. The immediate objectives are to prepare for the presentation to John Taylor on 27 March 2000, and to co-ordinate the EU 'Work Package' activities for April 14.
c. After discharging these objectives, membership will be re-considered.
d. The next action of the committee will be to refine the Terms of Reference (presented to the meeting on 15 March).
e. After that, the Steering Committee will be charged with commissioning a Project Team to co-ordinate the Grid technical work in the UK.
f. The interim membership is:
   - Chairman: Andy Halley
   - Secretary: Paul Jeffreys
   - Tier 2 reps: Themis Bowcock, Steve Playfer
   - CDF: Todd Hoffmann
   - D0: Ian Bertram
   - CMS: David Britton
   - BaBar: Alessandra Forti
   - CNAP: Steve Lloyd
- The 'labels' against the members are not official in any sense at this stage, but the members are intended to cover these areas approximately!

Slide 19: UK Project Team
Need to really get underway! System Managers are crucial!
PPARC needs to see genuine plans and genuine activities...
Must coordinate our activities, and:
- Fit in with CERN activities
- Meet the needs of the experiments (BaBar, CDF, D0, ...)
So... go through the range of options and then discuss...

Slide 20: EU Bid (1)
A bid will be made to the EU to link national grids.
- The "process" has become more than 'just a bid': we have almost reached the point where we have to be an active participant in the EU bid, and its associated activities, in order to access data from CERN in the future.
- Decisions need to be taken today...
Timescale:
- March 7: Workshop at CERN to prepare the programme of work (RPM)
- March 17: Editorial meeting to look for industrial partners
- March 30: Outline of the paper used to obtain pre-commitment of partners
- April 17: Finalise 'Work Packages' (see next slides)
- April 25: Final draft of the proposal
- May 1: Final version of the proposal for signature
- May 7: Submit

Slide 21: EU Bid (2)
The bid was originally for 30 MECU, with a matching contribution from national funding organisations.
- Now scaled down, possibly to 10 MECU
- Possibly as a 'taster' before a follow-up bid?
- EU funds for Grid activities in Framework VI are likely to be larger
Work Packages have been defined.
- The objective is that countries (through named individuals) take responsibility for splitting up the work and defining deliverables within each, to generate draft content for the EU bid
- BUT without doubt the same people will be well positioned to lead the work in due course... and funds split accordingly?? Considerable manoeuvring!
- UK: need to establish priorities and decide where to contribute...

Slide 22: Work Packages
Middleware:
- WP1 Grid Work Scheduling: Cristina Vistoli (INFN)
- WP2 Grid Data Management: Ben Segal (CERN)
- WP3 Grid Application Monitoring: Robin Middleton (UK)
- WP4 Fabric Management: Tim Smith (CERN)
- WP5 Mass Storage Management: Olof Barring (CERN)
Infrastructure:
- WP6 Testbed and Demonstrators: François Etienne (IN2P3)
- WP7 Network Services: Christian Michau (CNRS)
Applications:
- WP8 HEP Applications: Hans Hoffmann (4 experiments)
- WP9 Earth Observation Applications: Luigi Fusco
- WP10 Biology Applications: Christian Michau
Management:
- WP11 Project Management: Fabrizio Gagliardi (CERN)
Robin is a 'place-holder', holding the UK's interest (explanation in the Open Session).

Slide 23: UK Participation in Work Packages
MIDDLEWARE
1. Grid Work Scheduling
2. Grid Data Management: TONY DOYLE, Iain Bertram?
3. Grid Application Monitoring: ROBIN MIDDLETON, Chris Brew
4. Fabric Management
5. Mass Storage Management: JOHN GORDON
INFRASTRUCTURE
6. Testbed and Demonstrators
7. Network Services: PETER CLARKE, Richard Hughes-Jones
APPLICATIONS
8. HEP Applications

Slides 24-27: PPDG (no further text transcribed)

Slide 28: LHCb contribution to the EU proposal (HEP Applications Work Package)
Grid testbed in 2001, 2002. Production of 10^6 simulated b -> D*pi decays:
- Create 10^8 events at the Liverpool MAP farm in 4 months
- Transfer 0.62 TB to RAL
- RAL dispatches AOD and TAG datasets to other sites: 0.02 TB to Lyon and CERN
Then this permits a study of all the various options for performing a distributed analysis in a Grid environment (a transfer-time sketch follows below).
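
A minimal sketch of the transfer-time arithmetic for these data movements. Only the 0.62 TB and 0.02 TB figures come from the slide; the link speeds (2, 10 and 34 Mbit/s) and the 50% usable-bandwidth factor are assumptions chosen to illustrate the scale of the problem on circa-2000 academic networks.

```python
# Rough transfer-time estimates under assumed link speeds and utilisation.
def transfer_days(size_tb: float, link_mbps: float, efficiency: float = 0.5) -> float:
    """Days to move `size_tb` over a `link_mbps` link at the given utilisation."""
    size_bits = size_tb * 1e12 * 8
    seconds = size_bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

for size_tb, route in [(0.62, "Liverpool -> RAL"), (0.02, "RAL -> Lyon/CERN")]:
    for link_mbps in (2, 10, 34):          # ASSUMED usable link speeds in Mbit/s
        print(f"{route}: {size_tb} TB over {link_mbps} Mbit/s "
              f"~ {transfer_days(size_tb, link_mbps):.1f} days")
```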

Slide 29: American Activities
Collaboration with Ian Foster:
- Transatlantic collaboration using GLOBUS
Networking:
- QoS tests with SLAC
- Also link in with GLOBUS?
CDF and D0:
- A real challenge to 'export data'
- Have to implement a 4 Mbps connection
- Have to set up a mini Grid
BaBar:
- Distributed Linux farms etc. in the JIF bid
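
A quick illustration of why exporting CDF and D0 data is called a real challenge. The 4 Mbit/s link speed is from the slide; the 50% average utilisation is an assumption.

```python
# Daily and yearly export capacity of a 4 Mbit/s link at assumed 50% utilisation.
link_mbps = 4
utilisation = 0.5                                   # ASSUMPTION
bytes_per_day = link_mbps * 1e6 * utilisation / 8 * 86400

print(f"~{bytes_per_day / 1e9:.0f} GB/day")         # roughly 20 GB per day
print(f"~{bytes_per_day * 365 / 1e12:.1f} TB/year")
```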

Slides 30-33: Networking Proposal, parts 1-4 (no further text transcribed)

Slide 34: Pulling it together...
Networking:
- EU work package
- Existing tests
- Integration of the ICFA studies into the Grid
- Will networking lead the non-experiment activities??
Data storage:
- EU work package
Grid application monitoring:
- EU work package
CDF, D0 and BaBar:
- Need to integrate these into the Grid activities
- The best approach is to centre on the experiments

Slide 35: ...Pulling it all together
Experiment-driven:
- Like LHCb, meet specific objectives
Middleware preparation:
- Set up GLOBUS? At QMW, RAL, DL..?
- Authenticate
- Become familiar with it
- Try moving data between sites
  - Resource specification
  - Collect dynamic information (see the sketch after this slide)
  - Try with international collaborators
- Learn about alternatives to GLOBUS
- Understand what is missing
- Exercise and measure the performance of distributed caching
What do you think? Anyone like to work with Ian Foster for 3 months?!
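
A hypothetical sketch, not Globus code, of what "resource specification" plus "collect dynamic information" could amount to: each site publishes a static specification and some dynamic figures, and a job is matched to a suitable site. The site names (RAL, QMW, DL) are taken from the slide; every number and field is invented for illustration.

```python
# Hypothetical sketch only: not the Globus API. Illustrates matching a job's
# resource specification against dynamic information published by each site.
from dataclasses import dataclass
from typing import List

@dataclass
class SiteInfo:
    name: str
    cpus: int             # static resource specification
    free_disk_tb: float   # dynamic information, refreshed periodically
    load: float           # fraction of CPUs busy, 0.0 to 1.0

def pick_site(sites: List[SiteInfo], needed_disk_tb: float) -> SiteInfo:
    """Choose the least-loaded site with enough free disk for the job."""
    candidates = [s for s in sites if s.free_disk_tb >= needed_disk_tb]
    if not candidates:
        raise RuntimeError("no site satisfies the resource specification")
    return min(candidates, key=lambda s: s.load)

# Invented numbers, purely for illustration.
sites = [
    SiteInfo("RAL", cpus=200, free_disk_tb=5.0, load=0.7),
    SiteInfo("QMW", cpus=100, free_disk_tb=1.0, load=0.4),
    SiteInfo("DL",  cpus=50,  free_disk_tb=2.0, load=0.2),
]
print(pick_site(sites, needed_disk_tb=1.5).name)   # -> DL
```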

