
1 Laura Gilbert Particle Physics and Grid – A Virtualisation Study

2 Agenda
–What are we looking for? Exciting physics!
–Overview of particle accelerators and the ATLAS detector.
–The computing challenge: data-taking and simulation.
–Where do the Grid and virtualisation come in?
–Presentation of results: the virtualisation study demonstration.

3 What are we looking for? The Standard Model: we have detected matter and force particles. We think we know how these interact with each other.
Generation:   I    II   III
Quarks:       u    c    t
              d    s    b
Leptons:      e    μ    τ
              νe   νμ   ντ
Forces:       g    γ    Z    W

4 What are we looking for? We are looking for a “theory of everything”. So what’s missing?
–Are these particles “fundamental”?
–Are there more?
–How do we get mass?
–What is gravity? (force particle? superstrings?)
–Why is there more matter than antimatter in the universe?

5 Looking for New Physics Looking for particles we have never seen before. If these particles exist, we haven’t seen them because we didn’t have enough energy to make them. Mass can be created from energy (Einstein) → heavy particles can be made from lighter ones with lots of kinetic energy. ATLAS will work at higher energies than ever before.
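To make the Einstein argument concrete, here is a minimal sketch (illustrative numbers only; the 7 TeV beam energy is the figure quoted on slide 8) of how much mass-energy a head-on collision makes available:

    # Sketch: how heavy a particle could two colliding beams create?
    # Particle physicists quote mass and energy in the same units (eV)
    # because E = m c^2; a head-on collision of equal beams makes
    # twice the beam energy available.
    beam_energy_tev = 7.0                        # one LHC proton beam
    available_energy_tev = 2 * beam_energy_tev   # head-on collision
    proton_mass_tev = 0.000938                   # proton mass ~0.938 GeV
    print(f"Energy available: {available_energy_tev} TeV")
    print(f"That is ~{available_energy_tev / proton_mass_tev:.0f}x the proton's own mass")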

6 Particle Accelerators
[Diagram: a “bunch” of positive protons passing through a linear array of charged plates]
Linear array of plates with holes: an alternating high-voltage field is applied. As particles approach a plate they are accelerated towards it by the opposite charge on the plate. As they pass through the plate, the polarity is switched: the plate now repels them, and they are accelerated towards the next plate.
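A minimal sketch of the energy bookkeeping this implies, assuming a hypothetical gap voltage: a particle of charge q gains kinetic energy qV at each gap it crosses with the right polarity, so N gaps give N·qV in total.

    # Sketch: kinetic energy gained by a proton crossing N accelerating gaps.
    # The 100 kV gap voltage and 50 gaps are made-up illustrative values.
    gap_voltage_v = 100e3      # volts per gap (hypothetical)
    n_gaps = 50                # number of plates the bunch passes through
    charge_e = 1               # proton charge in units of e
    # With energies in electron-volts, a charge of 1e crossing V volts gains V eV.
    energy_ev = charge_e * gap_voltage_v * n_gaps
    print(f"Energy gained: {energy_ev / 1e6:.1f} MeV")   # 5.0 MeV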

7 Particle Accelerators To allow greater acceleration the accelerator is circular. The path of a charged particle is curved in the presence of a magnetic field, so the proton tracks are bent around the ring using dipole magnets. Electric fields speed up the particles; magnetic fields curve their paths.
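To see why strong dipole magnets and a huge ring are needed, here is a rough calculation of the bending radius r = p/(qB) for an ultra-relativistic proton (a sketch; the 8.33 T field is an assumed LHC-like dipole value, not a number from the slides):

    # Sketch: bending radius of a 7 TeV proton in a dipole field, r = p / (q B).
    E_ev = 7e12                # proton energy in eV (ultra-relativistic: p ~ E/c)
    e = 1.602e-19              # elementary charge, C
    c = 2.998e8                # speed of light, m/s
    B = 8.33                   # dipole field in tesla (assumed LHC-like value)
    p = E_ev * e / c           # momentum in SI units, kg m/s
    r = p / (e * B)
    print(f"Bending radius: {r / 1000:.1f} km")   # ~2.8 km -> a very large ring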

8 CERN (birthplace of the World Wide Web!)
[Diagram: the path of the LHC, roughly 8.5 km across and 100 m below ground; ATLAS sits where the proton beams collide, fed by the SPS]
The Super Proton Synchrotron (SPS) accelerates protons. The Large Hadron Collider (LHC) accelerates them further and collides them. A television is an accelerator in which electrons gain around 10 keV (10 000 eV). The LHC will accelerate protons to around 7 TeV (7 000 000 000 000 eV).

9 The experiment at ATLAS Two protons collide at very high energy…

10 The experiment at ATLAS The structure of a proton: three quarks (two “up”, one “down”) held together by “gluons”, plus a “sea” of virtual particles: individually variable, statistically constant (depending on energy).

11 The experiment at ATLAS Two protons collide at very high energy. Exactly what happens at the point of collision is unknown, but models exist which mimic the outcome of real collisions.

12 Detecting and Identifying particles Particles can be identified (almost) UNIQUELY by their mass and charge. These are what we need to measure. Different types of particle interact in different ways, on different timescales. We need a physically very large detector with many different components.
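The mass measurement rests on the relativistic relation m² = E² − p² (natural units, c = 1): measure a particle's energy and momentum and its mass follows. A minimal sketch, not ATLAS code:

    import math

    def invariant_mass(energy_gev, momentum_gev):
        """Mass from measured energy and momentum, m^2 = E^2 - p^2
        (natural units, c = 1). Illustrative helper only."""
        return math.sqrt(max(energy_gev**2 - momentum_gev**2, 0.0))

    # A particle measured with E = 10 GeV and p = 9.9995 GeV:
    print(f"mass ~ {invariant_mass(10.0, 9.9995):.3f} GeV")  # ~0.1 GeV -> muon-like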

13 ATLAS Summary Particle physics experiment, due to start taking data in 2007. Higher energies than ever recorded before. Largest collaborative effort ever attempted in the physical sciences: 1850 physicists at 150 universities in 34 countries. Expected cost around $400 million; computing hardware budget (CERN alone) around $20 million.
[Diagram: the ATLAS detector, 44m by 22m – electromagnetic calorimeters, hadron calorimeters, magnets (to curve charged particle tracks) and muon detectors]

14 The Computing Challenge Modern particle physics experiments are highly CPU-intensive, in three areas: data-taking, simulation and analysis.

15 Data Taking Data-taking is the process of recording and refining the output from the detector once it is up and running.
–ATLAS will have around 16 million readout channels, producing more than 1 MB of raw data for every event recorded.
–Beams will collide every 25 ns.
–An on-line trigger reduces the recording rate to several hundred events per second, selecting and recording only obviously useful events.
–The total amount of data recorded per year will be of the order of petabytes.
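Putting these figures together gives the scale (a back-of-envelope sketch; the 200 Hz recorded rate and 10^7 live seconds per year are assumed round numbers, not values from the slides):

    # Sketch: yearly data volume from the slide's figures.
    crossing_rate_hz = 1 / 25e-9        # collisions every 25 ns -> 40 MHz
    recorded_rate_hz = 200              # "several hundred a second" (assumed 200)
    event_size_mb = 1.0                 # >1 MB raw data per event
    seconds_per_year = 1e7              # assumed accelerator live-time per year
    volume_pb = recorded_rate_hz * event_size_mb * seconds_per_year / 1e9
    print(f"Trigger rejection: {crossing_rate_hz / recorded_rate_hz:,.0f}:1")  # 200,000:1
    print(f"Data volume: ~{volume_pb:.0f} PB/year")                            # order of petabytes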

16 Simulation Mimicking the way a real event would look in the real detector. The simulation of the ATLAS detector is vital:
1) During the design phase: to develop an optimal detector (within the constraints of “technology, survivability and finances”).
2) During the data-taking phase: simulations become important for calibration and for understanding the data.

17 Simulation Three phases:
1) Event generation.
2) Detector simulation:
–ATLFAST: simple (smearing only), fast (~100,000 events/hr).
–GEANT4: very detailed, includes electronics response and digitisation, cooling and support structures; slow (~1 event/hr). Simulated data is equivalent in format to the “real” data.
3) Event reconstruction.
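The quoted rates make the trade-off concrete; a sketch of the CPU cost of a million-event sample at the two simulation speeds:

    # Sketch: CPU time to simulate one million events at the quoted rates.
    n_events = 1_000_000
    atlfast_rate = 100_000       # events/hour (fast, smearing only)
    geant4_rate = 1              # events/hour (full detector simulation)
    print(f"ATLFAST: {n_events / atlfast_rate:.0f} CPU-hours")     # ~10 hours
    print(f"GEANT4:  {n_events / geant4_rate:,.0f} CPU-hours")     # ~10^6 -> needs a Grid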

18 Analysis Users write code which is applied to the data: it searches for the signal of a chosen event type and removes backgrounds. Example: I have generated a million W bosons through ATLFAST, demonstrated that I can select them, and displayed a histogram of the reconstructed “transverse” mass (which peaks near the real W mass of 80 GeV).
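For illustration, the standard W transverse-mass formula such an analysis reconstructs, m_T = sqrt(2 pT(l) pT(ν) (1 − cos Δφ)), as a toy sketch (the 40 GeV momenta are made-up example values, not the study's data):

    import math

    def transverse_mass(pt_lepton, pt_neutrino, dphi):
        """W -> lepton + neutrino transverse mass (GeV),
        m_T = sqrt(2 pT_l pT_nu (1 - cos(dphi))). Toy sketch, not the ATLAS analysis."""
        return math.sqrt(2 * pt_lepton * pt_neutrino * (1 - math.cos(dphi)))

    # Toy event: both transverse momenta ~40 GeV and back-to-back in phi
    # gives m_T at the W-mass edge of ~80 GeV.
    print(f"m_T = {transverse_mass(40.0, 40.0, math.pi):.1f} GeV")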

19 Where does the Grid come in? On-the-fly data-taking (Tier 0); compute-intensive simulation programs (Tiers 0 and 1); IO-intensive analysis (Tiers 0, 1 and 2). The LHC Computing Grid Project (LCG) is a collaboration with:
–5,000 CPUs
–4,000 TB of storage
–70 sites around the world
–4,000 simultaneous jobs
–monitoring via the Grid Operations Centre (RAL)
Security issues make virtualised clusters vital; virtualisation had not been tested on HEP applications before.

20 Testing at Dell, Austin Benchmarking of PE2650 and PE6650 servers with ATLAS software (GEANT simulations); the PE2650 is ~2x faster than the 6650. Cluster set-up: one PE6650 master node, eight PE2650 compute elements. Measured the overhead of:
–Hyper-threading on PE2650s (2.4±1.1%) and PE6650s (3.5±2.6%)
–Sun Grid Engine on PE2650s and PE6650s: negligible
–VMware ESX on a PE2650 (11.1±3.8% for a single instance)
New study: comparison of simulation speeds over the cluster (batch jobs) with and without VMware, increasing the number of VMware instances on the cluster. Paper submitted to the 19th International Parallel and Distributed Processing Symposium.
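The overhead figures follow from comparing mean run-times with and without virtualisation. A sketch of the bookkeeping, with first-order error propagation on the ratio (the run-times below are hypothetical placeholders; only the method is illustrated):

    import math

    # Sketch: virtualisation overhead from benchmark run-times.
    # Mean +/- std run-times in seconds; both pairs are made-up placeholders.
    t_bare, s_bare = 3600.0, 60.0      # native run
    t_vm,   s_vm   = 4000.0, 90.0      # same job under VMware
    ratio = t_vm / t_bare
    overhead_pct = (ratio - 1) * 100
    # First-order error propagation for a ratio of independent measurements:
    err_pct = ratio * math.hypot(s_vm / t_vm, s_bare / t_bare) * 100
    print(f"Overhead: {overhead_pct:.1f} +/- {err_pct:.1f} %")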

21 Virtualising a Cluster Comparison:
1) 8 physical nodes: 2 GB RAM, 2 CPUs, 2 queues each → total of 8 CEs, 16 queues.
2) 16 virtual nodes: 1 GB RAM, 1 CPU, 1 queue each → total of 16 CEs, 16 queues.
3) 32 virtual nodes: 0.5 GB RAM, two VMs sharing one CPU, 1 queue each → total of 32 CEs, 32 queues.
Batches of 16, 32, 64 and 92 jobs were sent and the total time-to-completion measured.
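A small sketch to make the comparison explicit: all three set-ups expose the same 16 physical CPUs, and only the number of compute elements (CEs) and queues changes (the field names below are my own labels, not the study's):

    # Sketch: the three cluster configurations compared (same 16 physical CPUs).
    configs = [
        {"name": "physical", "nodes": 8,  "cpus_per_node": 2.0, "queues_per_node": 2},
        {"name": "16 VMs",   "nodes": 16, "cpus_per_node": 1.0, "queues_per_node": 1},
        {"name": "32 VMs",   "nodes": 32, "cpus_per_node": 0.5, "queues_per_node": 1},
    ]
    for c in configs:
        print(f'{c["name"]:>8}: {c["nodes"]} CEs, '
              f'{c["nodes"] * c["queues_per_node"]} queues, '
              f'{c["nodes"] * c["cpus_per_node"]:.0f} CPUs total')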

22 Virtualising a Cluster Results: a 5.0±2.2% increase going from 1 to 2 VMs/CPU. Not useful unless you consider the MRT (mean response time), which improves by 7%. 8 physical → 16 virtual nodes: <11% increase. For large batches the virtualised cluster out-performs the physical one. This is not a memory issue or a re-ordering of jobs; perhaps a context-switching overhead?

23 Further Work? Other virtualisation software (GSX or Bochs); a VMware SMP licence; hardware (more memory, processors etc.); different applications.
Many thanks to the HPCC and Virtualisation teams at Dell, especially Saeed Iqbal, Ron Pepper, Garima Kochhar, Monica Kashyap, Rinku Gupta, Jenwei Hsieh, and Mark Cobban, my sponsor at Dell EMEA.

