Computing for ALICE at the LHC
Tom Dietel, University of Cape Town, for the ALICE Collaboration


Outline
Physics at the Large Hadron Collider: Higgs (ATLAS & CMS); Quark-Gluon Plasma (ALICE, +ATLAS, CMS)
Computing for ALICE: Present: processing LHC Run-1; Near future: Run-2; Long-term development: Run-3 (after 2018)

The Large Hadron Collider collides protons and lead ions at nearly the speed of light to study the most fundamental particles and their interactions. Experiments: CMS, LHCb, ATLAS, ALICE.

Search for the Higgs Boson
CHPC National Meeting, 4-6 Dec 2013, Tom Dietel
A quantum field fills the universe and gives mass to the elementary particles: W/Z bosons, quarks, leptons. The field implies a new particle → the Higgs boson.
Predicted in 1964 by Peter Higgs; R. Brout and F. Englert; G. S. Guralnik, C. R. Hagen, and T. W. B. Kibble.

ATLAS Higgs Candidate

Discovery of the Higgs Boson at the LHC
Spring 2010: start of data taking. 4 July 2012: discovery of a new particle. March 2013: it's a Higgs! October 2013: Nobel prize.
Extremely rare: a few hundred Higgs in a quadrillion (10^15) collisions.

Mass of the Proton - the other 99%
The proton contains 3 quarks: 2 up quarks (m_u ≈ 2.5 MeV) and 1 down quark (m_d ≈ 5 MeV). Yet the proton is far heavier than its quarks: 2u + 1d give a mass of ≈ 10 MeV, while m_p ≈ 938 MeV, about 100 times heavier. Where does the mass come from? Quantum Chromodynamics. Confinement: no free quarks.
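The arithmetic on this slide is easy to check; a minimal sketch using the quark masses quoted above (the proton mass is taken as the standard ≈ 938 MeV value):

```python
# Check: the three valence quarks account for only ~1% of the proton mass.
# Quark masses from the slide; proton mass is the standard PDG value.
m_u = 2.5    # MeV, up quark
m_d = 5.0    # MeV, down quark
m_p = 938.0  # MeV, proton

quark_sum = 2 * m_u + m_d   # two up quarks + one down quark
ratio = m_p / quark_sum     # how much heavier the proton is

print(f"sum of quark masses: {quark_sum} MeV")  # 10.0 MeV
print(f"proton / quarks: {ratio:.0f}x")         # ~94x
```

The remaining ~99% of the mass comes from the strong-interaction dynamics, as the slide says.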

Quark-Gluon Plasma
Compression reduces the distance between nucleons; heating thermally creates pions that fill the space between them. The hadrons overlap and the quarks roam freely (deconfinement): a Quark-Gluon Plasma.

Heavy-Ion Physics
Can the quarks inside the protons and neutrons be freed? What happens to matter when it is heated to about 100,000 times the temperature at the centre of the Sun? Why do protons and neutrons weigh 100 times more than the quarks they are made of?
→ collisions of heavy nuclei (Pb) at high energies

ALICE Event Display

CERN and South Africa
SA-CERN: home to all CERN research in South Africa; 5 universities + 1 national lab; more than 60 scientists.
ALICE: heavy-ion physics, quark-gluon plasma (UCT, iThemba).
ATLAS: particle physics, Higgs physics, SUSY, BSM (UCT, UKZN, UJ, Wits).
ISOLDE: rare isotope facility, nuclear and atomic physics (UKZN, UWC, Wits, iThemba).
Theory: particle, heavy-ion and nuclear physics (UCT, UJ, Wits).

ALICE Data Flow
Event: readout of the detectors, approximately 1 collision (but: pile-up, empty events); a data block of 1 MB (pp) to 100 MB (Pb-Pb); events are independent → embarrassingly parallel processing.
Storage: a disk buffer holds the short-term, random-access working copy; long-term storage ("tape") keeps the backup.
Reconstruction: merge the signals from the same particle; determine the particle properties (momentum, energy, species).
Simulation: event generators model the known physics to compare experiment and theory; particle transport models the detectors to correct for detector effects.
User Analysis: extraction of physics results based on reconstructed data; 100s of different analyses.
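Because each event is an independent data block, event processing is embarrassingly parallel: workers never need to communicate. A toy sketch of the pattern (the reconstruct function and event records are illustrative placeholders, not ALICE software, which distributes events across Grid jobs):

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct(event):
    """Placeholder for real reconstruction: merge the signals left by a
    particle and determine its properties (momentum, energy, species)."""
    return {"event_id": event["id"], "n_signals": len(event["signals"])}

# Toy events standing in for raw-data blocks (~1 MB pp to ~100 MB Pb-Pb).
events = [{"id": i, "signals": list(range(i % 7))} for i in range(100)]

# Events are independent, so they can be farmed out to workers with no
# coordination between them: the "embarrassingly parallel" case.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(reconstruct, events))

print(len(results))  # 100
```

In production the same independence is what lets events be scattered across thousands of Grid worker nodes rather than threads.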

Reconstruction – Bubble Chambers

Raw Data Production: 7 PB
(chart of raw data volumes at various collision energies)
Big Data!

ALICE Grid Computing
Tier-0: CERN (+ Budapest); reconstruction, simulation, analysis; 1 copy of the raw data.
Tier-1: reconstruction, simulation, analysis; 1 shared copy of the raw data.
Tier-2: simulation, analysis; no access to raw data.
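The tier roles above can be captured in a small lookup table; an illustrative sketch (the names and structure are mine, not a WLCG interface):

```python
# Tier roles as described on the slide (illustrative, not a WLCG API).
TIER_ROLES = {
    "Tier-0": {  # CERN (+ Budapest)
        "tasks": {"reconstruction", "simulation", "analysis"},
        "raw_data": "1 copy",
    },
    "Tier-1": {
        "tasks": {"reconstruction", "simulation", "analysis"},
        "raw_data": "1 shared copy",
    },
    "Tier-2": {
        "tasks": {"simulation", "analysis"},
        "raw_data": None,  # no access to raw data
    },
}

def can_run(tier, task):
    """Check whether a given kind of task may run at a given tier."""
    return task in TIER_ROLES[tier]["tasks"]

print(can_run("Tier-2", "reconstruction"))  # False: needs raw data
```

The key asymmetry is raw-data access: reconstruction only runs where a raw-data copy exists (Tier-0 and Tier-1), while simulation and analysis can run anywhere.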

ALICE Computing Resources
Tape storage (Tier-0: 22.8 PB, Tier-1: 13.1 PB); disk storage: 28.7 PB total; CPU cores; network; human resources.

ALICE GRID Sites

South African Tier-2 at CHPC
iQudu Cluster: an IBM e1350 cluster with 160 nodes (2 dual-core AMD 2.6 GHz CPUs and 16 GB RAM each), ethernet + infiniband, and 100 TB of storage (xroot). Launched in 2007: high power consumption, aging hardware. Used by ALICE since October 2012.

ALICE Computing at CHPC
Average: 348 running jobs, about 1% of all ALICE jobs.

Completed Jobs at CHPC
(chart of completed jobs per month, annotated with the start of CHPC operation and a network switch failure)

Resources Sharing
CPU delivered in 2012: South Africa 0.3%; projection for 2013: ~1%.

CPU Requirements – Run-2
(chart of projected CPU requirements for Run-2)

Disk Requirements – Run-2
(chart of projected disk requirements for Run-2: growth of ×2.3)

CHPC Upgrade
WLCG: sign MoU in (April) 2014; representation in WLCG. Replace the grid cluster (iQudu) in the first quarter of 2014: 900 TB storage, serving ALICE + ATLAS. Additional human resources. Goal: Tier-1.
Parallel session "CHPC Roadmap", Friday morning.

ALICE LS2 Upgrade
2018/19 (LHC 2nd Long Shutdown): 50 kHz Pb-Pb collisions.
ALICE hardware upgrade: Inner Tracking System (ITS), Time Projection Chamber (TPC).
Change of strategy: all data flows into the online computing farm; continuous readout of the detectors; massive online processing.

ALICE Challenges for Run-3
Data rates: reduce 1 TB/s to 30 GB/s, using data compression and partial reconstruction.
Overlapping events: process time-slices instead of single events, a major change in the data model.

Detector  Event Size (MB)  Bandwidth (GB/s)
TPC
TRD
ITS       0.8              40
Others    0.5              25
Total
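The headline numbers above fix the overall data reduction the online system must achieve; a quick check using the slide's figures:

```python
# Run-3 goal from the slide: reduce ~1 TB/s of detector readout
# to ~30 GB/s going to permanent storage.
input_rate_gbs = 1000.0  # 1 TB/s expressed in GB/s
output_rate_gbs = 30.0   # storage bandwidth target

reduction = input_rate_gbs / output_rate_gbs
print(f"overall reduction: ~{reduction:.0f}x")  # ~33x
```

A factor of ~33 is far beyond lossless compression alone, which is why the slide pairs compression with partial online reconstruction (storing compact reconstructed quantities instead of raw samples).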

Computing Working Groups
CWG1 Architecture; CWG2 Tools; CWG3 Dataflow; CWG4 Data Model; CWG5 Computing Platforms; CWG6 Calibration; CWG7 Reconstruction; CWG8 Physics Simulation; CWG9 QA, DQM; CWG10 Control, Configuration; CWG11 Software Lifecycle; CWG12 Computing Hardware.

Summary
Present: ALICE computing is part of WLCG, with tens of thousands of CPU cores and almost 30 PB of data. Big Data! South Africa: CHPC delivers about 1% of ALICE resources.
Near future: growth within the current computing model; upgrade of CHPC towards Tier-1.
Long-term future: major ALICE upgrade → extreme data rates; new computing concepts → huge R&D effort.

Backup

AliRoot

O2 Project
(organisation diagram) O2 Steering Board; Institution Boards (Computing Board, Online Institution Board); Projects (DAQ, HLT, Offline) with Project Leaders; Computing Working Groups: CWG1 Architecture, CWG2 Procedure & Tools, CWG3 DataFlow, CWG4 Data Model, CWG5 Platforms, CWG6 Calibration, CWG7 Reconstruction, CWG8 Simulation, CWG9 QA, DQM, Vi, CWG10 Control, CWG11 Sw Lifecycle, CWG12 Hardware, CWG13 Sw Framework.
50 people active in 1-3 CWGs; service tasks.

O2 Hardware System
(diagram) The detectors (TPC, TRD, ITS, TOF, EMC, PHO, Muon) and the trigger detectors (L0/L1) send data over ~2500 links in total, at 2 x 10 or 40 Gb/s, to ~250 FLPs (First Level Processors). A 10 Gb/s farm network connects the FLPs to ~1250 EPNs (Event Processing Nodes), which write through the storage network to data storage.
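The farm sizes quoted above imply simple per-node averages; a sketch using the slide's approximate figures:

```python
# Approximate counts from the O2 hardware diagram on the slide.
n_links = 2500  # detector read-out links
n_flps = 250    # First Level Processors
n_epns = 1250   # Event Processing Nodes

links_per_flp = n_links / n_flps  # average links terminating on one FLP
epns_per_flp = n_epns / n_flps    # fan-out from the FLP layer to the EPNs

print(links_per_flp)  # 10.0
print(epns_per_flp)   # 5.0
```

So each FLP aggregates on the order of ten detector links, and the EPN layer is several times larger than the FLP layer, reflecting where the heavy processing happens.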

Dataflow Model