
LHC computing HEP 101 Lecture #8 ayana arce

Outline
Major computing systems for LHC experiments:
– (ATLAS) Data Reduction
– (ATLAS) Data Production
– (ATLAS) Data Analysis
End-user tools:
– Exercise: plotting and fitting data with ROOT
– Homework: writing a toy Monte Carlo

DATA REDUCTION managing the data volume

overview: the data reduction chain
– Hardware Trigger (prefilter)
– Event Filter (software event selection)
– data reconstruction and distribution

The TDAQ system
Trigger:
– (almost) real-time filtering of collision events
– Events read every ~25 ns: how long does the trigger take to decide?
DAQ:
– Sends event data through the trigger and readout systems
– Merges trigger and detector conditions data with event data

ATLAS trigger system
L1: hardware-based, 256 items; selects 1/10,000 in 2.5 µs (works on local event fragments)
L2: selects 1/15 in 40 ms (flexible L2/L3 processors, ~10 Gb links)
L3: reads out the global detector data (full events); selects 1/15 in 4 seconds (~1700 nodes, 8/12 cores and 16/24 GB each, some dedicated to L3; 10 Gb links)
Storage: similar triggers are grouped into data streams; trigger data are used in analysis to account for bias
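The rejection factors quoted above can be turned into an output-rate estimate with a one-line calculation. A minimal sketch, assuming a 40 MHz input (bunch-crossing) rate, consistent with the ~25 ns spacing on the previous slide:

```python
# Rough trigger output-rate estimate from the quoted rejection factors.
# Assumes a 40 MHz bunch-crossing rate (one crossing per ~25 ns).
input_rate_hz = 40e6

l1_rate = input_rate_hz / 10_000   # L1 keeps 1 in 10,000
l2_rate = l1_rate / 15             # L2 keeps 1 in 15
l3_rate = l2_rate / 15             # L3 keeps 1 in 15

print(f"L1: {l1_rate:.0f} Hz, L2: {l2_rate:.0f} Hz, to storage: {l3_rate:.1f} Hz")
```

So the chain reduces 40 million crossings per second to a few tens of full events per second written to storage.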

Example: electron trigger: is it an electron?
– clustering: are any EM calorimeter regions hot?
– cluster selection, tracking, cluster/track matching: is there a cluster of hot cells with straight tracks nearby?
– electron selection

DATA PRODUCTION managing the data volume

Global data processing and storage
LHC data output estimate: 15 PB/year (and we prefer multiple copies)
– Stored and processed on the WLCG, shared by all the CERN experiments
– Your “local” Tier-1: BNL
– Your local Tier-3: in your backpack!
Every stored physics event is modeled by many simulated events
– thus most resources are spent on Monte Carlo simulation
note: ATLAS computing systems alone must handle MILLIONS of production/analysis jobs daily

ATLAS Tier computing: roles
Tier 0: store RAW, calibrate, reconstruct (6k cores)
Tier 1: backup RAW, reprocess (re-reconstruct)
Tier 2: analyze, create MC (38 T2 centers, 120k cores total; CernVM environment)
physicists

Production: data
ATLAS trigger → bytestream → convert → RDO (raw)
RDO → RECO (pattern recognition) → ESD, AOD, TAG
AOD → MERGE & derive (sorting) → D3PD

Production: Monte Carlo
MONTE CARLO PRODUCTION CHAIN → RDO (raw)
RDO → RECO → ESD, AOD, TAG
AOD → MERGE & derive → D3PD

What is Monte Carlo, really?
HEP predictions require a lot of convolution integrals
– one reason: QM!
Monte Carlo calculation of π: pick random x, random y; if y² < 1 - x²: increment area

What is Monte Carlo?
HEP predictions require a lot of convolution integrals
– one reason: QM!
The Monte Carlo Method:
– use random numbers as an integration tool
Monte Carlo calculation of π: pick random x, random y; if y² < 1 - x²: increment area
this is probably the simplest way to use a computer for a calculation… but it works!
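The π recipe on the slide can be written out in a few lines. A minimal plain-Python sketch (function name and trial count are illustrative):

```python
import random

def estimate_pi(n_trials, seed=42):
    """Estimate pi by throwing random points into the unit square and
    counting how many land under the quarter-circle y^2 < 1 - x^2."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_trials):
        x, y = rng.random(), rng.random()   # uniform in [0, 1)
        if y * y < 1.0 - x * x:
            inside += 1
    # quarter-circle area is pi/4; the square has area 1
    return 4.0 * inside / n_trials

print(estimate_pi(100_000))
```

The statistical error shrinks like 1/√N, which is exactly why the method scales to the high-dimensional integrals of HEP.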

What is Monte Carlo?
The Monte Carlo Method:
– use random numbers as an integration tool
Very intuitive picture of convolution integrals:
– a series of choices from probability distributions
(the Z picks its mass and decay angles → electron E T )

What is Monte Carlo?
The Monte Carlo Method:
– use random numbers as an integration tool
Very intuitive picture of convolution integrals:
– a series of choices from probability distributions
(the Z picks its mass and decay angles → electron E T → calorimeter (mis)measurement → observed electron E T )
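The "series of choices" above can be sketched directly as code: each random draw is one factor of the convolution. This is a toy illustration only; the Gaussian stand-in for the Z line shape, the 10% resolution, and all the numbers are assumptions for the sketch, not ATLAS values:

```python
import math
import random

rng = random.Random(1)

def toy_observed_electron_et():
    """One toy 'event': a chain of random choices, i.e. a convolution
    integral evaluated by sampling. All numbers are illustrative."""
    # 1) the Z picks a mass (Gaussian here; a crude stand-in for a Breit-Wigner)
    m = rng.gauss(91.2, 2.5)
    # 2) the Z picks a decay angle; electron ET = (m/2) * sin(theta)
    cos_theta = rng.uniform(-1.0, 1.0)
    et_true = 0.5 * m * math.sqrt(1.0 - cos_theta ** 2)
    # 3) the calorimeter (mis)measures ET with ~10% Gaussian resolution
    return rng.gauss(et_true, 0.10 * et_true)

sample = [toy_observed_electron_et() for _ in range(10_000)]
print(sum(sample) / len(sample))
```

Histogramming `sample` reproduces the smeared electron-E T distribution pictured on the slide, with no convolution integral ever written down explicitly.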

Meet your (3-part) Monte Carlo (slides: Sjöstrand)

Meet your MC: PYTHIA, HERWIG, MadGraph, MCFM, BaurMC, POWHEG, &c. …

Meet your MC: PYTHIA, HERWIG/JIMMY, Sherpa…

What’s the third part?
Detector simulation: up to 5 minutes for a high-mass event (lots of particles, each individually tracked through hundreds of detector elements)
why is this essential?

DATA ANALYSIS measurements and discoveries!

ATLAS computing for users
Main programming languages:
– FORTRAN (some generators)
– C++ (main reconstruction algorithms, analysis)
– python (steering, analysis)
Interactive interfaces
Main interface: athena
– reads all data formats
– C++; steered by python
– runs all simulation and reconstruction
– can run your analysis too… but the executable is typically 4 GB
Light interface: ROOT

Data representation
always organized by event
global quantities:
– metadata
– missing energy…
physics object lists:
– muons
– jets
– tracks
– “truth” particles…
object properties:
– hits on tracks
– jet constituents
event “n-tuple” → “tree”

Data representation
Event number | nTracks | track pT | track eta | track phi | track layers…
… | … | … | … | … | …

User’s interface to nature: histograms
``Hello World’’ for HEP computing: making a histogram
TH1F(“name”, “title; x title; y title”, nBins, firstBinValue, lastBinValue)
TH1F::Fill(value, weight)
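The binning and Fill semantics behind TH1F can be sketched in a few lines of plain Python. This is a hypothetical minimal class to show the idea, not ROOT's API (ROOT also tracks errors, titles, axis labels, etc.):

```python
class ToyHist:
    """Minimal fixed-binning histogram mimicking the core of
    TH1F(name, title, nBins, lo, hi) and Fill(value, weight)."""

    def __init__(self, n_bins, lo, hi):
        self.n_bins, self.lo, self.hi = n_bins, lo, hi
        self.counts = [0.0] * n_bins
        self.underflow = 0.0
        self.overflow = 0.0

    def fill(self, value, weight=1.0):
        """Add `weight` to the bin containing `value` (like TH1F::Fill)."""
        if value < self.lo:
            self.underflow += weight
        elif value >= self.hi:
            self.overflow += weight
        else:
            i = int((value - self.lo) / (self.hi - self.lo) * self.n_bins)
            self.counts[i] += weight

h = ToyHist(10, 0.0, 100.0)
for v in (5, 15, 15, 250, -1):
    h.fill(v)
print(h.counts, h.underflow, h.overflow)
```

Note the under/overflow bins: like ROOT, values outside [lo, hi) are counted but kept out of the visible bins.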

EXAMPLE! note: in code examples, your input is given in green

Let’s measure the kaon lifetime (again)! open the ROOT file: –you% root Hep101Data_2013.root How to see everything in the file: –root [1] new TBrowser(); the file contains one histogram (taken from your homework)

Some ROOT features:
root [0] double x(3.0),y(4.0); sqrt(x*x+y*y)
(const double) 5.000000e+00
root [1] TLorentzVector pion(1500,0,0, );
root [2] printf("The mass is %3.4g\n", pion.M());
The mass is
root [3] TMath::C(
Double_t C() // m s^-1
root [4] TMath::C()
(Double_t) 2.997925e+08

Mathematical functions in ROOT
Simple: FitPanel (under Tools)
Also easy:
root [9] KaonDecays->Fit("expo")
More explicit:
root [10] TF1 f("f","[0]*exp(-x/(100*[1]*TMath::C()))",0,60); // free parameters specified in brackets
root [11] KaonDecays->Fit(&f);
Complete program (from Dave)
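What ROOT's "expo" fit does — find the slope of an exponential — can be sketched without ROOT: histogram the decay times and run a straight-line least-squares fit through log(counts). This toy (lifetime value, bin choices, and the unweighted fit are all assumptions for illustration; ROOT's fit handles the Poisson errors properly) generates events with a known lifetime and recovers it:

```python
import math
import random

# Generate toy decay times with a known "lifetime" tau_true.
rng = random.Random(7)
tau_true = 12.0                                   # arbitrary toy lifetime
times = [rng.expovariate(1.0 / tau_true) for _ in range(50_000)]

# Histogram the decay times.
n_bins, lo, hi = 30, 0.0, 60.0
width = (hi - lo) / n_bins
counts = [0] * n_bins
for t in times:
    if lo <= t < hi:
        counts[int((t - lo) / width)] += 1

# Unweighted least-squares straight line through (bin center, log(count)):
# log N(t) = const - t/tau, so the slope gives -1/tau.
pts = [(lo + (i + 0.5) * width, math.log(c)) for i, c in enumerate(counts) if c > 0]
n = len(pts)
sx = sum(x for x, _ in pts)
sy = sum(y for _, y in pts)
sxx = sum(x * x for x, _ in pts)
sxy = sum(x * y for x, y in pts)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
tau_fit = -1.0 / slope
print(f"fitted lifetime: {tau_fit:.2f} (true: {tau_true})")
```

The fitted value lands within a few percent of the input, which is the same logic the ROOT "expo" fit applies to the KaonDecays histogram.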

Next steps
You can download ROOT: root.cern.ch
Homework: write your own Monte Carlo generator to solve Problem 2 from lecture 5:
a neutral pion beam with energy E decays to two photons. What is the photon energy distribution in the laboratory frame?
Feel free to contact me with solutions and questions.

homework hint: random numbers
Use the ROOT class TRandom3 for good performance. Example:
root [1] TRandom3 r;
root [2] float random1 = r.Gaus(0,35); // generate a gaussian-distributed random number with mean 0 and width 35
root [3] float random2 = r.Uniform(0,2*TMath::Pi()); // generate a scalar meson decay angle
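If you prefer to prototype the homework outside ROOT first, Python's `random` module offers the same two draws (both TRandom3 and Python's generator are Mersenne Twisters). A minimal stand-in for the session above, with the same illustrative parameters:

```python
import math
import random

# Plain-Python analogues of the TRandom3 draws above (illustrative only).
r = random.Random(2013)                 # seeded for reproducibility

random1 = r.gauss(0, 35)                # gaussian, mean 0, width 35
random2 = r.uniform(0, 2 * math.pi)     # flat azimuthal decay angle in [0, 2*pi)

print(random1, random2)
```

The decay-angle draw is the key ingredient for the homework: an isotropic decay in the pion rest frame means flat distributions in the azimuthal angle and in cos θ.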

Postscript: if you don’t like C++
>>> import ROOT # from ROOT import * also works
>>> pion = ROOT.TLorentzVector(1500,0,0, )
>>> print "The mass is", pion.M(), "MeV"
The mass is MeV