Pere Mató, PH Department, CERN 19 September 2008.



Pere Mato, CERN/PH. "The goal is to understand in the most general terms; that is usually also the simplest." – A. Eddington. We use experiments to inquire about what "reality" does. Between "Theory & Parameters" and "Reality" there is a gap; this talk is about filling it.

"A clear statement of how the world works, formulated mathematically in terms of equations" – Particle Data Group, Barnett et al.

A raw event is just numbers:

0x01e84c10: 0x01e8 0x8848 0x01e8 0x83d8 0x6c73 0x6f72 0x7400 0x0000
0x01e84c20: 0x0000 0x0019 0x0000 0x0000 0x01e8 0x4d08 0x01e8 0x5b7c
0x01e84c30: 0x01e8 0x87e8 0x01e8 0x8458 0x7061 0x636b 0x6167 0x6500
0x01e84c40: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84c50: 0x01e8 0x8788 0x01e8 0x8498 0x7072 0x6f63 0x0000 0x0000
0x01e84c60: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84c70: 0x01e8 0x8824 0x01e8 0x84d8 0x7265 0x6765 0x7870 0x0000
0x01e84c80: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84c90: 0x01e8 0x8838 0x01e8 0x8518 0x7265 0x6773 0x7562 0x0000
0x01e84ca0: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84cb0: 0x01e8 0x8818 0x01e8 0x8558 0x7265 0x6e61 0x6d65 0x0000
0x01e84cc0: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84cd0: 0x01e8 0x8798 0x01e8 0x8598 0x7265 0x7475 0x726e 0x0000
0x01e84ce0: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84cf0: 0x01e8 0x87ec 0x01e8 0x85d8 0x7363 0x616e 0x0000 0x0000
0x01e84d00: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84d10: 0x01e8 0x87e8 0x01e8 0x8618 0x7365 0x7400 0x0000 0x0000
0x01e84d20: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84d30: 0x01e8 0x87a8 0x01e8 0x8658 0x7370 0x6c69 0x7400 0x0000
0x01e84d40: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84d50: 0x01e8 0x8854 0x01e8 0x8698 0x7374 0x7269 0x6e67 0x0000
0x01e84d60: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84d70: 0x01e8 0x875c 0x01e8 0x86d8 0x7375 0x6273 0x7400 0x0000
0x01e84d80: 0x0000 0x0019 0x0000 0x0000 0x0000 0x0000 0x01e8 0x5b7c
0x01e84d90: 0x01e8 0x87c0 0x01e8 0x8718 0x7377 0x6974 0x6368 0x0000

This is about 1/5000th of one event in the CMS detector ◦ and we get about 100 events per second.

 Digitization:
◦ "Address": which detector element took the reading
◦ "Value": what the electronics wrote down
 For each reading:
◦ Look up type and calibration info
◦ Look up / calculate the spatial position
◦ Check validity, convert to useful units/form
◦ Draw
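The decode chain above can be sketched in a few lines. Everything here (the address, the calibration table, the constants) is invented for illustration; real experiments drive this from large conditions databases.

```python
# Toy sketch of the raw-data decoding chain from the slide:
# "address" -> which detector element, "value" -> calibrated measurement.
# The lookup table and constants are hypothetical.

# Hypothetical calibration database: address -> (element type, pedestal, gain, position)
CALIB_DB = {
    0x01E8: {"type": "calo_cell", "pedestal": 50, "gain": 0.25, "pos": (1.2, 0.7, 3.5)},
}

def decode_hit(address, adc_value):
    """Turn one (address, value) pair into a calibrated hit, or None if invalid."""
    info = CALIB_DB.get(address)
    if info is None:                     # check validity
        return None
    # convert the raw ADC count to useful units (GeV, in this toy)
    energy_gev = (adc_value - info["pedestal"]) * info["gain"]
    return {"type": info["type"], "pos": info["pos"], "energy": energy_gev}

hit = decode_hit(0x01E8, 130)
print(hit["energy"])   # (130 - 50) * 0.25 = 20.0
```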


The big picture:
◦ Theory & Parameters: a small number of general equations, with specific input parameters (perhaps poorly known)
◦ Reality: what we probe, through very strong selection
◦ Observables: specific lifetimes, probabilities, masses, branching ratios, interactions, etc.
◦ Events: a unique happening: Run 21007, event 3916, which contains a H → xx decay
◦ Raw Data: the imperfect measurement of a (set of) interactions in the detector
Phenomenology connects theory to observables; reconstruction turns raw data into events; analysis turns events into observables.

Data Acquisition in the LHC Detectors, 20/07/08
 Cross sections of physics processes vary over many orders of magnitude:
◦ Inelastic: 10^9 Hz
◦ W → ℓν: 10^2 Hz
◦ tt~ production: 10 Hz
◦ Higgs (100 GeV/c^2): 0.1 Hz
◦ Higgs (600 GeV/c^2): 10^-2 Hz
 QCD background:
◦ Jet ET ~250 GeV: rate = 1 kHz
◦ Jet fluctuations → electron background
◦ Decays of K, π, b → muon background
 Selection needed: 1:10^10–10^11
◦ Before branching fractions...
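The quoted rates follow from rate = cross section × luminosity. A quick sanity check, assuming the LHC design luminosity of 10^34 cm^-2 s^-1 and the rough cross sections behind the slide's numbers:

```python
# Event rate = cross section x instantaneous luminosity.
# Assumes LHC design luminosity L = 1e34 cm^-2 s^-1; the cross sections
# are order-of-magnitude values matching the rates on the slide.

L = 1.0e34               # instantaneous luminosity, cm^-2 s^-1
MB = 1.0e-27             # 1 millibarn in cm^2

def rate_hz(cross_section_cm2, lumi=L):
    """Interaction rate in Hz for a given cross section."""
    return cross_section_cm2 * lumi

inelastic = rate_hz(100 * MB)          # ~100 mb inelastic pp cross section
higgs = 0.1                            # Hz, the slide's 100 GeV/c^2 Higgs rate
print(f"inelastic rate: {inelastic:.0e} Hz")            # ~1e9 Hz
print(f"needed selection ~ 1:{inelastic / higgs:.0e}")  # ~1:1e10
```

This reproduces the slide's headline numbers: an inelastic rate near 10^9 Hz, and a required online selection of roughly one event in 10^10.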

Collision → Detectors → Trigger → Event fragments → Full event → Storage → Offline analysis

 Task: inspect detector information and provide a first decision on whether to keep the event or throw it out.
 The trigger is a function T(event data, detector conditions, parameters) whose value is ACCEPTED or REJECTED.
 Detector data are not (all) promptly available, and the selection function is highly complex, so T(...) is evaluated by successive approximations: TRIGGER LEVELS.

 Level-1
◦ Hardwired processors (ASIC, FPGA, ...)
◦ Pipelined, massively parallel
◦ Partial information; quick and simple event characteristics (pT, total energy, etc.)
◦ 3-4 µs maximum latency
 Level-2 (optional)
◦ Specialized processors using partial data
 High Level Trigger
◦ Software running in processor farms
◦ Complex algorithms using complete event information
◦ Latency of fractions of a second
◦ Output rate adjusted to what can be afforded
Typical rejection factors per level: ~1:10^4, ~1:10^1, ~1:10^2.
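The successive-approximation structure can be sketched as a chain of increasingly expensive predicates: each level sees more of the event and runs a costlier selection on the survivors of the previous one. The event fields and thresholds below are invented; only the cascade structure reflects the slide.

```python
# T(event) evaluated by successive approximations: a cheap coarse cut
# first, then costlier cuts on the survivors. All thresholds are toy values.

def level1(ev):            # coarse, fast: e.g. total transverse energy
    return ev.get("sum_et", 0.0) > 50.0

def level2(ev):            # partial data from regions of interest
    return ev.get("roi_pt", 0.0) > 20.0

def hlt(ev):               # full event information, complex algorithms
    return ev.get("reco_quality", 0.0) > 0.9

def trigger(ev):
    """Return True (ACCEPTED) only if every level accepts the event."""
    for level in (level1, level2, hlt):
        if not level(ev):
            return False   # REJECTED as early (and cheaply) as possible
    return True            # ACCEPTED

good = {"sum_et": 120.0, "roi_pt": 35.0, "reco_quality": 0.95}
print(trigger(good))                 # True
print(trigger({"sum_et": 10.0}))     # False: rejected already at Level-1
```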

"Traditional" architecture: 3 physical levels. Detectors → front-end pipelines (Lvl-1) → readout buffers (Lvl-2) → switching network → processor farms (HLT).

 Physics facts:
◦ pp collisions produce mainly hadrons with pT ~ 1 GeV
◦ Interesting physics has particles (leptons and hadrons) with large transverse momenta:
 W → eν: M(W) = 80 GeV/c^2; pT(e) ~ GeV
 H(120 GeV) → γγ: pT(γ) ~ GeV
 Basic requirements:
◦ Impose high pT thresholds on particles
 This implies distinguishing particle types; possible for electrons, muons and "jets", but beyond that, complex algorithms are needed.
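A pT threshold is simple arithmetic on the momentum components. A toy version, where the 25 GeV threshold is an arbitrary illustrative choice:

```python
import math

# Transverse momentum from the momentum components: triggers cut on pT
# because interesting decay products (W -> e nu, H -> gamma gamma) carry
# large pT, while typical pp collision products have pT ~ 1 GeV.

def pt(px, py):
    """Transverse momentum (GeV) from px, py (GeV)."""
    return math.hypot(px, py)

def passes_threshold(px, py, threshold_gev=25.0):   # illustrative threshold
    return pt(px, py) > threshold_gev

print(pt(30.0, 40.0))                 # 50.0
print(passes_threshold(30.0, 40.0))   # True
print(passes_threshold(0.7, 0.7))     # False: soft hadron, pT ~ 1 GeV
```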

Trigger/DAQ parameters per experiment. Columns: number of trigger levels; Level-1 trigger (HW/SW) rate (kHz); event readout size (MB) and bandwidth (GB/s); filter output in MB/s (events/s):
◦ ATLAS: 1/ (200)
◦ CMS: 1/ (100)
◦ LHCb: 1/ (2000)
◦ ALICE: 3/1, Pb-Pb (14), p-p (80)
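The table's columns are linked by simple arithmetic: the filter output bandwidth is the accepted event rate times the event size. A sketch with illustrative, roughly ATLAS-like numbers (not values taken from the table):

```python
# Bandwidth sanity check: filter output bandwidth (MB/s) equals the
# accepted event rate (events/s) times the event size (MB).
# The numbers below are illustrative, roughly ATLAS-like assumptions.

def output_bandwidth_mb_s(accept_rate_hz, event_size_mb):
    return accept_rate_hz * event_size_mb

print(output_bandwidth_mb_s(200, 1.5))   # 200 events/s x 1.5 MB = 300.0 MB/s
```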

 Readout and Level-1 Trigger ◦ Custom electronics (ASIC, FPGA), radiation hard/tolerant ◦ Optical detector links (1-2 Gb/s)  Event Building ◦ Gigabit Ethernet links and switches  High-Level Trigger ◦ Rack mounted PCs (~2500 nodes/experiment)  Online computing facilities ◦ Large power/cooling requirements ◦ Local data storage O(100 TB) 20/07/08Data Acquisition in the LHC Detectors15


Offline data flow:
◦ detector → raw data
◦ event filter (selection & reconstruction) → processed data
◦ event reconstruction → Event Summary Data (ESD)
◦ batch physics analysis → Analysis Object Data (AOD), extracted by physics topic
◦ individual physics analysis runs on the AOD
◦ event simulation feeds the same chain in place of the detector
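The successive reduction raw → ESD → AOD can be sketched as follows; the record contents are invented, and only the tiered structure reflects the slide.

```python
# Tiered data flow from the slide: raw data -> reconstruction -> ESD
# -> topic-specific AOD. Field names and reduction factors are toy values.

def reconstruct(raw_event):
    """Raw detector readings -> Event Summary Data (ESD): tracks, clusters."""
    return {"tracks": raw_event["hits"] // 10,
            "clusters": raw_event["cells"] // 100}

def make_aod(esd, topic="higgs"):
    """ESD -> Analysis Object Data (AOD): keep only what one topic needs."""
    return {"topic": topic, "n_tracks": esd["tracks"]}

raw = {"hits": 5000, "cells": 20000}   # a toy raw event
aod = make_aod(reconstruct(raw))
print(aod)   # {'topic': 'higgs', 'n_tracks': 500}
```

Each step keeps less information, which is what makes individual analysis on the AOD fast and cheap compared with reprocessing raw data.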

 The scientific software needed to process this huge amount of data from the LHC detectors is developed by the LHC collaborations.
◦ It must cope with unprecedented conditions and challenges (trigger rate, data volumes, etc.)
◦ Each collaboration has written millions of lines of code
 Modern technologies and methods:
◦ Object-oriented programming languages and frameworks
◦ Re-use of a number of generic and domain-specific 'open-source' packages
 The organization of this large software production activity is by itself a huge challenge:
◦ Large number of developers distributed worldwide
◦ Integration and validation require large efforts

 Data processing applications are based on frameworks.
◦ Frameworks ensure coherency and integration.
 Every experiment has a framework for basic services and various specialized frameworks:
◦ Event model, detector description, visualization, persistency, interactivity, simulation, calibrations, etc.
 Core libraries and services provide basic functionality. Examples:
◦ Geant4 – simulation of the passage of particles through matter
◦ ROOT – data storage and analysis framework
 Extensive use of generic software packages:
◦ GUI, graphics, utilities, math, databases, etc.
(Diagram, "Software Structure": Applications sit on the Experiment Framework — event model, detector description, calibration — which builds on Core Libraries, Simulation, Data Management and Distributed Analysis components, and on non-HEP-specific software packages.)
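The framework-plus-algorithms pattern can be illustrated with a minimal, Gaudi-inspired sketch (all class names are invented): the framework owns the event loop and common services, and physics code plugs in through a uniform interface.

```python
# Minimal sketch of the "framework + algorithms" pattern used by the
# LHC experiments' frameworks. Names and interfaces here are hypothetical.

class Algorithm:
    """Uniform interface the framework calls for every event."""
    def execute(self, event):
        raise NotImplementedError

class HitCounter(Algorithm):
    """A toy 'physics' algorithm: count the hits in the event."""
    def execute(self, event):
        event["n_hits"] = len(event["hits"])

class Framework:
    def __init__(self):
        self.algorithms = []
    def add(self, alg):
        self.algorithms.append(alg)
    def run(self, events):
        # the event loop lives in the framework, not in the physics code
        for ev in events:
            for alg in self.algorithms:
                alg.execute(ev)
        return events

fw = Framework()
fw.add(HitCounter())
out = fw.run([{"hits": [1, 2, 3]}])
print(out[0]["n_hits"])   # 3
```

The design choice this illustrates: physicists write only `Algorithm` subclasses, while the framework guarantees coherent event handling, ordering, and access to shared services.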

 Foundation Libraries ◦ Basic types ◦ Utility libraries ◦ System isolation libraries
 Mathematical Libraries ◦ Special functions ◦ Minimization, random numbers
 Data Organization ◦ Event data ◦ Event metadata (event collections) ◦ Detector conditions data
 Data Management Tools ◦ Object persistency ◦ Data distribution and replication
 Simulation Toolkits ◦ Event generators ◦ Detector simulation
 Statistical Analysis Tools ◦ Histograms, N-tuples ◦ Fitting
 Interactivity and User Interfaces ◦ GUI ◦ Scripting ◦ Interactive analysis
 Data Visualization and Graphics ◦ Event and geometry displays
 Distributed Applications ◦ Parallel processing ◦ Grid computing

Simulation traverses the same picture (Theory & Parameters, Reality, Observables, Events, Raw Data) in the forward direction:
◦ Calculate expected branching ratios
◦ Randomly pick decay paths, lifetimes, etc. for a number of events
◦ Calculate what the imperfect detector would have seen for those events
◦ Treat that as real data and reconstruct it
◦ Compare to the original to understand the efficiency
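The last two steps (treat simulated data as real, compare to the generated truth) are how efficiencies are measured. A toy version, with an invented 90% single-particle efficiency:

```python
import random

# Toy Monte Carlo chain: generate events, pass them through an imperfect
# "detector", and compare with the generated truth to measure efficiency.
# The 90% efficiency and the event count are invented toy values.

random.seed(42)                        # reproducible toy

TRUE_EFFICIENCY = 0.9

def detector_sees(_event):
    """Imperfect detector: records an event with 90% probability."""
    return random.random() < TRUE_EFFICIENCY

n_gen = 10_000                                          # generated (truth) events
n_reco = sum(detector_sees(i) for i in range(n_gen))    # reconstructed events
print(f"measured efficiency: {n_reco / n_gen:.3f}")     # close to the true 0.9
```

In a real experiment the "detector" step is a full Geant4 simulation, but the logic of the comparison is exactly this.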

Pere Mato, CERN/PH23  Many MC generators and tools are available to the experiments provided by a solid community ◦ Each experiment chooses the tools more adequate for their physics  Example: ATLAS alone uses currently ◦ Generators  AcerMC: Zbb~, tt~, single top, tt~bb~, Wbb~  Alpgen (+ MLM matching): W+jets, Z+jets, QCD multijets  Charbydis: black holes  HERWIG: QCD multijets, Drell-Yan, SUSY...  Hijing: Heavy Ions, Beam-gas..  tt~, Drell-Yan, boson pair production  Pythia: QCD multijets, B-physics, Higgs production... ◦ Decay packages  TAUOLA: Interfaced to work with Pythia, Herwig and Sherpa,  PHOTOS: Interfaced to work with Pythia, Herwig and Sherpa,  EvtGen: Used in B-physics channels.

Pere Mato, CERN/PH24  Geant4 has become an established tool, in production for the majority of LHC experiments during the past two years, and in use in many other HEP experiments and for applications in medical, space and other fields  On going work in the physics validation  Good example of common software LHCb : ~ 18 million volumes ALICE : ~3 million volumes

Pere Mato, CERN/PH25  1980s: mainframes, batch jobs, histograms back. Painful.  Late 1980s, early 1990s: PAW arrives. ◦ NTUPLEs bring physics to the masses ◦ Workstations with “large” disks (holding data locally) arrive; looping over data, remaking plots becomes easy  Firmly in the 1990s: laptops arrive; ◦ Physics-in-flight; interactive physics in fact.  Late 1990s: ROOT arrives ◦ All you could do before and more. In C++ this time. ◦ FORTRAN is still around. The “ROOT-TUPLE” is born ◦ Side promise: reconstruction and analysis form a continuum  2000s: two categories of analysis physicists: those who can only work off the ROOT-tuple and those who can create/modify it  Mid-2000s: WiFi arrives; Physics-in-meeting

Pere Mato, CERN/PH26  Experiments have developed tools to facilitate the usage of the Grid  Example: GANGA ◦ Help configuring and submitting analysis jobs (Job Wizard) ◦ Help users to keep track of what they have done ◦ Hide completely all technicalities ◦ Provide a palette of possible choices and specialized plug- ins:  pre-defined application configurations  batch/grid systems, etc. ◦ Single desktop for a variety of tasks ◦ Friendly user interface is essential

 The online multi-level trigger is essential to select interesting collisions (1 in ).
◦ Level-1: custom hardware, huge fan-in/fan-out problem, fast algorithms on coarse-grained, low-resolution data
◦ HLT: software algorithms on a large farm of PCs
◦ Large DAQ systems built with commercial components
 The experiments will produce about 7 PB/year of raw data.
 Reconstruction and analysis get us from raw data to physics results.
◦ Huge programs (10^7 lines of code) developed by hundreds of physicists
◦ Unprecedented need for computing resources