ATLAS Data Challenges : the physics point of view. UCL, September 5th 2001. Fabiola Gianotti (CERN)


Three data challenges are foreseen:
-- DC0 : end 2001
-- DC1 : first half 2002
-- DC2 : first half 2003
followed by the Computing TDR.
Goals : validate our computing model and our software. Important physics content : provide data samples for physics studies and, hopefully, many physics results.
How ? Start with data which look like real data: this needs MC generators, G3/G4 simulation, the event model, detailed detector response (e.g. noise, cross-talk, etc.), pile-up. Run the filtering/trigger and reconstruction chain. Store the output data in the database. Run the analysis. Produce physics results.

DC0 : November - December 2001
In principle this should be a test of the WHOLE software chain : a kind of “rehearsal” for DC1 (check that everything works for DC1).
The issue is therefore not massive production of huge data samples, but a few 100k events able to test the whole software chain.
Chosen physics sample : a few 100k Z+jet events, with Z → ℓℓ.
-- allows tests of ALL sub-detectors (including b-tagging, since 6% of the jets are b-jets)
-- the idea is to produce several samples with the 3 general-purpose generators (PYTHIA, ISAJET, HERWIG)
If you want to participate in DC1, you are (strongly) encouraged to participate in DC0 as well.

DC1 : February - July 2002
Scope : stress-test the system with large-scale production, reconstruction and analysis. Several samples of up to 10^7 events, i.e. about 10% of the data collected at the LHC in one year.
Crucial issues :
-- simulation will be done mainly with G3, but it is important to perform smaller-scale production with G4
-- comparison G3/G4 (with the same geometry, to be meaningful …)
-- learn about the event model and the detector description
-- I/O performances : N events with different technologies
-- pile-up treatment (a minimal sketch of what this involves follows below)
-- understand bottlenecks
-- understand the distributed computing model / GRID (not discussed here)
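
Pile-up treatment is, schematically, the overlay of minimum-bias events on top of each signal event, with the multiplicity drawn from a Poisson distribution whose mean is set by the luminosity. A minimal, purely illustrative sketch (toy event type, hypothetical mean; a real treatment must also respect the detector's time structure, i.e. out-of-time pile-up):

```cpp
// Toy in-time pile-up overlay (illustrative only, not ATLAS code).
// For each signal event, draw N ~ Poisson(mu) and overlay N minimum-bias events.
#include <iostream>
#include <random>
#include <vector>

struct Event { std::vector<double> deposits; };  // toy event: a list of energy deposits

// Merge the deposits of an overlaid minimum-bias event into the signal event.
void overlay(Event& signal, const Event& minBias) {
    signal.deposits.insert(signal.deposits.end(),
                           minBias.deposits.begin(), minBias.deposits.end());
}

int main() {
    const double mu = 2.3;            // hypothetical mean number of pile-up collisions
    std::mt19937 rng(42);
    std::poisson_distribution<int> nPileUp(mu);

    Event signal{{25.0, 13.2}};       // toy signal event
    Event minBias{{0.7, 1.1, 0.4}};   // toy min-bias event (in reality sampled from a large pool)

    const int n = nPileUp(rng);
    for (int i = 0; i < n; ++i) overlay(signal, minBias);
    std::cout << "overlaid " << n << " min-bias events, "
              << signal.deposits.size() << " deposits in total\n";
}
```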

DC1 : Physics samples
10^7 jets for e/jet separation studies in view of the Trigger/DAQ TDR (due end of 2002) : about 10 times more statistics than the “old jet production”. Study the performance of ATHENA and of the HLT algorithms. Useful also for other physics studies (e.g. optimisation of the jet energy reconstruction algorithm).
Any other CPU-consuming physics sample considered useful for physics studies, mainly SM “background processes”; examples :
-- inclusive muon sample (for B-physics and muon performance studies), Zbb and Wbb samples (backgrounds to many searches), WW/ZZ samples
-- Z → ττ for tau-lifetime studies
-- several samples with different generators, to understand the physics of the various MCs
Physics groups and Combined Performance groups are asked to prepare lists of wishes → first discussions at Physics Coordination in Lund and at the October ATLAS week. Everybody is encouraged to make suggestions.

DC2 : January - September 2003
Scope/precise goals : depend on the outcome of DC0/1.
Present goals : ~10^8 events (roughly the data collected in 1 LHC year)
-- Geant4 should play a major role
-- full test of the calibration/alignment procedures and of the conditions database
-- question : do we want to add part or all of DAQ, LVL1, LVL2, Event Filter ?
Physics content :
-- demonstrate the capability of extracting and interpreting a signal from New Physics
-- generate various SM samples and “hide” in each one a different New Physics process (e.g. SUSY for one mSUGRA point, excited leptons, etc.); a sketch of such blind mixing follows below
-- people will be asked to understand the nature and all possible features of the signal (without knowing a priori what it is)
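
The “hiding” amounts to blind mixing: interleave the New Physics events at random positions inside the SM sample and strip any provenance before the data are handed out. A minimal sketch of the idea (toy event type, all names hypothetical):

```cpp
// Toy blind mixing (illustrative only): signal events are appended to the SM
// sample, the combined sample is shuffled, and no truth labels are kept.
#include <algorithm>
#include <iostream>
#include <random>
#include <string>
#include <vector>

struct Event { std::string payload; };  // raw event data only: no provenance stored

std::vector<Event> blindMix(std::vector<Event> sm, const std::vector<Event>& signal,
                            unsigned seed) {
    sm.insert(sm.end(), signal.begin(), signal.end());
    std::shuffle(sm.begin(), sm.end(), std::mt19937(seed));  // random order hides the insertions
    return sm;
}

int main() {
    std::vector<Event> sm{{"sm1"}, {"sm2"}, {"sm3"}};
    std::vector<Event> np{{"np1"}};  // e.g. one mSUGRA-point SUSY event
    std::vector<Event> challenge = blindMix(sm, np, 2003);
    std::cout << "challenge sample size: " << challenge.size() << "\n";
}
```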

DC production : CPU and data size

DC     Number of events    CPU time (SI95 hours)    Total size
DC0    ~10^5                                        ~0.2 TB
DC1    ~10^7                                        ~20 TB
DC2    ~10^8                                        ~200 TB
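
The rows are mutually consistent: each Data Challenge implies roughly the same size per event, about 2 MB. A trivial check of the arithmetic:

```cpp
// Consistency check of the table above: implied size per event.
#include <cstdio>

int main() {
    const double TB = 1e12;                            // bytes
    const double events[] = {1e5, 1e7, 1e8};           // DC0, DC1, DC2
    const double totalSize[] = {0.2 * TB, 20 * TB, 200 * TB};
    for (int dc = 0; dc < 3; ++dc)
        std::printf("DC%d : %.1f MB/event\n", dc, totalSize[dc] / events[dc] / 1e6);
    // prints ~2.0 MB/event for all three Data Challenges
}
```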

“Physics readiness document” (a kind of Physics TDR prime …), due at LHC t0 - 1 year. Content (examples) :
-- work done with MC generators, the ATLAS MC library, status/strategy for MC production
-- strategy for using different levels of simulation (full, parametrisations, fast) for different processes
-- comparisons G4/test-beam data and FLUKA/test-beam data → systematics from full simulation
-- main figures of the Physics TDR redone with the new/final software
-- specialised packages needed for various physics studies (e.g. MSSM scan packages for Higgs and SUSY with up-to-date theoretical calculations, etc.)
-- etc.

Status of the non-core software (my view, with emphasis on the “physics part”)
Main generators (PYTHIA, ISAJET, HERWIG) interfaced to HepMC (HERWIG being finalised …); a sketch of a HepMC client follows below. Next : specialised generators (e.g. VECBOS, QQ).
Simulation :
-- G4 : physics validation not completed (a lot of work done on EM physics, hadronic physics being tested now); full ATLAS geometry not yet in
-- DC0, DC1 : use G3, plus smaller/restricted productions with G4 (e.g. restricted to some detector parts)
-- FLUKA : I am 100% sure we need it. I initiated a pilot project with Tilecal : G4 test-beam geometry input to FLUKA (first results in Lund). Then extend to other sub-detectors.
Intermediate simulation (e.g. shower/track parametrisation) : I am 100% sure we need it. Tried to find people over the last two years → failed. Recently a couple of groups have shown some interest.
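
For orientation, “interfaced to HepMC” means that downstream code sees a generator-independent event record. A minimal sketch of such a HepMC client, written against the later HepMC2 ASCII I/O as an assumption (the 2001 interface differed in detail) and with a hypothetical input file name:

```cpp
// Minimal HepMC client (HepMC2 API assumed): any generator writing HepMC
// events can feed this code unchanged.
#include "HepMC/GenEvent.h"
#include "HepMC/IO_GenEvent.h"
#include <iostream>

int main() {
    HepMC::IO_GenEvent in("events.hepmc", std::ios::in);  // hypothetical file name
    while (HepMC::GenEvent* evt = in.read_next_event()) {
        int n = 0;
        for (HepMC::GenEvent::particle_const_iterator p = evt->particles_begin();
             p != evt->particles_end(); ++p) {
            // status 1 = final-state particle; momentum().perp() = pT in GeV
            if ((*p)->status() == 1 && (*p)->momentum().perp() > 17.0) ++n;
        }
        std::cout << "final-state particles with pT > 17 GeV : " << n << "\n";
        delete evt;
    }
}
```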

ATLFAST OO (UK product) :
-- runs in ATHENA (see the algorithm skeleton below)
-- reads HepMC from Objectivity, writes output into Objectivity (and ntuples)
-- first validation made; further results in Lund (from users who are not developers)
-- next steps : improve functionality (beyond the Fortran ATLFAST), e.g. shower shapes ? trigger simulation ? parametrisation for B-physics ?
C++/OO reconstruction :
-- runs in ATHENA
-- reads G3 hits/digis (Physics TDR data)
-- validation results in Lund
Less clear situation (to me …) for, e.g. :
-- the event data model
-- the detector description
-- database, conditions database, technology choice
-- simulation framework vs ATHENA
-- analysis tools (maybe premature today, but one of the aims of the DCs should be the validation of analysis tools)
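
“Runs in ATHENA” means the code is packaged as a Gaudi/Athena algorithm whose initialize/execute/finalize methods the framework calls around the event loop. A bare skeleton (class name invented, factory/registration boilerplate omitted):

```cpp
// Bare skeleton of a Gaudi/Athena algorithm (hypothetical name; the
// factory/registration boilerplate required by the framework is omitted).
#include "GaudiKernel/Algorithm.h"
#include "GaudiKernel/MsgStream.h"

class AtlfastDemoAlg : public Algorithm {
public:
    AtlfastDemoAlg(const std::string& name, ISvcLocator* svcLoc)
        : Algorithm(name, svcLoc) {}

    StatusCode initialize() {        // called once, before the event loop
        MsgStream log(msgSvc(), name());
        log << MSG::INFO << "initialising" << endreq;
        return StatusCode::SUCCESS;
    }
    StatusCode execute() {           // called once per event: e.g. read HepMC,
        return StatusCode::SUCCESS;  // run the fast simulation, store the output
    }
    StatusCode finalize() { return StatusCode::SUCCESS; }
};
```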

Where could you contribute ? A lot of work to be done everywhere, of course … Examples :
-- improve the understanding of the ATLAS physics potential (e.g. SUSY, extra dimensions, backgrounds) and of the detector performance (e.g. can we tag charm jets ?) by analysing data produced by the DCs
-- improve reconstruction, algorithms, etc. (e.g. HLT, E-flow algorithm for jet reconstruction using ID+CALOs)
-- validation of MC generators : e.g. which MC for which process ? for which processes do we need more calculations and/or additional/specialised MCs ?
-- validation of G4/FLUKA physics : comparisons with test-beam data (in particular nuclear interactions)
-- validation of ATLFAST OO and of the new reconstruction against the old Fortran ATLFAST and ATRECON
-- intermediate simulation (shower and track parametrisations)
-- detector response : hits → digis, including noise, pile-up with the correct time structure, efficiency, etc. (a toy illustration follows below)
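
As a toy illustration of the last item (not ATLAS code, all parameters hypothetical): scale each hit by a gain, add Gaussian electronics noise, and zero-suppress below a readout threshold.

```cpp
// Toy hit -> digit conversion (illustrative only). Real digitisation also
// folds in pile-up with the correct time structure, channel-dependent
// response and inefficiencies.
#include <cstddef>
#include <random>
#include <vector>

struct Digit { int channel; double adc; };

std::vector<Digit> digitize(const std::vector<double>& hitEnergy,  // per channel, GeV
                            double gain, double noiseSigma, double threshold,
                            std::mt19937& rng) {
    std::normal_distribution<double> noise(0.0, noiseSigma);
    std::vector<Digit> out;
    for (std::size_t ch = 0; ch < hitEnergy.size(); ++ch) {
        const double adc = gain * hitEnergy[ch] + noise(rng);  // signal + noise, in ADC counts
        if (adc > threshold) out.push_back({static_cast<int>(ch), adc});  // zero suppression
    }
    return out;
}

int main() {
    std::mt19937 rng(1);
    std::vector<double> hits(64, 0.0);
    hits[3] = 5.0;  // one channel with a genuine deposit
    const std::vector<Digit> digits = digitize(hits, 10.0, 0.5, 2.0, rng);
    return digits.empty() ? 1 : 0;  // the 5 GeV hit should survive the threshold
}
```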

HLT-DC1 scenario  This has to be discussed with the HLT community but the basis could be similar to what has been done previously  Generation: Pt hard scattering > 17 Gev |  | < samples 1) e-candidate  Et > 17 Gev, no , no Grid 0.12 x ) Jet-candidate  Et > 40 Gev Grid 1.0 x 1.0 A first selection is made at the level of the event generation One keeps 14.5% of generated events 14.4% for (1) and 2% for (2)

HLT- DC1 scenario  Simulation The remaining events are run through the full simulation The Lvl1 trigger is applied at that level One keeps 13.7% of the events 97% for (1) and 10% for (2) The pile-up is run for the remaining events means ~2% of the ‘generated’ sample  Reconstruction Then the events are run through Lvl2, Event Filter and offline reconstruction

What next  Prepare a first list of goals & requirements with HLT, Physics community simulation, reconstruction, database communities people working on ‘infrastructure’ activities (bookkeeping) to be discussed with A-team with CSG (July 24 th meeting) In order to prepare a list of tasks Some Physics oriented But also like testing code, running production, … define the priorities

Then  Start the validation of the various components in the chain (putting dead lines for readiness) Software Simulation, pile-up, … Infrastructure Database, bookkeeping, …  Estimate what it will be realistic (!) to do For DC0 For DC1  “And turn the key”

The ATLAS Data Challenges : project structure / organisation (organigram) :
-- CSG
-- DC Overview Board
-- DC Execution Board
-- DC Definition Committee (DC2) : work-plan definition, WP RTAG, WP reports, reviews
-- NCB : resource matters, TIERs, relations with other computing Grid projects (DataGrid project)

Expressions of interest
So far, after the NCB meeting of July 10th : Canada, France, Italy, Japan, Nordic Grid, Russia, Taiwan, UK.
-- propositions to help in DC0
-- propositions to participate in DC1
-- contact with the HLT community
-- contact with EU-DataGrid
-- kit of ATLAS software