L3 Algorithms: status and plans for the near future. DØ Trigger Workshop: 22nd April 2002. Dan Claes and Terry Wyatt.

Design: L1 → L2 → L3 → tape, at 10 kHz → 1 kHz → 20-50 Hz
Currently: L1 → L3 → tape, at 100 Hz → 20-50 Hz
- Factor ~5 rejection needed
- Calorimeter-based filtering only (jets, electrons, taus)
Next steps (p11.06 release) to commission: muons, tracking, track-based primary vertexing, and hot cell killing (L3 NADA)
Time budget for L3 i/o, event building and filtering with 100 nodes: ~0.1 s per event

Level 3 Jargon
Tool: does the real work
- Unpacks raw data, finds tracks, clusters
- Identifies physics objects (e, μ, τ, jet, γ, W, Z)
Filter: applies (simple) cuts on objects
- e.g. pT(μ) > 10 GeV
Filter Script: defines an L3 trigger condition
- Logical .AND. of one or more filters
- If all filters in a script are .TRUE. the trigger is satisfied and the event is recorded
'Mark and Pass': selects an unbiased sample of input events to be recorded
'Forced Unbiased': events written out exclusively because the 'Mark and Pass' bit was set
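Since a filter script is just the logical .AND. of its filters, the evaluation logic can be pictured with a minimal C++ sketch; the class and member names below are illustrative, not the actual DØ L3 framework.

```cpp
// Minimal sketch of a filter script as a logical AND of filters.
// "Event" stands in for the unpacked L3 event; the real L3 classes differ.
#include <functional>
#include <string>
#include <vector>

struct Event;  // placeholder for the unpacked event

struct FilterScript {
    std::string name;                                      // e.g. "EM_HI"
    std::vector<std::function<bool(const Event&)>> filters;

    // The script is satisfied (and the event recorded) only if
    // every filter returns true.
    bool passes(const Event& evt) const {
        for (const auto& filter : filters)
            if (!filter(evt)) return false;
        return true;
    }
};
```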

ScriptRunner
Author: Moacyr Souza (Fermilab/LAFEX)
L3 central code management: Jon Hays (Imperial)
In order to save processing time, only a partial reconstruction of each event is performed, depending on the L1/L2 trigger information.
For each L1/L2 trigger that fires:
- One or more L3 filter scripts run
- Which filters/tools are called by each script is determined by the triggerlist
Tool results are kept in case they are needed again
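The last point, keeping tool results in case they are needed again, is essentially per-event caching keyed by tool; here is a hedged sketch of that idea (class and method names are hypothetical, not the ScriptRunner API).

```cpp
// Sketch of per-event tool-result caching: a tool runs at most once per
// event, and later scripts that ask for it reuse the stored result.
#include <map>
#include <memory>
#include <string>

struct Event;                       // placeholder for the unpacked event
struct ToolResult { /* clusters, tracks, ... */ };

class ToolCache {
public:
    using Runner = std::shared_ptr<ToolResult> (*)(const Event&);

    std::shared_ptr<ToolResult> get(const std::string& toolName,
                                    Runner run, const Event& evt) {
        auto it = cache_.find(toolName);
        if (it != cache_.end()) return it->second;  // already run this event
        auto result = run(evt);                     // run the tool once
        cache_[toolName] = result;
        return result;
    }

    void clear() { cache_.clear(); }                // call between events

private:
    std::map<std::string, std::shared_ptr<ToolResult>> cache_;
};
```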

Short-term goal: to run a filter L3FMuon with 'tight / loose / a-stub'-like quality criteria, cutting on muo_local pT
- Mainly needed for single muon triggers (which currently have an L1 prescale of ~40)
L3TMuon (local muon track reconstruction)
- Original author: Paul Balm (NIKHEF)
- Currently responsible: Christophe Clement (Stockholm), Martijn Mulders (Fermilab), Martin Wegner (Aachen)
- L3TMuon uses much of the 'offline' muon reco code
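As an illustration of how a 'tight / loose / a-stub' quality requirement plus a pT cut could look, here is a hedged sketch; the enum values, field names and ordering are made up for illustration and are not the real L3FMuon interface.

```cpp
// Sketch of a muon filter cutting on quality category and local pT.
// Assumes qualities are ordered a-stub < loose < tight.
enum class MuonQuality { AStub = 0, Loose = 1, Tight = 2 };

struct L3Muon {
    MuonQuality quality;
    double localPt;   // muo_local pT in GeV
};

bool passesMuonFilter(const L3Muon& mu, MuonQuality minQuality, double ptCut) {
    return static_cast<int>(mu.quality) >= static_cast<int>(minQuality)
        && mu.localPt > ptCut;
}
```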

L3TMuon Issues:
- Unpacker
- Memory leaks and timing problems
- Keeping up with updates to the offline muon code
- Efficiency/rejection
p... version run online in special runs:
- Run ...: 120k mu1ptxctxx_fz events (central)
- Run ...: 70k mu1ptxbtxx_fz events (forward)

Unpacker
- Dynamic unpacker: recognises from the raw data which crates/modules are being read out. Written by Scott Snyder.
- Shown to give ~identical results to the old unpacker + the correct cfg.dat file.
- Released (p...)

Memory Leaks
- Longstanding problem of a ~2 kByte/event memory leak
- Traced to a problem in muon_geometry; fixed (Rick Jesik) in the p... version
- Unfortunately, p... contained many other changes to the 'offline' muon software, which caused a huge memory leak
- Temporary cure: revert to ignoring MDT modules with >30 hits (which was the p... default); p... release
- Some evidence of a residual low-level memory leak when run online (in single muon special runs)

Memory Leaks

Timing
p... default rcp parameters (maxopt version running on a 1 GHz PIII):
- mean time/event ~100 ms when run online
- ~50 out of 200,000 events took more than 30 seconds to process
p... parameters tuned to reduce the time taken:
- extrapolation step size
- number of track-fit iterations
- number of A/BC segments considered
Result: mean time/event ~16 ms; expected to eliminate time-outs

Efficiency (pT = 5 GeV single muons)
[Plots: 'Loose' L3 muon and 'Tight' L3 muon efficiencies]
- 'Loose' efficiency ~80% (cf. geometrical acceptance ~90%)
- Poor 'tight' efficiency in the central region (track fit fails to converge; also seen in 'offline' reco)

Rejection measured on single muon test runs
[Plots: default parameters vs. tuned parameters, central and forward]

Next Steps for L3FMuon
- Take another single muon test run (with tuned parameters)
  - A test run with a cosmic trigger has been taken and showed no obvious signs of timing or memory problems
- Get L3FMuon running full-time
  - Global_CalMuon6.00 exists with loose L3FMuon hanging off the L1 single muon and muon-jet triggers (100% Mark&Pass)
- Optimise parameters: memory/timing, efficiency/rejection

Next Steps for L3FMuon
- Fix the reco memory leak that occurs when MDTs have many hits
- Stricter procedures for production releases of the 'offline' muon reco software
  - Including a specific requirement for L3FMuon tests BEFORE code changes are released to the production branch
- Longer term: we need a serious analysis of the cost/benefit of retaining/breaking the link between L3TMuon and muon reco

Recent progress in L3 central tracking
- Offline-quality unpacking and geometry for L3
- Improved SMT-CFT matching
- Proposal for a stand-alone tracking filter
- Track-based primary vertex tool

SMT and CFT Unpacking
Principal author: Robert Illingworth (Imperial College)
Recent improvements in the CFT unpacker:
- replace the global threshold with individual channel thresholds
- up-to-date thresholds and cable maps
Offline-quality geometry implemented for SMT and CFT in L3 (will be released in p...)
When improved thresholds/cable maps/geometry become available: requires no code changes, but care in archiving/version-tagging (a general problem!)

Level 3 Global Track Finding
Author: Daniel Whiteson (Berkeley)
A global (SMT plus CFT) track finder:
- Find axial CFT tracks
- Match stereo CFT clusters
- Extend into the SMT
- Require 8/8 axial CFT hits if no matched axial SMT hits
- Require 7/8 axial CFT hits if ≥3 matched axial SMT hits
- If the CFT axial/stereo match fails: the CFT-SMT match is done in xy only, but SMT stereo information is still used to give 3D tracking (a new feature implemented in p...)
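The axial hit-count requirements can be summarised in a small sketch; the data layout below is an assumption for illustration, not the actual L3 track classes.

```cpp
// Sketch of the hit-count acceptance rule for global track candidates.
struct L3TrackCandidate {
    int axialCftHits = 0;      // out of 8 axial CFT layers
    int axialSmtHits = 0;      // matched axial SMT hits
};

// Keep a candidate with 8/8 axial CFT hits, or with 7/8 axial CFT hits
// provided at least three matched axial SMT hits confirm it.
bool acceptTrack(const L3TrackCandidate& t) {
    if (t.axialCftHits >= 8) return true;
    if (t.axialCftHits == 7 && t.axialSmtHits >= 3) return true;
    return false;
}
```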

Current L3 global tracking performance
[Plots: track φ (rad), z at dca (cm), number of axial hits on track (CFT only), number of stereo hits on track]

Comparison of L3 and offline tracking
[Plots: number of tracks vs. track φ (rad), q/pT (GeV⁻¹), dca (cm)]

A possible stand-alone track-based filter
- Require at least one track with pT > cut
- Try out on events with single muon triggers
[Plots: rejection factor (all events), efficiency (events with a tight offline muon with pT > 3.0 GeV), rejection vs. efficiency]
- At 50% efficiency, rejection of ~25
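A minimal sketch of the proposed filter condition, assuming a simple container of global tracks (names are illustrative):

```cpp
// Pass the event if at least one track exceeds the pT threshold.
#include <vector>

struct L3Track { double pt = 0.0; };   // GeV

bool passesTrackFilter(const std::vector<L3Track>& tracks, double ptCut) {
    for (const auto& track : tracks)
        if (track.pt > ptCut) return true;   // one track is enough
    return false;
}
```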

Timing for the global track tool
[Plot: time per event (ms), on d0mino (debug version)]
- Plus about 30 ms per event for the unpack tools

CFT Tracking Algorithm - L3TCFTTracker
Principal author: Ray Beuselinck (Imperial)
P11 tool certification: Robert Illingworth and Chris Barnes

A possible plan for filtering on single muon triggers
We could require an .OR. of:
- EITHER: loose L3 muon
- OR: central track
  - (i.e., using redundancy to improve efficiency)
  - N.B. Tracking will not be perfect for a long time
  - (If you don't like this, you can always exclude these events by using the L3 trigger names)
Longer term:
- May need to require a track-muon match (at least for low pT); tool exists (Paul Balm)
- Also investigate a track-calorimeter MIP match; tool under development (Martin Grünewald)

L3 Primary Vertex needed soon!
Author: Guilherme Lima (UERJ/Brazil)
- Has yet to be tested on REAL DATA
- Opportunity for a new person to get involved!
1) Histogram technique using SMT hits
2) L3TVertexFinder, a track-based vertexing tool
- Author: Ray Beuselinck (Imperial)
- Recently upgraded to use either CFT or Global candidate tracks as input
- Chris Barnes, Per Jonsson (Imperial) testing
N.B. The Marseille group (Arnaud Duperrin, Mossadek Talby, Eric Kajfasz) hope to be actively involved in testing tracking and vertexing.
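For method (1), a hedged sketch of the histogram idea: histogram the z positions implied by the SMT hits (or tracks) and take the most populated bin as the primary-vertex estimate. The range, bin width and input values are assumptions for illustration, not the actual tool.

```cpp
// Histogram-based primary-vertex z estimate.
#include <algorithm>
#include <vector>

double histogramVertexZ(const std::vector<double>& zValues,
                        double zMin = -60.0, double zMax = 60.0,  // cm
                        double binWidth = 1.0) {
    const int nBins = static_cast<int>((zMax - zMin) / binWidth);
    std::vector<int> counts(nBins, 0);
    for (double z : zValues) {
        if (z < zMin || z >= zMax) continue;
        ++counts[static_cast<int>((z - zMin) / binWidth)];
    }
    const int peakBin = static_cast<int>(
        std::max_element(counts.begin(), counts.end()) - counts.begin());
    return zMin + (peakBin + 0.5) * binWidth;   // centre of the peak bin
}
```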

L3TEle Electron Tool
Authors: Volker Buescher (Mainz), Ulla Blumenschein (Mainz)
Current filter: simple 0.25 cone, applying cuts on
- ET
- e.m. fraction (>0.9)
- transverse shower shape
- η, φ: energy-weighted cluster axis position
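A hedged sketch of how such cluster cuts might be applied; apart from the 0.9 e.m. fraction quoted above, the thresholds and member names are placeholders, not the real L3TEle code.

```cpp
// Electron candidate selection on a 0.25-cone e.m. cluster.
struct EmCluster {
    double et          = 0.0;   // transverse energy in the cone (GeV)
    double emFraction  = 0.0;   // e.m. energy / total energy
    double showerWidth = 0.0;   // transverse shower-shape variable
};

bool passesElectronFilter(const EmCluster& c, double etCut, double widthCut) {
    return c.et > etCut
        && c.emFraction > 0.9        // e.m. fraction cut from the slide
        && c.showerWidth < widthCut; // transverse shower-shape cut
}
```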

[Plots: electron efficiency in WH → eνbb Monte Carlo events vs. electron pT (GeV); rejection in CEM(1,5) data events, showing events rejected by the pT cut, e.m. fraction and shower shape, and the remaining electron candidates]

Cuts on transverse shower shape
[Plot: shower-shape distributions showing the cut recommended by the L3 group and the cut accepted by the trigger board]

Single electron triggers
At L1 we have two (unprescaled) single electron triggers:
- CEM(1,10)
- CEM(2,5)
At L3 we run the same two single electron filters:
- pT > 15 GeV, emfrac > 0.9
- pT > 12 GeV, emfrac > 0.9, shower shape
We do a similar thing with CEM(1,5) (heavily prescaled).
Most of the rest of the e.m. trigger list, CEM(1,10).and.X, is relatively uninteresting.

Rejection factors for single electron filters (run ...)
Columns: L1 trigger, trigger name, expected rejection, actual rejection
- CEM(1,10): EM_HI, EM_HI_SH (3.6)
- CEM(2,5): EM_HI_2CEM, EM_HI_2CEM5_SH (6.7)
- CEM(1,5): EM_LO, EM_LO_SH (7.8)
(numbers in brackets represent the .OR. of the two parallel filter scripts)

Further developments for single electron triggers?
- More sophisticated treatment of transverse and longitudinal shower shape (studies in progress)
- Add in parallel to the two current filters:
  - Higher pT cut and softer e.m. fraction cut?
  - Stand-alone track filter?
  - Matched track + looser e.m. cuts?
  - Matched preshower + looser e.m. cuts?
  - Do we have enough L3 trigger bits?
  - Alternative: have one filter that combines all available information (with details of the trigger decision stored in L3PhysicsResults)
- Try CEM(1,8).CEM(2,2) at L1?

L3TJet Tool
Author: Volker Buescher (Mainz)
Rejection of L1-accepts makes use of:
- the high-precision calorimeter readout available at L3
- a simple cone algorithm
- identifying (and rejecting) low-ET events passing the L1 trigger
- sharpening the turn-on curve
Running online stably since early Sept '01
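To illustrate the 'simple cone' idea on the full-precision calorimeter readout, here is a sketch that sums tower ET within a fixed cone around a seed direction; the cone radius and tower structure are assumptions, not the L3TJet defaults.

```cpp
// Sum tower ET within a cone of radius coneR around a seed direction.
#include <cmath>
#include <vector>

struct CaloTower { double et, eta, phi; };

double coneEt(const std::vector<CaloTower>& towers,
              double seedEta, double seedPhi, double coneR = 0.7) {
    const double kTwoPi = 6.283185307179586;
    double sum = 0.0;
    for (const auto& t : towers) {
        double dEta = t.eta - seedEta;
        double dPhi = std::remainder(t.phi - seedPhi, kTwoPi);  // wrap to [-pi, pi]
        if (std::hypot(dEta, dPhi) < coneR) sum += t.et;
    }
    return sum;
}
```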

DATA runs
[Plot: fraction of events passing L3FJet(1,15) vs. pT of the leading offline JCCA jet (GeV)]
- The main effect smearing the turn-on is the lack of a primary vertex at L3

The QCD group has adjusted each L3 jet pT cut to the value at which the relevant L1 trigger "reaches ~100% efficiency" (run ...):
  L1 Trigger   Trigger Name   Actual Rejection      Estimated Rejection
  CJT(2,3)     jt_25tt        74075/2684 ≈ 27.6
  CJT(2,5)     jt_45tt        41142/1889 ≈ 21.8
  CJT(3,5)     jt_65tt        30391/1110 ≈ 27.4
  CJT(4,5)     jt_95tt        8270/160 ≈ 51.7
  CJT(4,7)     jt_125tt       1742/37 ≈ 47.1
Calorimeter non-linearity corrections implemented in the calorimeter unpacker for p... (Marumi Kado)
Killing of hot cells needed: Marumi Kado and Gregorio Bernardi are implementing offline NADA into L3

L3FTauHadronic: Level 3 Tau Tool
Author: Gustaaf Brooijmans (Fermilab); currently responsible: Yann Coadou (Uppsala)
- Based on calorimeter jet shape variables
- Running online since 17-Jan-2002
- Example: mu1ptxatxx_CJT(1,3) + L3Tau (pT > 10 GeV) gives a rejection factor of ~5.5
[Plot: Z → ττ vs. QCD]

Other tools on a longer timescale
- CPS and FPS cluster finding and unpacking
- Missing ET
- Tools to associate objects in different detectors (e.g. track to muon)
- b-tag: impact parameter, secondary vertex
- Tools to calculate "physics" quantities (e.g., inv. mass, delta_eta)
- Tools to identify physics event types (e.g., W, Z, stream definitions)
  - How to organise this? Hang a W and Z script off each relevant L1/L2 bit? (limited number of L3 bits?)
  - Keep raw data on reco output of W/Z candidates?
Many opportunities for new people to get involved!

Level 3 Requirements for certification of code
- Fully tested on a Linux system
- Works "out of the box" (no private mods to code or RCP files)
- No crashes/memory leaks on samples of order 100K events: real data and Monte Carlo
- Timing studies
- Performance studies/plots, DOCUMENTED: efficiency, rejection, physics distributions
- These tests may run the tool singly, but:
  - Must run on data with a triggerlist exercising all released tools
  - The filter code must be tested, as well as the associated tools
- Must test persistency of physics_results: write out events, read them back in to check physics distributions
- Verification: run the trigger simulator on real data and check that results agree with those obtained online
- Shadow nodes (in the future) to test new code 'online'

L3 Monitoring
- L3 filter statistics for each trigger available to the shift crew via daq_monitor
- 'physics_results' for each tool written out on each accepted event, plus debug_info
- l3fanalyze program: produces a rootuple
  - Each tool must provide methods to fill the rootuple
  - Used for offline checks of data quality
  - Plan to run online as an 'examine': use root to fill monitoring histograms from the rootuple
  - Extra person needed to work on this!
L3 monitoring needs to get a lot more systematic and routine!
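As a sketch of the planned 'examine'-style use of the rootuple, monitoring histograms could be filled from it with ROOT; the tree and branch names below are hypothetical, not the actual l3fanalyze output.

```cpp
// Fill a monitoring histogram from an l3fanalyze-style rootuple (assumed
// tree name "l3fanalyze" and branch name "ele_pt").
#include "TFile.h"
#include "TH1F.h"
#include "TTree.h"

void fillMonitoringHists(const char* rootupleFile) {
    TFile in(rootupleFile);
    TTree* tree = static_cast<TTree*>(in.Get("l3fanalyze"));
    if (!tree) return;                          // tree not found

    float elePt = 0.f;
    tree->SetBranchAddress("ele_pt", &elePt);

    TH1F hElePt("hElePt", "L3 electron p_{T};p_{T} [GeV];events", 50, 0., 100.);
    hElePt.SetDirectory(nullptr);               // keep it out of the input file
    for (Long64_t i = 0; i < tree->GetEntries(); ++i) {
        tree->GetEntry(i);
        hElePt.Fill(elePt);
    }

    TFile out("l3_monitor.root", "RECREATE");
    hElePt.Write();
}
```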

Can we do more sophisticated online monitoring in the L3 nodes?
- For example, collect histograms, measure efficiencies
  - L3 does a pretty complete reconstruction of the data
- Make use of the 90% of the events that we reject?
  - Measure trigger turn-on curves (for L1 and L2 as well as L3)
  - Do background studies
  - (Why write out events, with the huge overhead of running offline reconstruction and storing them permanently, if they are needed only for relatively simple operations that can be performed adequately in L3?)
  - Write out a stream with L3PhysicsResults and no raw data? Or a 'not for reco' stream?
The best way to concatenate results from monitor processes running on each of the 100 L3 farm nodes has not been worked out yet.

This will require extra resources at L3, but the potential return (in terms of spotting trigger problems and saving offline resources) might make it a very cost-effective investment. The same is true if we find that a lack of CPU power is limiting the sophistication of the event reconstruction and/or filtering that is possible in L3.

L3 central infrastructure: opportunities for new people to get involved
- ScriptRunner + central L3 code infrastructure, release management
- Monitoring/quality control:
  - quality control macros
  - migration to online
  - "bit-wise" on/offline check
  - matching L3 objects to MC/L1/L2/reco objects
- Calibration/alignment technical infrastructure
- L3PhysicsResults on thumbnail
- Development of "user" and "physics analysis" tools

Conclusions, outlook
- Currently a factor ~5 rejection is needed at L3, with calorimeter-based filtering (jets, electrons, taus) only
- Next steps (p11.06 release): commission muons, tracking, primary vertex, NADA
- Get more systematic monitoring for L3
- When L2 turns on we'll still need a factor ~5 rejection at L3, but will have to work harder to achieve it!
- When the Tevatron/L1 track and calorimeter triggers/DAQ all turn on we'll need a much larger rejection factor!
Lots of interesting challenges and lots of scope for clever ideas in the months ahead!

To find out more:
- L3 Algorithms web-pages
- L3 Algorithms working group meetings take place every week, Wednesday 14:00-15:30 in the Farside
- Talk to Dan Claes or Terry Wyatt about the opportunities to get involved!