High-Level Trigger Studies
Darin Acosta, University of Florida
DOE/NSF Review of US CMS Software and Computing
DOE Germantown, January 18, 2000

The CMS DAQ Architecture
CMS has a multi-tiered trigger system:
- L1 reduces the rate from 40 MHz to 100 kHz
  - Custom hardware processes calorimeter and muon data to select e, γ, muon, jet, ET, and missing-ET candidates above threshold
- L2, L3, … (HLT) reduce the rate from 100 kHz to 100 Hz
  - A commercial CPU farm runs online programs to select physics channels
The CPU farm is fed by an Event Builder composed of readout buffers and a large network switch

The DAQ/HLT Challenge
- DAQ hardware needs sufficient bandwidth
  - Total event size is 1 MB; event rate is 100 kHz
  - Bandwidth is limited by switch and link technology
- HLT selection algorithms must keep only 1 event per 1000
  - Limited by the ability to filter an L1 data sample that is already rich in physics
These two needs are coupled: the needs of the HLT drive the hardware, but hardware limitations drive the HLT algorithms.
- Require "On-Demand" reconstruction → ORCA
  - Pull data through the switch only as needed (partial event reconstruction), and process only what is needed to keep the HLT latency low

The Physics Groups
- Four Physics Reconstruction and Selection (PRS) groups were established by CMS in April 1999:
  - Electron/Photon: C. Seez
  - Muon: U. Gasparini
  - Jet/Missing ET: S. Eno
  - b/tau: A. Caner
- Overall coordination by P. Sphicas
- US-CMS has substantial involvement in the Electron/Photon, Jet/MET, and Muon groups
  - Focus of the Calorimeter, Endcap Muon, and TriDAS construction project communities
- The charge is to evaluate the impact of the full chain, from L1 through offline, on the physics capability of CMS
  - Design and implement the algorithms and software to provide the necessary rejection in the L1 and HLT triggers while keeping the efficiency high for the CMS physics plan

Context of Studies
The L1 Trigger TDR is targeted for November 2000
- Still need to pin down some parameters of the Muon and Calorimeter triggers based on these studies
The DAQ TDR is targeted for November 2001
- Need to understand the rejection capability of the HLT triggers, and the amount of data each step requires, in order to validate possible hardware solutions
A Physics TDR is planned for ~2003
- It was delayed to allow for the transition to object-oriented software
In all cases, we need to validate the algorithms for the CMS physics plan, taking into account all possible backgrounds

Strategy for Physics Groups
The collaboration decided to deploy object-oriented software → ORCA
- HLT studies should be performed in this environment
- ORCA provides "On-Demand" reconstruction
- There is no reason in principle why online code should differ from offline code (other than the source of data), except that the CPU farm has fixed time constraints
First HLT milestone for the Physics Groups:
- For November 1999, show that we can achieve a factor of 10 rejection of the L1 triggers using only 25% of the event data
  - Calorimeter and muon data can be used, but only a tiny fraction of the tracker data

Work Plan from April to Nov. '99
- Write software for ORCA (in C++ as much as possible)
  - Hit generation, digitization, L1 trigger, reconstruction
- Produce large Monte Carlo samples
  - The Fortran CMSIM package provides GEANT3 hits to ORCA
  - Must coordinate the CPU resources of many institutes
- Make the data persistent
  - Store digitized hits in an Objectivity database
- Write user analysis jobs and produce Ntuples
- Determine L1 rates and efficiencies
  - Set the baseline
- Develop L2 algorithms
  - See what the HLT can do
This was an aggressive plan for six months' time!

Trigger and Physics Challenges
Electron/Photon:
- Primary background from π0s
- After threshold, further rate reduction from:
  - Isolation (L1–L2); track match and E/p cuts (L3)
Muon:
- Rate comes primarily from real muons! (π/K and leptonic b, c decays)
- After threshold, further rate reduction from:
  - Isolation, topology (L1–L2)
Jet/MET:
- Rate from jet production
- After threshold, further rate reduction from:
  - Refined jet algorithms, η coverage, granularity (L1–L2)
B/tau:
- Rate reduction from tracking (L3)
- The principal challenge is how to design an L1 trigger without it

Technical Challenges for U.S. Groups
Jet/MET and Electron/Photon
- Basic calorimetry code existed for the ECAL at the outset (Apr. '99)
- It needed to be extended to the HCAL in a unified way
- Detailed simulation of pile-up had to be introduced (Endcap)
Muon
- Basic code existed for the barrel muon system at the outset (Apr. '99)
- Code for the endcap muon system had to be written from scratch
  - The detector technology and organization are completely different, as is the entire L1 trigger chain
In All Groups
- Working environments had to be established in the U.S.
  - Manpower had to be found
  - Regular meetings were arranged
  - A user facility was established at Fermilab
- Data handling facilities had to be set up
  - Caltech, Florida, UC Davis

U.S. Involvement
The U.S. makes up a large fraction of the Physics Groups.
Jet/MET Physics Group:
- Substantial U.S. contribution from 10 people in all areas:
  R. Wilkinson (Caltech), D. Green & W. Wu (FNAL), E. McCliment (Iowa), S. Eno & S. Kunori (Maryland), P. Sphicas (MIT), D. Stickland & C. Tully (Princeton), S. Dasu (Wisconsin)
Electron/Photon Physics Group:
- Substantial U.S. contribution from 6 people in all areas:
  S. Shevchenko & R. Wilkinson (Caltech), P. Vikas (Minnesota), P. Sphicas (MIT), D. Stickland & C. Tully (Princeton)
Muon Physics Group:
- Substantial contribution from 7 people in all areas:
  R. Wilkinson (Caltech), P. Sphicas (MIT), R. Breedon & T. Cox (UC Davis), B. Tannenbaum (UCLA), D. Acosta & S.M. Wang (UFlorida)
A unique opportunity to take a lead role in CMS physics studies

Jet/MET and e/γ HLT Accomplishments
- A complete simulation, reconstruction, and analysis chain in ORCA was set up by the Nov. milestone
  - Considerable U.S. involvement in meeting this milestone
  - The U.S. needs software engineering support for training and guidance
- Large MC data sets were generated by Caltech
  - Installed into the Objectivity database at CERN
  - Being made available at the FNAL user facility (limited by manpower and resources)
- L1 trigger rates and efficiencies were determined
- L2 algorithms were developed
  - EM clustering, isolation, π0 rejection, bremsstrahlung recovery
  - Jet algorithms, τ identification
  - Typical rejection factors in the 3–10 range

L1 Jet Trigger Rates

L2 Single e/γ Trigger Rate
- Rejection factor in the 3–10 range
- Efficiency about 94%
- Higher statistics needed

Muon HLT Accomplishments
- A complete simulation, reconstruction, and analysis chain in ORCA was set up for (barrel) muon studies by the Nov. milestone (but just barely…)
- About 100,000 background events (min. bias and W/Z) were generated
  - A single-muon weighting scheme developed at CERN reduces colossal data sets, but a slight bug was found later…
  - Florida, UC Davis, and Italian groups produced samples
- The Objectivity database at CERN was filled with digitized hits
- Standard analysis Ntuples were created
- Preliminary results on L1 and L2 rates and efficiencies were obtained

Muon L1 and L2 Rates
Factor 2–6 rejection from improved pT resolution

Endcap Muon Accomplishments
- The first release of core reconstruction software for the endcap muon system was completed by Nov.
  - Includes geometry, digitization, persistency, and the L1 trigger
  - But the integration of all these packages is still ongoing…
  - Offline OO track reconstruction is under development now
- HLT studies were hampered by the need to get the reconstruction software written by a small (and inexperienced) group
- The Endcap Muon community needs software engineering support to help guide this development and make it more robust and efficient
- Nevertheless, preliminary physics results on the endcap L1 trigger performance were derived from CMSIM and non-ORCA software for the Nov. milestone
  - Needed for the L1 Trigger TDR to validate the scheme

EMU Single Muon L1 Trigger Rate
- Rate is for L = 10^34 cm^-2 s^-1
- The target rate of 1 kHz per unit rapidity can be met with a 15 GeV threshold only for a 3-station sagitta measurement
- The L1 hardware needs to be modified to include this feature

January 18, 2000DOE/NSF USCMS Computing and Software Review. HLT Studies D. Acosta18 Conclusions U.S. took a lead role in HLT studies, and in a short period of time delivered software and results è Very preliminary results show an L2 rejection factor of  5 Although substantial progress shown, the DAQ-HLT milestone was not met è Needed more time for code to stabilize, and to develop HLT algorithms è Lacked sufficient manpower and resources è Completion of first milestone is expected by June 2000 è Future milestones should demonstrate that 100 Hz DAQ rate is feasible with high physics efficiency (June 2001) User facilities and software/computing support is essential for U.S. physicists to maintain prominent role