L3 Filtering: status and plans. DØ Computing Review Meeting: 9th May 2002. Terry Wyatt, on behalf of the L3 Algorithms group.


L3 Filtering: status and plans. DØ Computing Review Meeting: 9th May 2002. Terry Wyatt, on behalf of the L3 Algorithms group. For more details of the current status of individual tools/filters see, e.g., the L3 talk at the trigger workshop:

Design rates:
  L1 → L2: 10 kHz
  L2 → L3: 1 kHz
  L3 → tape: 20–50 Hz
Functions of the L3 system:
  Event building (not discussed further here)
  Filtering:
    – partial event reconstruction
    – select which events are to be recorded
    – define streams
  Monitoring
The L3 farm has 100 × 1 GHz P3 CPUs, so the time budget for L3 i/o, event building, and filtering is roughly 100/1000 = 0.1 s per event.

Within weeks: L1 → L2 → L3 → tape, with 500 Hz into L3 and 30–40 Hz to tape.
Currently: L1 → L3 → tape, with 100 Hz into L3; a factor of ~5 rejection is needed.
Calorimeter-based filtering only (jets, electrons, taus).
Next steps (p11.06, p11.07 releases) to commission:
  – muons, global tracking, track-based primary vertexing
  – "offline" quality geometry and unpacking for tracking and calorimeter
...and we shall have a much better idea of how much CPU is needed for filtering.

For each L1/L2 trigger that fires:
  One or more L3 filter scripts are run.
  Each script calls the filters/tools necessary for the trigger decision.
The details of which scripts are called for each L1/L2 trigger, and which filters/tools are called by each script, are:
  – performed by ScriptRunner
  – determined by the trigger-list database
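The dispatch described above might be sketched roughly as follows. This is a generic illustration of the script/tool pattern, not the real ScriptRunner interfaces; all type and function names here are invented:

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Illustrative sketch: a filter script is an ordered list of tools, each
// returning pass/fail for the event; a script passes only if every tool
// it calls passes.
struct Event { double calEnergyGeV; };
using Tool   = std::function<bool(const Event&)>;
using Script = std::vector<Tool>;

bool runScript(const Script& script, const Event& ev) {
    for (const auto& tool : script)
        if (!tool(ev)) return false;  // stop at the first failing tool
    return true;
}

// The trigger-list database maps each fired L1/L2 trigger to one or more
// L3 filter scripts; the event is kept if any script passes.
bool l3Decision(const std::map<std::string, std::vector<Script>>& triggerList,
                const std::vector<std::string>& firedTriggers,
                const Event& ev) {
    for (const auto& name : firedTriggers) {
        auto it = triggerList.find(name);
        if (it == triggerList.end()) continue;
        for (const auto& script : it->second)
            if (runScript(script, ev)) return true;
    }
    return false;
}
```

A trigger list built this way can attach several scripts to one L1/L2 bit, which is exactly the redundancy question raised later in the talk.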

Other tools in development:
  – CFT-only tracking
  – missing E_T
  – tools to associate objects in different detectors (e.g. track to muon)
  – CPS and FPS cluster finding and unpacking
  – b-tag: impact parameter, secondary vertex
  – tools to calculate "physics" quantities (e.g. invariant mass, delta_eta)
  – tools to identify physics event types (e.g. W, Z, stream definitions)
Standard certification and verification requirements apply.
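As an example of a "physics" quantity, the invariant mass of a two-object system follows directly from the summed four-momenta. This is a generic sketch, not the DØ tool interface:

```cpp
#include <cmath>

// Four-momentum (E, px, py, pz) in GeV.
struct P4 { double E, px, py, pz; };

// Invariant mass of a two-object system:
//   m^2 = (E1 + E2)^2 - |p1 + p2|^2
double invariantMass(const P4& a, const P4& b) {
    double E  = a.E  + b.E;
    double px = a.px + b.px;
    double py = a.py + b.py;
    double pz = a.pz + b.pz;
    double m2 = E * E - (px * px + py * py + pz * pz);
    return m2 > 0.0 ? std::sqrt(m2) : 0.0;  // guard against rounding below zero
}
```

For two back-to-back massless 45.6 GeV objects this gives 91.2 GeV, the kind of Z-candidate quantity such a tool would feed to the stream-definition filters.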

L3 Monitoring Now
  L3 filter statistics for each trigger are available to the shift crew via daq_monitor.
  L3 reconstruction results are written out with the raw data.
  The l3fanalyze program produces a rootuple, used offline for:
    – testing new code versions
    – checks of data quality

L3 Monitoring Needed
A lot more work to do here!
  – implement shadow nodes
  – run l3fanalyze online as an 'examine'; fill monitoring histograms from the rootuple
  – macros to define/display monitoring histograms
  – common job submission on standard test samples to exercise all the L3 tools/filters, then migration to online
  – "bit-wise" on/offline check
  – matching L3 objects to MC/L1/L2/reco objects
Try to pool efforts as much as possible with other groups, especially L2.
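The "bit-wise" on/offline check amounts to rerunning the same filter code offline on the recorded raw data and requiring the decision bits to agree event by event with those taken online. A minimal sketch of the comparison (names are illustrative):

```cpp
#include <cstddef>
#include <vector>

// Count events where the offline rerun of the filter code disagrees
// with the decision bit recorded online. Any nonzero count signals a
// reproducibility problem (code version, calibration, or environment).
std::size_t countMismatches(const std::vector<bool>& onlineBits,
                            const std::vector<bool>& offlineBits) {
    std::size_t n = 0;
    std::size_t len = onlineBits.size() < offlineBits.size()
                          ? onlineBits.size() : offlineBits.size();
    for (std::size_t i = 0; i < len; ++i)
        if (onlineBits[i] != offlineBits[i]) ++n;
    return n;
}
```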

Extra manpower needed for L3 central infrastructure (FTE):
  ScriptRunner + central L3 code infrastructure, release management   1.5
  Streaming                                                           0.5
  Monitoring                                                          1.5
  Calibration/alignment technical infrastructure                      1.0
  Development of "user" and "physics analysis" tools                  >1
  L3 thumbnail                                                        0.5
...plus a lot of work to develop and test L3 tools and filters and to work out how best to use them for physics!

There are many open questions:
  How to handle redundancy in the trigger, e.g. leptons:
    – many L3 filter scripts, or
    – one L3 filter script (with details stored in the L3 data)?
  Are 256 L3 triggers enough?
  What else can we do in L3?
    – more sophisticated monitoring?
    – technical and/or physics measurements on rejected events?
    – define a "not for reco" stream?
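The "256 L3 triggers" limit corresponds to a fixed-width decision mask, one bit per trigger. A generic sketch of such a mask (this is not the actual DØ event format):

```cpp
#include <bitset>
#include <cstddef>

// Fixed-width L3 decision mask: one bit per trigger, 256 triggers total.
// Illustrative only; the real DØ data format is not reproduced here.
struct L3Mask {
    std::bitset<256> bits;

    void set(std::size_t trigger)         { bits.set(trigger); }
    bool fired(std::size_t trigger) const { return bits.test(trigger); }
    bool anyFired() const                 { return bits.any(); }
};
```

With a fixed mask like this, growing beyond 256 triggers means widening the event format, which is why the question matters at the trigger-list design stage.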

Trigger Simulation
  Level 1: "hardware" trigger; requires full simulation.
  Level 2 and Level 3: "software" triggers (C++ code); run the same software online and in the simulator.
  Run on raw data or Monte Carlo, for:
    – algorithm development
    – efficiency/rejection estimates
    – technical software checks, e.g. memory/CPU consumption

Level 2 Simulator
The currently available L2 preprocessors (tested on data) are:
  – calorimeter jet
  – calorimeter electromagnetic
  – muon central
  – muon forward
  – calorimeter missing E_T
  – central preshower
  – central track trigger (silicon + fibre tracker; STT soon to be integrated)
In addition, L2 global is essentially complete.

[Diagram: trigger simulation data flow. The trigger list is stored in the Trigger Database and exported as an .xml file; COOR (online) or COORSIM downloads it to D0TrigSim. Event input is a RawDataChunk (online) or a Monte Carlo datafile (d0sim).]

Manpower Requirements (FTE required per topic):
  D0TrigSim executable                        1
  L1 subsystems + coordination                3
  L2 simulation                               2
  L2 offline verification + analysis tools    1