Event Tagging & Filtering: The PHENIX Level-2 Trigger
Xiaochun He, Georgia State University
PHENIX / RHIC, Quark Matter 2004 (QM04), January 11-17, 2004

Main Contributors

S.S. Adler(a), S. Belikov(f), M. Bjorndal(c), M. Chiu(c), C.R. Cleven(e), B.A. Cole(c), K.K. Das(d), E.J. Desmond(a), J.E. Frantz(c), A.D. Frawley(d), X. He(e), S. Kelly(b), G.C. Mishra(e), J.L. Nagle(b), C. Pinkenburg(a), M.L. Purschke(a), H. Qu(e)

a. Brookhaven National Laboratory, Upton, NY 11973, USA
b. University of Colorado, Boulder, CO 80309, USA
c. Columbia University, New York, NY 10027, USA
d. Florida State University, Tallahassee, FL 32306, USA
e. Georgia State University, Atlanta, GA 30303, USA
f. Iowa State University, Ames, IA 50011, USA

Introduction

The PHENIX Level-2 trigger is a software-based trigger that runs within the PHENIX Event Builder, part of the PHENIX data acquisition system. The first online implementation of the PHENIX Level-2 trigger was for the second RHIC run in 2001/2002, and its results were published in NIM A (2003). The Level-2 trigger gives the PHENIX experiment the capability to tag and filter events online according to their potential physics interest. The Level-2 trigger algorithms are developed and tested in an offline framework before being integrated into the Event Builder. This poster presents the infrastructure of the Level-2 trigger, including its offline development framework, online configuration, and monitoring.

PHENIX Challenge

A "rare" physics program and a high data rate, served by complex particle detectors:
- 4 spectrometer arms
- 12 detector subsystems
- 350,000 detector channels
- Lots of readout electronics

PHENIX Online System

[Diagram of the PHENIX online system, showing the switch, event logger, HPSS archiving, Online Monitor, and OnlCal system, with bandwidths of 70 MB/s and 500 MB/s indicated.]

PHENIX Event Builder

[Diagram: data packets flow from the DCMs into an array of Sub Event Buffers (SEBs), through a switch, to an array of Assembly and Trigger Processors (ATPs), and on to the event loggers, with Level-2 monitoring attached.]

Each SEB collects the associated data packets from the data collection modules, so the segments of an event are distributed across the configured SEBs. All segments from the same event are gathered by one of the available ATPs, which runs the Level-2 trigger algorithms, formats the complete event, and sends it to the event logger. The Level-2 trigger algorithms thus run on every ATP in the Event Builder. Level-2 configuration parameters are sent to each ATP from Run Control before each run is started.
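As an illustration of this data flow, the minimal C++ sketch below shows an ATP gathering one event's segments from every SEB, running the Level-2 algorithms, and forwarding accepted events to the logger. All class and method names are hypothetical; this is not the actual PHENIX Event Builder code.

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// Stand-in for the packets one SEB contributes to one event.
struct EventSegment {
    int eventNumber = 0;
    std::vector<char> packets;
};

// One SEB collects the packets of its data collection modules.
class SubEventBuffer {
public:
    EventSegment fetchSegment(int eventNumber) { return {eventNumber, {}}; }
};

// Common interface that every Level-2 algorithm implements.
class Level2Trigger {
public:
    virtual ~Level2Trigger() = default;
    virtual bool process(const std::vector<EventSegment>& event) = 0;
};

class AssemblyTriggerProcessor {
public:
    explicit AssemblyTriggerProcessor(std::size_t nSebs) : sebs_(nSebs) {}

    void addTrigger(std::unique_ptr<Level2Trigger> t) {
        triggers_.push_back(std::move(t));
    }

    // Gather this event's segments from every SEB, run all configured
    // Level-2 algorithms, and forward the event if any algorithm accepts.
    void buildEvent(int eventNumber) {
        std::vector<EventSegment> event;
        for (auto& seb : sebs_) event.push_back(seb.fetchSegment(eventNumber));

        bool accepted = false;
        for (auto& trig : triggers_) accepted = trig->process(event) || accepted;

        if (accepted) sendToEventLogger(event);
    }

private:
    void sendToEventLogger(const std::vector<EventSegment>& event) {
        std::printf("logging event %d\n", event.front().eventNumber);
    }

    std::vector<SubEventBuffer> sebs_;
    std::vector<std::unique_ptr<Level2Trigger>> triggers_;
};
```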

Level-2 Infrastructure

[Diagram of the logical layers: Level-2 Control, the Level-2 Trigger Algorithms Layer, the Level-2 Trigger Primitives Layer, the Level-2 Data Accessor Layer, and the event data and database beneath them.]

The Level-2 trigger code is written in C++ and can be viewed as the logical layers shown above. Aside from the control class, each specific trigger class is derived from a common template to enforce a common interface. For each event, a trigger algorithm object is created which then accesses the associated trigger primitives. Each trigger primitive object is created only once per event and can be shared by multiple trigger algorithms. The primitive objects in turn access data accessor objects, database objects, and other primitive objects; all of these objects are instantiated only on request. A Level-2 monitoring object is created at the end of each event's processing and is passed to the remote monitoring process via a direct CORBA connection.
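A minimal C++ sketch of the create-once, share-per-event primitive scheme described above. It assumes a string-keyed registry and a virtual base class; the real code derives triggers from a common template, and all names here are illustrative.

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>

// Base class for a trigger primitive (e.g. a list of calorimeter clusters).
class Primitive {
public:
    virtual ~Primitive() = default;
};

class PrimitiveRegistry {
public:
    using Factory = std::function<std::unique_ptr<Primitive>()>;

    void registerFactory(const std::string& name, Factory f) {
        factories_[name] = std::move(f);
    }

    // A primitive is built on first request only; every algorithm that
    // asks for it afterwards shares the same object for this event.
    Primitive& get(const std::string& name) {
        auto it = cache_.find(name);
        if (it == cache_.end())
            it = cache_.emplace(name, factories_.at(name)()).first;
        return *it->second;
    }

    // Primitives are per-event: discard the cache when a new event starts.
    void newEvent() { cache_.clear(); }

private:
    std::map<std::string, Factory> factories_;
    std::map<std::string, std::unique_ptr<Primitive>> cache_;
};

// Each concrete trigger algorithm makes one decision per event, pulling
// whatever primitives it needs from the shared registry.
class Level2Algorithm {
public:
    virtual ~Level2Algorithm() = default;
    virtual bool decide(PrimitiveRegistry& primitives) = 0;
};
```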

Level-2 Trigger in Run-2

The Level-2 trigger system was first implemented in Run-2 for 200 GeV Au+Au collisions. There were seven trigger algorithms (single electron, electron pair, single deep muon, muon pair, high-pT EMCal, high-pT charged, and coherent peripheral), each of which could use different threshold settings. The total Level-2 processing time for a single event was found to be 31 ms (the design value is 25 ms). With 32 ATPs processing events in parallel, this corresponds to an aggregate capacity of roughly 1 kHz, which was adequate for RHIC Run-2 given the maximum DAQ throughput of about 800 Hz. For Run-2, regular Level-2 monitoring was done only with events written to disk. Overall, the PHENIX Level-2 triggers worked well in Run-2.

Trigger Development System

Over the summer of 2003, a standalone Level-2 trigger development and test system was constructed on both Linux and NT platforms, which allows us to run the trigger algorithms and monitor their performance. This system greatly eases the integration of the Level-2 triggers into the PHENIX Event Builder and allows each trigger algorithm writer to fully debug and test an algorithm without needing the fully functional DAQ system. The histograms below show example timing information obtained from this development system; the left one is the result from NT, the right one from Linux.

[Two per-event timing histograms: NT (left) and Linux (right).]
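A hedged sketch of what such a standalone timing harness could look like: run the same algorithm objects used online over archived events and report the average per-event time, as the histograms above do. The `Event`, `Algorithm`, and `timeAlgorithms` names are invented for illustration; the poster does not show the development system's actual interfaces.

```cpp
#include <chrono>
#include <cstdio>
#include <memory>
#include <vector>

struct Event {};  // stand-in for a raw archived PHENIX event

struct Algorithm {
    explicit Algorithm(const char* n) : name(n) {}
    virtual ~Algorithm() = default;
    virtual bool process(const Event&) = 0;
    const char* name;
};

// Time each algorithm over a sample of archived events and print the
// mean per-event processing time in milliseconds.
void timeAlgorithms(const std::vector<Event>& events,
                    const std::vector<std::unique_ptr<Algorithm>>& algos) {
    using clock = std::chrono::steady_clock;
    if (events.empty()) return;
    for (const auto& algo : algos) {
        double totalMs = 0.0;
        for (const auto& evt : events) {
            auto t0 = clock::now();
            algo->process(evt);
            totalMs += std::chrono::duration<double, std::milli>(
                           clock::now() - t0).count();
        }
        std::printf("%s: %.3f ms/event over %zu events\n",
                    algo->name, totalMs / events.size(), events.size());
    }
}
```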

Level-2 Configuration

Level-2 trigger parameters are configured via Run Control before a run is started. This configuration process sets the association between the Level-1 trigger bits and the Level-2 algorithms, and it also sets the scaledown factor for each algorithm.
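A guess at the shape of the configuration record Run Control might hand each ATP; the struct and field names are illustrative, not PHENIX's actual ones.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Per-algorithm configuration: which Level-1 trigger bits feed this
// Level-2 algorithm, plus its scaledown factor.
struct Level2AlgorithmConfig {
    std::string name;             // e.g. "L2DiElectronTrigger" (hypothetical)
    std::uint32_t level1BitMask;  // Level-1 bits this algorithm runs on
    int scaledown;                // per-algorithm scaledown factor
};

struct Level2RunConfig {
    int runNumber;
    std::vector<Level2AlgorithmConfig> algorithms;
};

// An ATP would consult the mask to decide whether an algorithm should
// look at an event, given the event's fired Level-1 bits.
inline bool shouldRun(const Level2AlgorithmConfig& cfg, std::uint32_t l1Bits) {
    return (cfg.level1BitMask & l1Bits) != 0;
}
```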

Level-2 Monitoring

[Screen snapshot of the Level-2 real-time performance monitoring windows, including the trigger list windows.] All running trigger processors (ATPs) are indicated with green boxes, gray boxes mean that the corresponding ATPs are not enabled, and red boxes indicate enabled ATPs that are not responding.

Monitoring Primitives

[Snapshot of trigger primitive information from a given ATP (example only).]

Trigger Summary Tables

The table below shows an example of the Level-2 trigger performance, averaged over all ATPs, for all trigger algorithms configured in a test system. The most useful quantities are:
a) the accepted event count,
b) the average event processing time, and
c) the trigger rejection factor.

[Example Level-2 trigger summary table.]
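These three quantities follow directly from simple per-algorithm counters, as the small C++ sketch below shows; the counter and function names are hypothetical.

```cpp
#include <cstdio>

// Per-algorithm counters an ATP might keep during a run.
struct TriggerCounters {
    long processed = 0;    // events the algorithm examined
    long accepted = 0;     // events the algorithm fired on
    double totalMs = 0.0;  // summed processing time in ms
};

// One row of the summary table: accepted count, average processing
// time, and rejection factor (events processed per event accepted).
void printSummaryRow(const char* name, const TriggerCounters& c) {
    double avgMs = c.processed ? c.totalMs / c.processed : 0.0;
    double rejection = c.accepted ? double(c.processed) / c.accepted : 0.0;
    std::printf("%-28s accepted=%-8ld avg=%6.2f ms rejection=%6.1f\n",
                name, c.accepted, avgMs, rejection);
}
```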

Level-2 Algorithms in Run-4

There will be four Level-2 trigger algorithms in Run-4, including:
- an Au+Au di-electron trigger,
- a high-pT electromagnetic calorimeter trigger, and
- a Muon North Arm di-muon trigger.

Level-2 for Run-4

Currently we do not need Level-2 triggers for rejection, since we can in principle archive everything that gets through the DAQ: at least 250 MB/s, and possibly 500 MB/s. This is based on the following: the expected RHIC luminosity is 8 x 10^26 cm^-2 s^-1 (about 5 x 10^26 cm^-2 s^-1 is currently being delivered), so with a 50% vertex cut the event rate will be about 2500 Hz; with an average event size of less than 200 KB, this corresponds to just under 500 MB/s. The Level-2 triggers for Run-4 (without rejection) will be enabled in the week after Quark Matter 2004. Each recorded event will be tagged with Level-2 processing information, which will not only allow prompt online physics monitoring but also speed up offline data analysis via event filtering.
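As a sanity check on the quoted numbers, the bandwidth arithmetic in the paragraph above works out as follows (a trivial worked example, not PHENIX code):

```cpp
#include <cstdio>

int main() {
    // Event rate after the 50% vertex cut times the average event size
    // gives the required archiving bandwidth.
    const double eventRateHz = 2500.0;  // events per second after vertex cut
    const double eventSizeKB = 200.0;   // average event size (upper bound)
    const double bandwidthMB = eventRateHz * eventSizeKB / 1024.0;
    std::printf("required bandwidth: %.0f MB/s\n", bandwidthMB);
    // Prints ~488 MB/s, consistent with the quoted "possibly 500 MB/s".
    return 0;
}
```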