1 Upgrades for the PHENIX Data Acquisition System
Martin L. Purschke, Brookhaven National Laboratory, for the PHENIX Collaboration
(Photo: RHIC from space, Long Island, NY)

2 RHIC/PHENIX at a glance
RHIC: 2 independent rings, one beam clockwise, the other counterclockwise; sqrt(s) = 500 GeV × Z/A, ~200 GeV for heavy ions, ~500 GeV for proton-proton (polarized).
PHENIX: 4 spectrometer arms, 15 detector subsystems, 500,000 detector channels, lots of readout electronics.
Uncompressed event size typically KB for Au+Au, Cu+Cu, p+p. Data rate ~6 kHz (Au+Au). Front-end data rate GB/s. Data logging rate ~500 MB/s, 700 MB/s max.
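A quick consistency check using only the numbers above: the sustained logging rate divided by the Au+Au event rate gives the average event size as written to disk (after compression),

$$ \frac{500\ \mathrm{MB/s}}{6\ \mathrm{kHz}} \approx 83\ \mathrm{KB\ per\ event}, $$

so logged events are of order 100 KB each; the uncompressed size upstream is correspondingly larger.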

3 Our youngest detector systems: TOF-W, RXNP, MPC-N, HBD (pictured).

4 Need for Speed: Where we are
(Chart: approximate data-logging rates, all in MB/s, of ATLAS, CMS, LHCb, and ALICE, ranging from ~25 to ~1250 MB/s, for comparison.)
Lvl1 triggers in heavy ions have a notoriously low rejection factor; that's because so many events have something that's interesting (different from the LHC). But hey, we could write out almost everything that RHIC gave us, so why bother... this approach has served us really well. It also opened up access to processes that you can't exactly trigger on; it "just" takes some more work offline.

5 (Plot: aggregated data rate from the DAQ to disk, in MB/s, over one RHIC fill; the decay of the RHIC luminosity and the length of a DAQ run are visible.) We are very proud of this performance... It's not the best fill, it's one where I was there... the best RHIC fill went up to 650 MB/s.

6 Run 10 Event statistics
(Chart of Run 10 PHENIX raw-data volumes: 3150 TB, 950 TB physics, 100 TB.)

7 Upgrade Programs
RHIC will give us several luminosity and beam-livetime upgrades. The era where we could mostly write out "everything" is coming to an end. In the future we will add detectors in the central region which will significantly increase our data volume.

8 Upgrades
3 main new detectors (that's in addition to the ones I showed before as "on board"):
- The Vertex/Forward Vertex detectors
- A Muon trigger upgrade
- RPC detectors
The 800-pound gorilla:

Detector   | DCM groups | Occupancy | Event size (kbyte) | Data rate (Gbps)*
VTX strip  | 2          | %/2.5%    |                    |
VTX pixel  | 3          | %/0.16%   |                    |
FVTX       | 6          | %         |                    |
Total      | 11         |           | 239                |

This triples the current event size. VTX pixels will be installed this summer for Run 11.

9 Upgrades: central silicon vertex tracker "VTX" (pixel and strip layers, pictured).

10 Silicon Pixel in Run 11
ALICE1LHCb readout chip: pixel size 50 µm (φ) × 425 µm (z); 256 × 32 channels; binary output, read out in 25.6 µs; radiation hardness ~30 Mrad.
Sensor module: 4 ALICE1LHCb readout chips, bump-bonded (VTT) to a silicon sensor; sensor thickness 200 µm, readout chips 150 µm, solder bumps ~20 µm.
Half-ladder (2 sensor modules + bus): 1.36 cm × 10.9 cm; bus thickness < 240 µm.
SPIRO module: controls/reads out a half-ladder and sends the data to the FEM.
FEM (interface to the PHENIX DAQ): reads/controls two SPIROs.
Active area: rφ 1.28 cm = 50 µm × 256; z 1.36 cm = 425 µm × 32.
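The quoted active area follows directly from the pixel pitch and channel counts, using only numbers on this slide:

$$ 256 \times 50\ \mu\mathrm{m} = 12.8\ \mathrm{mm} = 1.28\ \mathrm{cm}, \qquad 32 \times 425\ \mu\mathrm{m} = 13.6\ \mathrm{mm} = 1.36\ \mathrm{cm}. $$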

11 First complete pixel ladder, Dec 25, 2009. All chips on Ladder #6 show good hit maps in a beta-ray source test. The hit maps are a great success, but from where I stand, the fact that we are reading out the ladder is most important.

12 Data Flow
A classic event-builder architecture: Data Collection Modules (DCMs, 100's) feed Sub Event Buffers (SEBs, ~35) for data concentration; a Gigabit crossbar switch connects the SEBs to the Assembly & Trigger Processors (ATPs, ~60); assembled events go to the Buffer Boxes (7) and from there to HPSS. (The diagram also marks the boundary between the interaction region and the rack room.)
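To make the assembly step concrete, here is a minimal sketch (assumed structure, not actual PHENIX code) of how an ATP-style process might collect subevents by event number until every SEB has contributed; the SubEvent fields and class names are hypothetical:

```cpp
// Minimal event-assembly sketch: fragments are keyed by event number,
// and an event is complete once every expected source has contributed.
#include <cstdint>
#include <map>
#include <set>
#include <utility>
#include <vector>

struct SubEvent {
    uint64_t event_number;         // stamped by the trigger/timing system
    int source_id;                 // which SEB this fragment came from
    std::vector<uint8_t> payload;  // raw detector data
};

class EventAssembler {
public:
    explicit EventAssembler(std::set<int> expected_sources)
        : expected_(std::move(expected_sources)) {}

    // Add one fragment; returns true and fills 'complete' when the
    // last expected fragment for that event number has arrived.
    bool add(SubEvent se, std::vector<SubEvent>& complete) {
        const uint64_t evt = se.event_number;
        auto& pending = pending_[evt];
        pending.push_back(std::move(se));
        if (pending.size() < expected_.size()) return false;
        complete = std::move(pending);
        pending_.erase(evt);
        return true;
    }

private:
    std::set<int> expected_;                             // all SEB ids
    std::map<uint64_t, std::vector<SubEvent>> pending_;  // partial events
};
```

A production event builder adds timeouts, flow control, and error handling on top of this bookkeeping.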

13 The A"T"P
The ATP, "Assembly and Trigger Processor", was deemed important at the time when PHENIX was designed; it was meant to run what is today known as an HLT. We soon learned that we can do without it: the data logging capability has kept pace with the minimum-bias data rate, which is great, and opens access to processes which you simply can't trigger on. With all the upgrades I'll show, we are determined to keep it that way.

14 Need for Speed
The Data Collection Module (DCM) was modern in its day: DSP-based plus some FPGAs. The DCM II uses the latest FPGA technology; the FPGA is the main component. 10G networks are becoming a commodity; they allow better use of multi-core machines and save money, power, and A/C in the end. In the same spirit, we replace PCI with PCI Express. (Diagram: SEB → 10G crossbar switch → ATP → buffer box, replacing the Gigabit links.)

15 DCM-II and "jSEB"-II
The jSEB ("SubEvent Buffer") is the PCI Express card that reads out a number of DCMs, with about a factor of 15 more bandwidth than the older generation. (Diagram: FEM → DCM II → Partitioner II on a custom backplane; a jSEB II in the SEB PC receives the data; the GTM distributes L1 and timing; BUSY and DATA paths shown.)
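As a rough plausibility check (assuming, hypothetically, that the older-generation card sat on conventional 32-bit/33 MHz PCI at about 133 MB/s), a factor of 15 lands right in PCI Express territory:

$$ 15 \times 133\ \mathrm{MB/s} \approx 2\ \mathrm{GB/s}, $$

about the raw throughput of eight PCIe 1.x lanes at 250 MB/s per lane per direction.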

16 Force-10 Switch
- Standard Gigabit ports: "MRJ-20" cable bundles with 6 Gig ports each go to patch panels (or directly to the machines); may make our cable distribution easier, bring a few bundles to the racks.
- 10GbE ports: standard fibers with "SFP+" connectors.

17 Putting it together
(Diagram: several FEM → DCM II → Partitioner II chains on the custom backplane feed jSEB II cards in the SEB PCs; SEBs connect through the switch to the ATPs and buffer boxes; GTM distributes L1 and timing.)
The new jSEB-IIs will exceed the limits of a Gigabit connection and need 10GbE. We will start with 5 such jSEB-IIs and have the full system in Run 13. Buffer boxes will be added as needed. The existing detectors will keep their current readout electronics; the DAQ upgrades are geared towards maintaining the current event rate, not increasing it, for now. The existing readout will limit us to the current rate.

18 RPC (resistive plate counter)
(Photo: test assembly of the RPC-3 half-octant support structure at UIUC.)
Adds timing resolution to the muon detectors so we can distinguish muons from the vertex from those traversing the IR.

19 Outlook
New detectors on board or coming, to search for dedicated signals.
New hardware components to help maintain our current speed: PCI Express, 10G networks, the DCM II upgrade.
The End

20 Backup

21 Pictures

22 Cabling "plan" (diagram: current switch and cabling).

23 Building up to record speed
Over the previous runs we have been adding improvements. We had lighter systems (d+Au, p+p, Cu+Cu) in previous runs, less of a challenge than 200 GeV Au+Au, the most challenging. Ingredients:
- Distributed data compression (Run 4)
- Multi-event buffering (Run 5)
- Mostly consolidating the achievements, tuning, etc. in Run 6, plus lots of improvements in operations (increased uptime)
- 10G network upgrade in Run 7; added Lvl2 filtering

24 Data Compression
We found that the raw data are still gzip-compressible after zero suppression and other data-reduction techniques, so we introduced a compressed raw data format that supports late-stage compression. A normal file is a sequence of buffers; on writing, each buffer is compressed with the LZO algorithm and a new buffer header is added, making a new buffer with the compressed one as payload. On readback, LZO unpacking restores the original uncompressed buffer. All this is handled completely in the I/O layer; the higher-level routines just receive a buffer as before.
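As a sketch of how such late-stage compression can be layered into an I/O path, the following uses the miniLZO library; the wrapper-header fields and the marker value are hypothetical, and the actual PHENIX raw-data format differs in detail:

```cpp
// Hedged sketch of late-stage buffer compression in an I/O layer,
// using miniLZO. Header layout and marker are invented for illustration.
#include <cstdint>
#include <cstring>
#include <stdexcept>
#include <vector>
#include "minilzo.h"

struct CompressedHdr {          // hypothetical wrapper header
    uint32_t marker;            // identifies a compressed buffer
    uint32_t compressed_len;    // payload length in bytes
    uint32_t original_len;      // length after decompression
};

static const uint32_t kCompressedMarker = 0x4C5A4F31;  // "LZO1", made up

std::vector<uint8_t> compress_buffer(const std::vector<uint8_t>& in) {
    static bool ok = (lzo_init() == LZO_E_OK);
    if (!ok) throw std::runtime_error("lzo_init failed");
    // worst-case LZO1X output size documented by the library
    std::vector<uint8_t> out(sizeof(CompressedHdr) +
                             in.size() + in.size() / 16 + 64 + 3);
    static unsigned char wrkmem[LZO1X_1_MEM_COMPRESS];
    lzo_uint out_len = 0;
    lzo1x_1_compress(in.data(), in.size(),
                     out.data() + sizeof(CompressedHdr), &out_len, wrkmem);
    CompressedHdr hdr{kCompressedMarker,
                      static_cast<uint32_t>(out_len),
                      static_cast<uint32_t>(in.size())};
    std::memcpy(out.data(), &hdr, sizeof hdr);     // prepend new header
    out.resize(sizeof(CompressedHdr) + out_len);
    return out;
}

std::vector<uint8_t> decompress_buffer(const std::vector<uint8_t>& in) {
    CompressedHdr hdr;
    std::memcpy(&hdr, in.data(), sizeof hdr);
    std::vector<uint8_t> out(hdr.original_len);
    lzo_uint out_len = out.size();
    lzo1x_decompress(in.data() + sizeof hdr, hdr.compressed_len,
                     out.data(), &out_len, nullptr);
    return out;  // higher-level code sees the original buffer, as before
}
```

Because the wrapper carries the original length, readback can allocate the output buffer up front and hand the higher levels an uncompressed buffer, exactly as if compression had never happened.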

25 Run 11 Counts
(Table: counts of fibers, DCM2 modules, Partitioners, and JSEBs* for strips, pixel, and FVTX, with totals.)
* Includes JSEB IIs for the crate controllers.
This is the build for the final system running at full bandwidth. The Partitioner-to-DCM ratio is 2 to 1; 16 fibers → 1 fiber.

26 Data Collection Module-II
Front-end cards on the detector are read out by these (8 per card); the DCM-II connects to a custom bus on the right, and a number of DCM-IIs are read out via a PCI Express card. (Board diagram: 48 V in (isolated), 5 V in (control), 1.1 V/2.5 V/3.3 V rails, JTAG and JTAG clock, download (readback), and L1 data links at 640, 320, and 160 MB/sec.)

27 Multi-Event Buffering: DAQ Evolution
Multi-event buffering means starting the AMU sampling again while the current sample is still being digitized; the trigger busy is released much earlier, and the deadtime is greatly reduced. (Plot: deadtime without MEB; PHENIX is a rare-event experiment, after all -- you don't want to go down this path.)
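A rough way to quantify this (standard non-paralyzable deadtime counting, not taken from the slide): if every accepted trigger holds the busy for a time $\tau$, the live fraction at trigger rate $R$ is

$$ f_{\mathrm{live}} = \frac{1}{1 + R\,\tau}, $$

so, for illustrative numbers $R = 5$ kHz and $\tau = 40\ \mu$s, $f_{\mathrm{live}} = 1/(1+0.2) \approx 83\%$. Multi-event buffering shrinks the effective $\tau$ from the full digitization time to just the sampling time, and the busy only stays asserted when all buffers are full.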

28 The Multi-Event Buffering Effect

29 Upgrades
All new detectors have electronics with high rate capability. However, the older detector readout limits the Level-1 rate, with no way to upgrade any time soon ($$$$$). We will need to focus on rare events more. (Diagram, the CMS trigger chain for comparison: detectors feed front-end pipelines at 40 MHz; Lvl-1 reduces this to 100 kHz into the readout buffers; a switching network and processor farms run the HLT down to 100 Hz.) Remember, our Lvl1 is not the LHC Lvl1... ours is before digitization, so an HLT is no solution for us. Hence: the FVTX has a Lvl1 trigger "hookup" for displaced-vertex triggers, and the other upgrade is a trigger to begin with (W → muon).

30 W Production Basics
(Diagram: u and d quarks annihilating to a W; no fragmentation!) Since the W is maximally parity violating, large Δu and Δd translate into large measured asymmetries. A similar expression for the W− gives access to Δū and Δd.
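For reference, the standard leading-order expression behind this statement (a textbook result, not transcribed from the slide; the overall sign depends on helicity conventions) is

$$ A_L^{W^+} = \frac{\Delta u(x_1)\,\bar{d}(x_2) - \Delta\bar{d}(x_1)\,u(x_2)}{u(x_1)\,\bar{d}(x_2) + \bar{d}(x_1)\,u(x_2)}, $$

and the analogous W− expression, with u and d swapped, is what gives access to Δū and Δd.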

31 Muon Trigger and RPC upgrade
(Diagram: u + d → W → μ±; RPCs in the muon arms.) W physics with polarized protons: the trigger will allow us to enhance the sample of high-momentum, straight-track muons, and the RPC adds timing to reduce the large background from non-collision muons (beam, cosmics).

32 FVTX
A fitted track provides a DCA to the primary vertex (measured by the central-arm barrel VTX detector); prompt muons point back to it. Pinpoint the decay vertex to eliminate backgrounds! The endcap detects the following through the displaced vertex (Δr, Δz) of the muons:
D (charm) → μ + X
B (beauty) → μ + X
B → J/ψ + X → μ+ μ−
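As a toy illustration of the displaced-vertex idea (not PHENIX code), the DCA of a straight-line track to the primary vertex in the transverse plane is just the component of the vertex-to-track vector perpendicular to the track direction:

```cpp
// Illustrative sketch: distance of closest approach of a straight-line
// track to the primary vertex in the transverse plane. A displaced-vertex
// selection would keep muons whose DCA exceeds some threshold.
#include <cmath>

struct Vec2 { double x, y; };

// Track parameterized as a point p0 plus a direction d.
double dca_to_vertex(Vec2 p0, Vec2 d, Vec2 vtx) {
    const double rx = vtx.x - p0.x;
    const double ry = vtx.y - p0.y;
    const double norm = std::hypot(d.x, d.y);
    return std::fabs(rx * d.y - ry * d.x) / norm;  // |cross product| / |d|
}
```

Prompt muons give a DCA near zero, while muons from D and B decays populate the tail; that separation is what the (Δr, Δz) selection exploits.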

33 FVTX Section View (slide: Jon S. Kapustinsky, IEEE NSS-MIC, Dresden, 21 October 2008)
Four disks of Si sensors in the acceptance of each muon arm; microstrips accurately measure the R coordinate of the track. Two endcap halves; scheduled to be installed in FY11. (Figure labels: 80 cm; barrel; ½ of one endcap; ½ disks.)