Online View and Planning LHCb Trigger Panel Beat Jost Cern / EP.

1 Online View and Planning, LHCb Trigger Panel, Beat Jost, CERN / EP, 16 March 2003

2 Architecture (Reminder)

3 Features
- Support of Level-1 and HLT on a common infrastructure
- Two separate multiplexing layers, reflecting the different data rates of L1 and HLT
- Common CPU farm for both traffic streams
- Scalable, e.g. an L1 upgrade to include the trackers
- Separate storage network to decouple the data flows
- Driven by trigger needs

4 Physical Implementation – Rack Layout in D1
- Room for 50 racks
- All racks identical
- Up to 2300 1U boxes (≤ 46 boxes/rack)
- Up to 150 subfarms (2-3 subfarms/rack)
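For orientation, a minimal sketch of the capacity arithmetic behind these numbers; the per-rack limits are taken from this slide, and the variable names are illustrative only:

```python
# Capacity arithmetic for the D1 farm racks, using the limits quoted on this slide.
RACKS = 50             # room for 50 identical racks in D1
BOXES_PER_RACK = 46    # <= 46 1U boxes per rack
SUBFARMS_PER_RACK = 3  # 2-3 subfarms per rack; take the upper value

max_boxes = RACKS * BOXES_PER_RACK        # 2300 1U boxes
max_subfarms = RACKS * SUBFARMS_PER_RACK  # 150 subfarms

print(f"max 1U boxes: {max_boxes}, max subfarms: {max_subfarms}")
```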

5 Physical Implementation – Farm Rack
- Up to 46 1U servers in one 59U rack
- 2 (3) subfarms per rack
- 2 (3) data switches and one controls switch
- 1 patch panel for the external connections (total of 9 links)
- Upgradeable to 3 subfarms per rack
- 2 Gb/s input bandwidth, upgradeable to 4 Gb/s
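A rough sanity check of what the quoted rack input bandwidth means per server, assuming the bandwidth is shared evenly across a fully populated rack (the even-sharing assumption is mine, not stated on the slide):

```python
# Per-server share of the rack input bandwidth, assuming an even split
# across all 46 servers in the rack (assumption made for illustration).
SERVERS_PER_RACK = 46
RACK_INPUT_GBPS = 2.0  # 2 Gb/s, upgradeable to 4 Gb/s per the slide

for gbps in (RACK_INPUT_GBPS, 2 * RACK_INPUT_GBPS):
    per_server_mbps = gbps * 1000 / SERVERS_PER_RACK
    print(f"{gbps:.0f} Gb/s rack input -> ~{per_server_mbps:.0f} Mb/s per server")
```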

6 Physical Implementation – Rack Layout in D2
- 5 SFC racks
- 1 rack for the Readout Network
- Many racks for patch panels

7 Physical Implementation – Farm Controls
- Building block of 9U height, consisting of:
  - 4 (upgradeable to 5) SFCs
  - 2 controls PCs
  - 1 patch panel
  - 2 spare spaces for upgrading the number of subfarms
  - One storage switch aggregating the output data from the SFCs
  - Patch panel for the storage uplink: 1 (2) Gb Ethernet link(s) per crate
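To make the composition concrete, a hypothetical data-structure sketch of one 9U controls building block; only the counts come from the slide, the class and field names are mine:

```python
from dataclasses import dataclass


@dataclass
class ControlsBuildingBlock:
    """One 9U farm-controls building block as described on this slide.

    Counts are taken from the slide; the field names are illustrative.
    """
    height_units: int = 9         # 9U crate
    sfcs: int = 4                 # 4 SFCs, upgradeable to 5
    controls_pcs: int = 2         # 2 controls PCs
    patch_panels: int = 1         # 1 patch panel
    spare_slots: int = 2          # spare spaces for additional subfarms
    storage_switches: int = 1     # aggregates the output data from the SFCs
    storage_uplinks_gbe: int = 1  # 1 (2) Gb Ethernet uplink(s) per crate


block = ControlsBuildingBlock()
print(block)
```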

8 Plans
- Mid 2004
  - Install UTP links between D3 and D2 and between D2 and D1 (patch panel to patch panel), ~2500 links
  - Fibre links between the surface and the underground area
- End 2004 / beginning 2005
  - Install basic computing infrastructure
    - Servers in the computer room
    - Basic links top-bottom (initially Gb Ethernet, later probably 10 Gb Ethernet)
  - Acquisition of the first basic network equipment and a rudimentary CPU farm (basically the HLT system)
- During 2005
  - Commissioning of the DAQ (actually online) system
- Beginning 2006
  - Preparation of the acquisition of the final readout network and CPU farm
- Mid 2006
  - Installation of the final infrastructure (farm + switch)

9 Open Questions
- Exact composition of the L1 trigger
  - So far assumed: VeLo, TT, L0DU
  - Others?
- Is data from the Readout Supervisor ever needed in L1?
  - Information available at L0:

    L0 bunch current         8 bits
    Bunch ID (RS)           12 bits
    GPS                     40 bits
    Detector status         24 bits
    L0 Event ID             24 bits
    Trigger type             3 bits
    L0 Force bit             1 bit
    Bunch ID (L0DU)         12 bits
    BX type                  2 bits
    L0 synch error           1 bit
    L0 synch error forced    1 bit

- Need to know where (in which racks) the SD electronics will reside (needed for the network connections)
- Need to know event sizes (per FE board, preferably)
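To illustrate the bookkeeping, a sketch that lists the L0 information fields from the table above and packs them into a single word; the field widths are from the slide (they sum to 128 bits), while the field order, naming, and packing direction are assumptions made for this example:

```python
# L0 information fields and their widths in bits, as listed on this slide.
# Field order, names, and packing direction are illustrative assumptions.
L0_FIELDS = [
    ("l0_bunch_current", 8),
    ("bunch_id_rs", 12),
    ("gps", 40),
    ("detector_status", 24),
    ("l0_event_id", 24),
    ("trigger_type", 3),
    ("l0_force_bit", 1),
    ("bunch_id_l0du", 12),
    ("bx_type", 2),
    ("l0_synch_error", 1),
    ("l0_synch_error_forced", 1),
]

assert sum(width for _, width in L0_FIELDS) == 128  # the listed fields add up to 128 bits


def pack_l0_word(values: dict) -> int:
    """Pack the named field values into one integer, first field in the
    most significant bits (direction chosen arbitrarily here)."""
    word = 0
    for name, width in L0_FIELDS:
        value = values.get(name, 0)
        if value >= (1 << width):
            raise ValueError(f"{name} does not fit in {width} bits")
        word = (word << width) | value
    return word


# Example: pack a word with only an event ID and a trigger type set.
print(hex(pack_l0_word({"l0_event_id": 0x123456, "trigger_type": 5})))
```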

