1 TDAQ Report: LVL1 Calo & Muon, DAQ & HLT, DCS

2 Level-1 Calorimeter Trigger

Success in test-beam
- Internally: all modules in the Calorimeter Trigger logic chain functioned
- Fast serial links to CPM and JEM worked well, verifying:
  - BC-MUX scheme for CPM (halves the number of links)
  - PPM formation of jet elements
- Produced e.m., jet, and total-energy triggers
- Integrated with:
  - LAr and Tile Calorimeters, via TileCal patch-panels and receivers
  - Central Trigger Processor
  - Region-of-Interest Builder
  - ATLAS DAQ, via RODs and ROS
  - ATLAS run-control, etc.

Plot annotations: single-tower saturation level; 20 GeV e.m. trigger threshold; measured slope ~0.55, to be corrected by the e/h ratio (x1.5) and by using ET rather than E (x1.25), giving a slope of ~1. (A quick check of these factors is worked out below.)
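
A quick arithmetic check of the quoted correction factors (simply the product of the numbers on the slide, nothing more):

$$ 0.55 \times \underbrace{1.5}_{e/h\ \text{ratio}} \times \underbrace{1.25}_{E_T\ \text{rather than}\ E} \;\approx\; 1.03 \;\approx\; 1 $$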

3 PreProcessor

Preprocessor Module
- Control problems seen in test-beam have been understood and corrected
- Will build improved prototype
- Could do early installation using existing prototypes if necessary

Multi-chip modules (MCMs)
- Tested successfully under severe vibration and temperature cycling (they even work at 120°C!)
- PRR for ASIC/MCM assembly was held in January

ASICs
- Production ASIC wafers have now been series tested, and in principle enough (~4400) have passed for the final system
- Early wafers had yields ~60%, as expected (process, die size)
- Recent batches have quality-control problems: yield lower and very inconsistent
- Being discussed with the manufacturer; more will be made

Diagram labels: 4 AnIn, 16 MCM, LVDS

4 Cluster & Jet/Energy Processors

Cluster Processor Module
- Latest prototypes look very good
- FDR planned for March

Jet/Energy Module
- Needs only minor design changes and a few more system tests
- FDR planned for April

Common Merger Module
- FDR was in September
- Passed the system tests requested by the FDR: 6 CPMs, 2 JEMs, 2 CMMs running in a crate with maximum backplane traffic; crate-to-crate merging
- PRR planned for 28 February

Photo labels: System CMM, Crate CMM, CPMs, JEMs

5 Analogue Receivers, Custom Backplane, and ROD

Receivers
- Production TileCal receivers (2 crates) now built and being shipped to CERN from Pittsburgh
- LAr receivers (6 crates) will follow at ~one crate-full per month

Backplane
- FDR for the backplane was on 6 December (no PRR needed)
- Preparing for production (6 needed for CP and JEP)

ROD
- Standalone tests of the first 9U ROD have gone well
- Second module is now assembled
- Firmware writing (11 variants!) is going well

6 Commissioning and Calibration

Held joint Calorimeter/Trigger workshop on installation, commissioning and calibration on 1 Feb. Aimed at initiating and reviving discussions on:
- Joint installation and commissioning plans, including testing
- Calibration requirements and procedures, mainly during the installation phase and leading to normal running
- Infrastructure needs, including DAQ etc.

Must agree on responsibilities in boundary areas between calorimetry and trigger
- Will set up working group of responsible people, etc.

Must agree on how to progress
- Who designs and writes which software, and to what timetable?
- Operational responsibilities: who takes care of what?
- What information is in which database, and how is it accessed and controlled?

7 Endcap Muon Trigger

- Full chain of the prototype system has been operating for some time in lab tests and, last year, at H8 with 25 ns bunch-structure beam
- The PS boards used Version 4 of the "SLB" ASIC
  - A minor error in the readout part of the ASIC at high rate was understood, and revised versions were submitted towards the end of last year
- Recently received prototypes of the new version (Version 4 ECO2) of the SLB ASIC
  - It is fully functional and the problem with the previous version has been solved
- Now have fully-functional versions for all ASIC types in the end-cap muon trigger system

8 Barrel Muon Trigger

On-detector electronics related to the trigger:
- Splitter boxes
  - Production and testing well advanced; supply not a problem for detector integration
- Pad boxes
  - Depend crucially on the Coincidence Matrix Array ASIC
  - Final motherboard prototype under evaluation at CAEN
  - Production of Pad-or mezzanine boards completed
- Cabling (within and between stations)
  - Requires a lot of detailed design work

9 CMA ASIC

- Original prototype ASIC worked according to specification, but the programmable delay range had to be extended for the final detector cabling
- Redesigned ASIC submitted last September
  - Small functional changes, but a major redesign
- Packaged prototypes received by the ASIC testing company
  - Last week we received encouraging news: all test vectors passed without error
- Must now evaluate the prototypes in the Pad test system
  - Essential to validate the ASIC in the system as well as with test vectors before drawing final conclusions: slice test in Rome and cosmic-ray stand in BB5
- Then place the order for main production as soon as possible

10 Pads

- A small number of Pad boxes equipped with the original ASIC are available for testing RPC detector assemblies
  - Number limited by availability of prototype ASICs (old version)
  - 8 boxes now available in BB5
- Some more Pad boxes will become available soon, equipped with the new prototype ASICs
  - Boards for 10 additional boxes available
- Schedule is extremely critical for delivery of the Pad boxes in production quantities
  - Main installation of the RPC system starts late August
  - Plan prepared to ensure that Pad electronics become available very quickly after delivery of the ASICs
  - TC organized a review of the plan last November
  - A big effort is being made to bring in extra manpower during this critical period

11 ROBIN

- TDAQ component that receives and buffers the data from the RODs
- Final prototypes produced in January 2005
- PRR scheduled for 1 March 2005: preparations completed, including documents
- Subject to the PRR, pre-series production (50 boards) to start in March 2005
- Planning foresees volume production (650 units) to be completed in 3Q05
- 200 boards to be installed and commissioned in 50 ROSs before end 2005
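
A toy sketch of the buffering role described above, assuming nothing about the real ROBIN hardware or software interface (class and method names here are invented for illustration): fragments arriving from the RODs are stored keyed by their Level-1 event ID and later served to requesters or released.

    #include <cstdint>
    #include <map>
    #include <optional>
    #include <vector>

    // Toy model of the ROBIN's buffering role (illustrative only; not the real interface).
    class ToyRobinBuffer {
    public:
        using Fragment = std::vector<uint32_t>;

        // Store a fragment arriving from a ROD, keyed by its Level-1 event ID.
        void store(uint32_t l1id, Fragment frag) { buffer_[l1id] = std::move(frag); }

        // Serve a fragment to a requester (LVL2 or Event Builder) if it is still buffered.
        std::optional<Fragment> request(uint32_t l1id) const {
            auto it = buffer_.find(l1id);
            if (it == buffer_.end()) return std::nullopt;
            return it->second;
        }

        // Release fragments once the event has been rejected or fully built.
        void clear(const std::vector<uint32_t>& l1ids) {
            for (uint32_t id : l1ids) buffer_.erase(id);
        }

    private:
        std::map<uint32_t, Fragment> buffer_;  // L1 ID -> buffered fragment
    };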

12 Overview of ROD Crate DAQ

RCD provides an application framework to interface the RODs for:
- Control/Configuration
- Monitoring (statistics/event sampling)
- Data readout (through the VME bus)
- Synchronous readout of multiple crates is provided

Commissioning phase 1 will be largely based on RCD (not all ROS units are available yet)

Development history:
- 2003: initial implementation based on ROS software; design mostly driven by ROS requirements
- 3Q04: completed first major upgrade; largely used in the Combined Test Beam for ROD configuration/control
- 1Q05: completed second major upgrade, based on CTB feedback; new code included in the TDAQ-01-01-00 release

Diagram: RCD connected to several RODs, exchanging commands and data.
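
To illustrate the framework idea (a hedged sketch only; the interface below is invented and is not the actual RCD API), detector-specific ROD code would implement a common set of hooks that the framework drives for configuration, control, VME readout and monitoring:

    #include <cstdint>
    #include <vector>

    // Illustrative skeleton of the plugin idea behind an RCD-like framework
    // (names invented for this sketch): detector code implements a common
    // interface, and the framework drives configuration, control and readout.
    class RodModuleInterface {
    public:
        virtual ~RodModuleInterface() = default;
        virtual void configure() = 0;                      // load registers, thresholds, ...
        virtual void start() = 0;                          // enable data taking
        virtual void stop() = 0;
        virtual std::vector<uint32_t> readFragment() = 0;  // read one event fragment over VME
        virtual void publishStatistics() = 0;              // counters for monitoring
    };

    // A detector group would provide something like this for its own ROD type.
    class MyDetectorRod : public RodModuleInterface {
    public:
        void configure() override { /* write configuration registers over VME */ }
        void start() override { /* enable the ROD's event processing */ }
        void stop() override { /* disable and drain */ }
        std::vector<uint32_t> readFragment() override {
            return {};  // in reality: block-transfer the fragment from the ROD's output buffer
        }
        void publishStatistics() override { /* push event/error counters to monitoring */ }
    };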

13 Latest RCD changes

- Introduced an alternative user API for data readout (the old one is still supported)
- Introduced handling of VME interrupts
- Enhanced configuration mechanism
- More flexible publishing mechanism
- Introduced a multi-crate Event Builder
- Architecture enhancements are documented in ATL-DQ-ES-0066
- Detector representatives are being contacted individually to schedule software upgrades

14 Progress in Measurements & Analysis Group

Performance measurements in testbeds, in parallel with discrete-event simulation modelling:
- Predict the behaviour in the testbed
- Extrapolate to the final system size
- Suggest optimizations for the final system

The TDR proposed separate main switches for LVL2 and EB traffic. Optimisation: mix L2 and EB nodes on 2 data switches
- Better performance: traffic is moderated in both switches, and queue sizes are smaller in the mixed network
- More reliable system: the EB and L2 systems are divided in 2
- More flexibility: bigger systems possible in testbeds

(A toy illustration of why mixing the two traffic classes helps is sketched below.)
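
A toy discrete-time queueing sketch of the mixing argument (all rates and burst sizes below are invented for illustration; this is generic statistical multiplexing, not a model of the real dataflow): with dedicated switches a burst of one traffic class piles up on its own switch while the other idles, whereas splitting both classes across both switches averages the bursts and lowers the peak queue occupancy.

    #include <algorithm>
    #include <cstdio>
    #include <random>

    // Toy comparison of "dedicated" vs "mixed" switches for two bursty traffic
    // classes (L2 and EB). Each switch drains a fixed number of packets per tick.
    int main() {
        std::mt19937 rng(42);
        // Bursty arrivals: occasional large bursts, same mean for both classes.
        auto burstyArrivals = [&rng]() {
            std::uniform_real_distribution<double> u(0.0, 1.0);
            return (u(rng) < 0.2) ? 40 : 4;  // burst of 40 packets 20% of the time, else 4
        };
        const int serviceRate = 12;          // packets each switch forwards per tick
        const int ticks = 100000;

        long dedicatedA = 0, dedicatedB = 0, mixedA = 0, mixedB = 0;
        long peakDedicated = 0, peakMixed = 0;

        for (int t = 0; t < ticks; ++t) {
            int l2 = burstyArrivals();
            int eb = burstyArrivals();

            // Dedicated: switch A carries only L2 traffic, switch B only EB traffic.
            dedicatedA = std::max(0L, dedicatedA + l2 - serviceRate);
            dedicatedB = std::max(0L, dedicatedB + eb - serviceRate);
            peakDedicated = std::max({peakDedicated, dedicatedA, dedicatedB});

            // Mixed: each switch carries half of each class, averaging the bursts.
            mixedA = std::max(0L, mixedA + (l2 + eb) / 2 - serviceRate);
            mixedB = std::max(0L, mixedB + (l2 + eb + 1) / 2 - serviceRate);
            peakMixed = std::max({peakMixed, mixedA, mixedB});
        }

        std::printf("peak queue, dedicated switches: %ld packets\n", peakDedicated);
        std::printf("peak queue, mixed L2+EB:        %ld packets\n", peakMixed);
        return 0;
    }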

15 System scalability

LVL2 nodes not running algorithms, driving the DAQ as fast as they can:
- ROS: 30% of the final request rate
- SFI: 41% of Gigabit bandwidth
- L2PU: 17 times more RoI rate

16 HLT/DAQ Pre-series

- Fully functional, small-scale version of the complete HLT/DAQ system
- Equivalent to a detector's "module 0"
- Purpose and scope of the pre-series system:
  - Pre-commissioning phase:
    - To validate the complete, integrated HLT/DAQ functionality
    - To validate the infrastructure needed by HLT/DAQ at Point 1 (note: it will be provisionally installed at Point 1, in USA15 and SDX1)
  - Commissioning phase:
    - To validate a component (e.g. a ROS) or a deliverable (e.g. a Level-2 rack) prior to its installation and commissioning
  - TDAQ post-commissioning development system:
    - Validate new components (e.g. their functionality when integrated into a fully functional system)
    - Validate new software elements or software releases before moving them to the experiment

17 Pre-Series (5.5 racks, in USA15 and SDX1)

ROS, L2, EFIO and EF racks each have one Local File Server and one or more Local Switches.

- One Switch rack (TDAQ rack): 128-port GEth for L2+EB
- One ROS rack (TC rack + horizontal cooling): 12 ROS, 48 ROBINs
- One full L2 rack (TDAQ rack): 30 HLT PCs
- Partial Supervisor rack (TDAQ rack): 3 HE PCs
- Partial EFIO rack (TDAQ rack): 10 HE PCs (6 SFI, 2 SFO, 2 DFM)
- Partial EF rack (TDAQ rack): 12 HLT PCs
- Partial ONLINE rack (TDAQ rack): 4 HLT PCs (monitoring), 2 LE PCs (control), 2 Central File Servers
- RoIB rack (TC rack + horizontal cooling): 50% of RoIB

18 Inside SDX1

19 Pre-Series

20
- Work packages for installation and commissioning in Point 1 being defined now
- Installation subject to communicated delivery dates
- Work packages to be discussed with Technical Coordination asap
- Planning of exploitation, operations & maintenance in progress

21 SysAdmin Task Force

Active since mid December 2004. Goal: prepare a proposal for node system administration & management at Point 1.

Task Force members: Andre DosAnjos, Gökhan Ünel, Haimo Zobernig, Lucian Leahu, Luis Bolinches, Marc Dobson, Marius Leahu, Matthias Wiesmann, Stefan Stancu

Topics addressed so far:
1. Users / authentication
2. Booting / OS / images
3. Software installation
4. File systems
5. Farm monitoring
6. Networking in general
7. Remote access to nodes

Currently: collecting input and discussing a draft document with various people
Soon: produce an EDMS note

22 Large Scale System Tests

- Data challenges for control aspects of the HLT/DAQ system
- Annual exercise for the last 3-4 years, with increasing numbers of processors
- Tests this year planned in Canada, UK and CERN, following on from last year's tests
- CERN tests on LXSHARE in the June/July timeframe, in agreement with IT and discussed in the LCG PEB
  - 1 month, with the number of processors increasing from 200 to ~1,000

23 HLT Progress

Recent HLT workshop in Barcelona
- Reviewed status and plans for the various components required to integrate and test the HLT selection s/w:
  - Infrastructure issues related to HLT selection
  - HLT core s/w and plans
  - Selection system performance
  - Trigger configuration
  - Testbeds and commissioning
  - Monitoring in PT (Athena)
- Follow-up established in particular on:
  - Timing & performance measurement plans
  - Design review of core selection s/w
  - Trigger configuration
  - HLT commissioning

Active participation of one of the Offline commissioners
- Preparation of various aspects of commissioning of the HLT
- Understand HLT needs of detectors for their commissioning
- HLT planning for the pre-series

24 HLT Issues

S/w stability and areas of interface with offline
- Software testing
- Modularity (complexity & dependencies): considerable recent progress with ID s/w
- Data preparation (recent discussions on LAr improvements in the calorimeter trigger software workshop)
- Evaluate timing performance of the full system and isolate (and replace if necessary) elements with insufficient performance; particularly critical for LVL2
- Established regular technical discussions with offline to clearly identify areas which need improvement, then plan & execute the work
- Priority has to increase in detector s/w for HLT (Steinar's talk)

Trigger configuration
- Work to gain a complete picture of the requirements of trigger configuration (LVL1/HLT)
- Common LVL1 & HLT trigger configuration prototype using condDB in progress

25 CTB HLT Trigger Studies

- Study electron/pion separation on CTB data using calorimeter and Inner Detector information from Level-2 trigger algorithms
- Compare results with CTB simulated data

Preliminary results of electron identification using LVL2 calorimeter information
- Most TB runs were not taken with the LAr and Tile RODs in physics mode; therefore T2Calo has been modified to make use of the offline calorimeter cells
- Physics-mode runs will be looked at in the future

Electron selection strategy:
- Beam instrumentation (Cherenkov, muTag, muHalo)
- Calorimeter information (ET, hadronic leakage, shower-shape variables)
- TRT information (number of tracks, hits, EM cluster-track matching)

(A toy sketch of such a cut-based selection is given below.)
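
A toy version of such a cut-based selection (illustrative only: the ET and muTag thresholds are the example values quoted on the following plots, while the hadronic-leakage and TRT requirements are placeholders, not the cuts actually used in the analysis):

    // Toy cut-based electron selection for the CTB study (illustrative only).
    struct Candidate {
        double emEt;        // T2Calo EM transverse energy [MeV]
        double hadLeakage;  // hadronic energy behind the EM cluster / EM energy
        double muTag;       // beam-instrumentation muon-tag counter value
        int    trtTracks;   // number of TRT tracks matched to the EM cluster
    };

    bool passesElectronSelection(const Candidate& c) {
        if (c.muTag >= 460.0)    return false;  // muTag < 460 rejects beam muons (from the plots)
        if (c.emEt < 3000.0)     return false;  // ET > 3 GeV also rejects muons (from the plots)
        if (c.hadLeakage > 0.05) return false;  // placeholder hadronic-leakage cut
        if (c.trtTracks < 1)     return false;  // placeholder: require a matched TRT track
        return true;
    }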

26 [Plots: 50 GeV e- beam, Run 2102410. T2Calo EM ET (MeV) distributions for the different beam particles: a cut of ET > 3 GeV rejects muons. MuTag vs. T2Calo EM ET (MeV): a cut of muTag < 460 rejects muons.]

27 Muon sagitta reconstructed by muFast at H8

- Straight muon beam of 250 GeV
- MOORE reconstruction: sigma ≈ 60 μm
- The reconstructed sagitta is shifted by 300 μm because muFast doesn't make use of alignment corrections
- The width could be due to a wrong calibration: checks ongoing

[Plot: sagitta distributions from muFast and MOORE, in mm.]

(The three-station sagitta definition assumed here is written out below.)
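
For reference, a minimal statement of the quantity being plotted, assuming the usual three-station definition of the sagitta (not spelled out on the slide): with track coordinates $y_1, y_2, y_3$ measured at positions $z_1, z_2, z_3$ in the inner, middle and outer MDT stations,

$$ s \;=\; y_2 \;-\; \left[\, y_1 + (y_3 - y_1)\,\frac{z_2 - z_1}{z_3 - z_1} \right], $$

i.e. the deviation of the middle-station point from the straight line through the other two. For a straight muon track the expectation is $s = 0$, so the observed 300 μm mean offset directly reflects the relative (mis)alignment of the chambers.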

28 PESA Performance

First meeting to re-focus the on-line Physics and Event Selection Validation and Performance activities
- Aims to address coherently the physics performance of the on-line selection in the areas of electrons and photons, muons, jets / taus / ETmiss, b-tagging, B-physics
- Building on the work of the existing "vertical slices", some of them deployed also in the recent Combined Test Beam

The goal is the definition of complete Trigger Menus, validated against selected physics channels
- List of items of increasing complexity, moving from simple processes (like Z → 2e or Z → 2μ) to others capable of steering more complex menus (like H → 2e2μ, top, ...)
- Aim for a full exercise on the time scale of DC3

Prepared also for the HLT commissioning during the cosmic data taking
- Devise specific algorithms if needed (e.g. select non-pointing tracks)
- Understand detector needs and requirements, e.g. recent discussions with LAr and MDT

29 PESA Performance

Presentations about the different selection schemes to identify objects with the High Level Triggers: http://agenda.cern.ch/age?a051058
- Walk through available software (including steering of the selections)
- Emphasize areas where reconstruction, combined performance and physics groups can bring in their expertise to optimise selections and help shape the Trigger Menus

Need help to exploit selections on various data samples
- Tune cuts, add details, evaluate rates and performance

Aim in ATLAS at trigger-aware physics analyses and physics-aware trigger selection
- Analysis groups to evaluate their understanding of and sensitivity to the current trigger strategy and performance, and to propose enhancements/additions
- See Fabiola & Steinar's talk tomorrow morning

30 DCS Components

CAN system: production well advanced
- ELMB: production finished
- ELMB motherboard: prototype OK, order placed
- CAN Power Supply Unit (PSU): pre-series available, production organised by PF/ESS for 3Q05
- Other equipment in CERN stores

SW: working versions (mostly) available
- JCOP Framework for standard devices
- Finite State Machine
- Configuration DB: JCOP prototype being evaluated
- Conditions DB:
  - Using Lisbon MySQL for commissioning data
  - PVSS API manager to inject data into COOL being studied, but technical problems not yet solved
  - Subdetectors are waiting for the database(s)!

31 DCS for ID integration in SR1

Aims of the DCS prototype:
- Realistic testing of FE I/O
- Test of JCOP SW components
- Interfacing to external services
- Stand-alone operation of sub-systems
- Integrated operation of the Inner Detector
- ... service for detector construction!

The following slides are from P. Ferrari and represent the work of the ID subdetectors.

32 DCS Setup in the SR Building

[Diagram: a GCS supervisor connected over the LAN (and to the SR DAQ via DDC) to SCS masters for SCT, TRT, Pixel, the ID evaporative cooling and the CIC rack/environment; below them, LCS nodes (SCT power pack, Pixel power and environment, TRT LV/HV and cooling) read out via ELMBs on CANbus, with sensors for temperature, humidity and pressure, PLCs controlling the cooling compressor/regulators, interlock boxes, and rack monitoring.]

33 SCT Power Supply

34 TRT SCS

36 Summary

LVL1
- Good progress on module production and software
- On-detector muon trigger electronics is a critical area (ASICs, schedule)
- Focus moving to commissioning of LVL1 and aspects shared with the detectors

HLT/DAQ
- Elements needed for the first stage of commissioning now in place
- Good progress on system performance and scalability studies
- Pre-series system being purchased and installation ~on time
- HLT testbeam analyses in progress
- HLT system performance issues: effort established to isolate and improve critical elements
- Working with detectors to understand calibration requirements
- Focus PESA work on complete menus, selection performance & commissioning

DCS
- System being used widely by detectors for commissioning and testing work
- Good collaboration with the DB group, but much work remains to be done

Manpower remains a great concern in some areas of the project, in particular given the "client and server" nature of TDAQ
- Trying to address this where possible with increased coherence between TDAQ, detector software, physics & combined-performance groups

37 Backup slides

38 Mixed LVL2 & EB nodes

