1 Cosmic commissioning Milestone runs
Jamie Boyd (CERN)
ATLAS UK Physics Meeting, 11 Jan. 2008

2 Outline
Aims of commissioning runs
M3 / M4 / M5
 – Setup
 – Results
 – Problems
The Future – M6 & beyond…
Conclusions

3 Aims of commissioning runs
Main aim is to prepare the experiment for data-taking, using the only source of real data available before LHC turn-on: cosmic rays
Integrate new systems with the DAQ
Take cosmic-ray data for long periods of time – iron out the problems found
Run in factory mode – train shifters, develop ways of debugging the system effectively
Commission not only the detectors but also the trigger, DAQ, online/offline software, monitoring, Tier-0, computing infrastructure, control room...
Format has been ~1-week-long runs roughly every 2 months

4 Differences between Cosmic Data and Collision Data
There are many differences between cosmic-ray data and collision data:
 – Low multiplicity (mostly 1 track / event)
 – Low rate, ~100 Hz (<< 40 MHz)
 – Only muons (no electrons, photons, jets, taus, neutrinos)
 – Tracks and clusters not pointing at the IP
 – Different timing (track hits the top of the detector, then the bottom)
 – Hits not in phase with the LHC clock (they appear randomly within the 25 ns)

5 Differences between Cosmic Data and Collision Data
The muon systems, trackers and Tile Calorimeter are designed to detect muons, so detecting cosmic rays in these systems is in principle easy
For LAr this is not the case:
 – Muons leave a very small energy deposit in LAr
 – Because the hits are not in phase with the LHC clock, the on-board DSP (optimal filtering) cannot provide an energy measurement
 – So to get a reasonable energy measurement, LAr ships out 32 time samples and the energy is found offline
 – Paradoxically, this makes the raw-data event size for cosmics much bigger than for physics: ~10 MB / event (totally dominated by LAr)
 – This limits the LVL1 rate to the SFO bandwidth (~200 MB/s; see the estimate below)
 – It also means the LAr conditions needed to analyse the data offline are much bigger than for physics data
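As a rough illustration (my own arithmetic, using only the numbers quoted on this slide), a ~10 MB cosmic event and a ~200 MB/s SFO limit imply a sustainable LVL1 accept rate of only a few tens of Hz:

```python
# Back-of-the-envelope estimate using the numbers quoted on this slide.
event_size_mb = 10.0         # ~10 MB per cosmic event, dominated by the 32 LAr time samples
sfo_bandwidth_mb_s = 200.0   # ~200 MB/s SFO output bandwidth

max_lvl1_rate_hz = sfo_bandwidth_mb_s / event_size_mb
print(f"Maximum sustainable LVL1 rate ~ {max_lvl1_rate_hz:.0f} Hz")  # ~20 Hz
```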

6 Overview of the M-weeks
M3 (4/6 - 18/6)
 – Systems present: RPC (1/32), TGC (1/36), MDT (1/16), Tile (3/4), LAr (3/4), TRT (6/32)
 – Triggers: RPC (60 Hz), Tile (<0.1 Hz)
 – Nevts / raw data size: 2M / 9 TB
M4 (23/8 - 3/9)
 – Systems present: RPC (4/32), TGC (2/36), MDT (2/16), Tile (3/4), LAr (3/4), TRT (12/32) (SCT DAQ only)
 – Triggers: RPC (60 Hz), Tile (<0.1 Hz)
 – Nevts / raw data size: 3M / 18 TB
M5 (22/10 - 4/11)
 – Systems present: RPC (1/8), TGC (5/36), MDT (1/4), Tile (~1/2), LAr (~3/4), no ID (SCT/Pixel DAQ only)
 – Triggers: RPC (20 Hz, prescale 6), TGC (20 Hz, prescale 4), Tile (<0.1 Hz), L1Calo (<0.1 Hz)
 – Nevts / raw data size: 12M / 86 TB
System coverage and trigger rates are approximate, as they varied over the period
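A quick cross-check of these numbers (my own arithmetic, not from the slides): dividing the raw-data volume by the event count gives average event sizes of a few MB, the same order as the LAr-dominated ~10 MB/event quoted earlier, with the exact value depending on the coverage and trigger mix of each week:

```python
# Rough cross-check of the table above: average raw event size per M-week.
weeks = {"M3": (2e6, 9), "M4": (3e6, 18), "M5": (12e6, 86)}  # (events, raw data in TB)
for week, (n_events, size_tb) in weeks.items():
    print(f"{week}: ~{size_tb * 1e6 / n_events:.1f} MB/event")
# -> M3 ~4.5, M4 ~6.0, M5 ~7.2 MB/event
```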

7 M3 - Overview
4 – 18 June
Used TDAQ 1.7
First use of offline / Tier-0
 – Offline software running at Point 1 (EventDisplay / online monitoring) was the 12 series, whereas Tier-0 ran the 13 series – due to TDAQ compatibility
 – No official way to patch the offline software – used a groupArea
No ECR / BCR, so BCIDs not in sync
Monitoring was a weak point
First time the HLT ran algorithms online

8 M3 - EventDisplay

9 M4 - Overview
23 August - 3 September
Used TDAQ 1.8
Offline / Tier-0 used release X
 – AtlasPoint1 patching used for the first time
 – Allows us to know what patches are used for every run
ECR / BCR used
Lumi blocks used for the first time
Problems:
 – CTP timing changed without LAr / Tile changing their readout timing – calorimeter data bad for 5 days
   A communication problem; bad that monitoring did not spot this for 5 days!
 – MDT went out of sync wrt the other systems
   After M4 the problem was traced to the BCR occurring too close to the LVL1 Accept, and fixed
 – So no good data with both the MDTs and the calorimeters

10 M4 - EventDisplay
Can see more system coverage (MDT / TRT)
Also an improved colour scheme for the event display!

11 M5 - Overview
22 October - 4 November
TDAQ 1.8
Offline / Tier-0 used release X (AtlasPoint1 patching)
First time triggers came from L1Calo & TGC
Improved monitoring (especially for the calorimeters)
New format:
 – First week: detector assessment, 1 day per system
 – Second week: stable combined running
More detailed shifter roles and checklists
 – Shift leader & run-control shifter, as well as expert shifters from each system

12 M5 – Event Displays

13 System highlights

14 ID (TRT in M4)
Number of offline TRT tracks per run in M4: in total 20k tracks, ~10^6 hits on track
Residual width σ (within ±300 µm) ≈ 226 µm
First R(t) calibration for the TRT (a sketch of the idea is given below):
 – For each r bin, extract t(r) and fit with a Gaussian to get the average t(r)
 – Use this to update the t0 of the straw
 – Repeat until the procedure converges
No alignment corrections applied yet
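A minimal illustration of this kind of iterative t0 calibration (my own simplified sketch, not the TRT calibration code: the real procedure fits a Gaussian in each radius bin and also re-derives the R(t) relation). Here straw_hits is assumed to hold (drift_radius, measured_time) pairs for one straw and rt(r) an assumed reference r-to-t relation:

```python
import numpy as np

def calibrate_t0(straw_hits, rt, n_bins=20, r_max=2.0, max_iter=10, tol=0.1):
    """Iteratively extract a straw t0 by comparing measured drift times with a
    reference r(t) relation, bin by bin in drift radius.
    straw_hits: array of (r, t) pairs [mm, ns]; rt: callable r -> expected t [ns]."""
    hits = np.asarray(straw_hits, dtype=float)
    t0 = 0.0
    edges = np.linspace(0.0, r_max, n_bins + 1)
    for _ in range(max_iter):
        residuals = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sel = (hits[:, 0] >= lo) & (hits[:, 0] < hi)
            if sel.sum() < 5:
                continue
            # Mean measured time in this r bin (a stand-in for the Gaussian fit),
            # compared with the expected time at the bin centre.
            t_meas = hits[sel, 1].mean() - t0
            residuals.append(t_meas - rt(0.5 * (lo + hi)))
        if not residuals:
            break
        shift = float(np.mean(residuals))
        t0 += shift                # update the straw t0
        if abs(shift) < tol:       # stop once the correction has converged
            break
    return t0
```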

15 ID
SCT and (for M5) Pixel ran with DAQ-only setups
This meant they could test their DAQ and become integrated into the monitoring infrastructure (this will save debugging time when the detectors become available)
SCT online monitoring plot shown

16 LAr
Pulse shape in LAr from a muon (plot shown)
Seeing a nice cosmic signal in the LAr requires a lot of work:
 – Need calibration constants for all cells, and need to remove all noisy cells
 – For every cell, need to find the phase wrt the LHC clock and use the conditions for that phase (see the sketch below)
 – This makes the LAr reconstruction very slow and very memory intensive – it cannot be used to predict the speed or memory use of physics reconstruction
With sophisticated online monitoring in M5, LAr could see cosmic rays online within 30 minutes of data-taking!
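An illustrative sketch of the idea (my own, not the ATLAS LAr reconstruction): since cosmics arrive asynchronously, a trial phase of a reference pulse shape is scanned across the 25 ns clock period and the amplitude is taken from the best least-squares match to the recorded samples. The pulse_shape callable and sample spacing are assumptions.

```python
import numpy as np

def fit_amplitude_and_phase(samples, pulse_shape, sample_ns=25.0, n_phases=25):
    """Scan trial phases within one 25 ns clock period and, for each, fit the
    amplitude of a normalised reference pulse shape to the recorded samples by
    least squares.  Returns (best_amplitude, best_phase_ns).
    samples: pedestal-subtracted ADC samples; pulse_shape: callable t[ns] -> pulse value."""
    samples = np.asarray(samples, dtype=float)
    t_samples = np.arange(len(samples)) * sample_ns
    best = (0.0, 0.0, np.inf)  # (amplitude, phase, chi2)
    for phase in np.linspace(0.0, sample_ns, n_phases, endpoint=False):
        template = pulse_shape(t_samples - phase)
        denom = np.dot(template, template)
        if denom == 0.0:
            continue
        amp = np.dot(samples, template) / denom          # least-squares amplitude
        chi2 = np.sum((samples - amp * template) ** 2)   # goodness of the match
        if chi2 < best[2]:
            best = (amp, phase, chi2)
    return best[0], best[1]
```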

17 Tile
Online monitoring plots from the Tile calorimeter (M5):
 – Bottom left shows the timing difference
 – Bottom right shows the angle of the cosmic ray (red are reference plots)
Tile can now (as of M5) 'see' cosmic rays online within a few minutes of data-taking starting

18 Calo

19 Muons
Correlation between hit positions in the RPC and MDT
Event display from M4: MDT track segments do not point at the Tile / RPC hits, because the MDT was out of sync
Plot showing the correlation in z of hits from the MDT and RPC, showing that these see the same tracks (MDT in sync) in M5

20 Muons
The positions of the muon candidates extrapolated to the surface – the positions of the large and small access shafts are clearly visible!

21 DAQ
DAQ operations clearly got better from M3 to M5 – for example, starting / stopping a run now takes a few minutes, whereas it could take more than an hour in M3
Also, when things do go wrong the reason is more easily understood, and easier to debug – this is a big step forward!
Still need to improve the usefulness of the e-log (in my opinion)
DAQ infrastructure (RODs, ROSs, SFI, SFO...) working well
Some problems when HLT algorithms accessed RPC data, as the RPC is using non-final RODs – should be fixed for future runs
Plot: rate of SFO -> CASTOR transfer during M4 (200 MB/s is the goal in this setup)

22 High rate tests
To exercise the detector front-end readout, we carried out high-rate tests during the MX weeks – the final LVL1 rate should be ~100 kHz
Use a random trigger at high rate – push it as high as it can go
Use prescales at the HLT to keep some fraction of events for event building, while removing any deadtime from the HLT (a prescale sketch is given below)
Results from M5 (these always include CTP + DAQ):
 – MDT readout rate ~100 kHz, limited by LVL2 back-pressure
 – L1Calo ~50 kHz
 – Tile stopped at 25 kHz during the acceptance test, with a large fraction of events discarded; successful running up to 100 kHz if the L1A are uniformly distributed; complicated interaction between delays in busy propagation and buffer sizes
 – LAr ~60 kHz with the physics format (smaller event size than cosmic mode)
 – RPC not on final RODs; Pixel and SCT had no detector readout
Overall: achieved a combined run at 53 kHz L1A with LAr, MDT, TGC, L1Calo and CTP (plus Tile up to 20 kHz)
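For illustration only (not the ATLAS trigger code): a prescale of N simply keeps every N-th event that reaches it, so the downstream event-building rate is the accept rate divided by N.

```python
class Prescale:
    """Keep 1 out of every N events (a simple deterministic prescale counter)."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def accept(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True   # event kept for event building
        return False      # event dropped without further processing

# Example: a prescale of 1000 turns 50,000 accepted triggers into 50 built events,
# i.e. a 50 kHz input rate into ~50 Hz of event building.
ps = Prescale(1000)
kept = sum(ps.accept() for _ in range(50_000))
print(kept)  # -> 50
```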

23 LVL1
Diagram: LVL1 setup from M5 (CTP, MUCTPI, RPC SL, TGC SL, Tile, MBTS, L1Calo, RoIB, sub-detectors)
First use of the trigger presenter to display rates and prescales in real time
LVL1 configuration written to COOL at the end of each run (for the first time)
Data streaming done on the LVL1 result – i.e. Tile-triggered events are written to a different bytestream file than RPC-triggered events (see the sketch below)
RPC configuration not pointing in phi – want to add this for future cosmic runs (the TRT needs better LVL1 timing than is available from the Tile trigger)
L1Calo triggered the experiment for the first time in M5 (~90% of preprocessors, ~100% of processors, ~50% of readout)
Jet RoIs received and decoded by the HLT
Plot: correlation between Trigger Tower (L1Calo) and Tile hadronic energy
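A minimal sketch of streaming on the LVL1 result (illustrative only; the stream file names and the event's l1_items field are assumptions, not the actual ATLAS streaming implementation):

```python
# Route each event to an output bytestream file according to which LVL1 item fired.
# Stream names and the 'l1_items' field are illustrative assumptions.
STREAMS = {"RPC": "cosmics_RPC.data", "TGC": "cosmics_TGC.data",
           "Tile": "cosmics_Tile.data", "L1Calo": "cosmics_L1Calo.data"}

def stream_for(event):
    """Pick the first LVL1 item that fired and map it to an output file."""
    for item in event["l1_items"]:
        if item in STREAMS:
            return STREAMS[item]
    return "debug_unknownTrigger.data"

print(stream_for({"l1_items": ["Tile"]}))  # -> cosmics_Tile.data
print(stream_for({"l1_items": ["RPC"]}))   # -> cosmics_RPC.data
```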

24 HLT
Have been running HLT algorithms using information from the RPC/MDT, Tile/LAr and TRT
All configured to pass all events
In M5 the HLT was configured using the trigger tool (database)
Plot: phi distribution for tracks found in the TRT at the HLT (red = data, black = simulation)

25 Monitoring
Online monitoring
 – Runs in EF-like athenaPT processes or in GNAM
 – Automatic comparison with references (DQMF), checking >1000 histograms by the end of M5 (see the sketch below)
 – Running well by the end of M5
Offline monitoring
 – Histograms produced by the Tier-0 reconstruction
 – Scripts to merge across runs and put the results on a webpage
 – DQMF not working for offline yet
 – Some memory problems with offline histograms (some 2D histograms are >10 MB each!)
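To give the flavour of such an automatic check, here is a hedged sketch (not the actual DQMF algorithms or API): compare each monitored histogram with its reference and flag it green/yellow/red using a chi-square-like figure of merit with illustrative thresholds.

```python
import numpy as np

def compare_to_reference(hist, ref, warn=2.0, error=5.0):
    """Very simplified DQMF-style check: chi2-per-bin comparison of a monitored
    histogram with its reference, mapped to a traffic-light flag.
    The thresholds are illustrative, not the real DQMF configuration."""
    hist, ref = np.asarray(hist, float), np.asarray(ref, float)
    # Normalise to the same area so only the shape is compared.
    if hist.sum() > 0 and ref.sum() > 0:
        hist = hist * (ref.sum() / hist.sum())
    err2 = np.maximum(hist + ref, 1.0)                    # crude Poisson errors
    chi2_per_bin = np.sum((hist - ref) ** 2 / err2) / len(ref)
    if chi2_per_bin < warn:
        return "green"
    return "yellow" if chi2_per_bin < error else "red"

print(compare_to_reference([100, 200, 150], [105, 195, 148]))  # -> green
```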

26 Tier0
Tier-0 running reconstruction (the RecExCommission package) over MX-week data
Writing out: CBNT ntuple, ESD, muon calibration ntuple and monitoring histograms
Latency between the data being taken and Tier-0 processing it: 1-2 hours
Various problems found at Tier-0 (the first time the reconstruction code sees corrupt events)
Crash rate reduced as code quality improves
Reprocess the data a few weeks after the MX week, with improved code and calibrations (trying this at the Tier-1s too)
Good test of the Tier-0 system
Reconstruction time ~10 s/event – some (noisy) events take much longer!

27 Data Distribution
During M4 & M5 the raw data was distributed to the Tier-1s
These plots show the transfers during M4
This is an important test of the computing infrastructure

28 ATLAS Control Room
The ACR has been stressed by the MX weeks – many people (~10) per desk
It has become obvious that systems need to use their satellite control rooms
Web cam:

29 Combined track fit
Combined straight-line fit with TRT, RPC & MDT hits (80 hits in total)
Fit quality: chi^2/Ndof = 380 / 76
Difficult to include energy loss and multiple-scattering effects without the magnetic field
(A minimal straight-line fit sketch is given below)
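As a flavour of such a fit, here is a minimal sketch under simple assumptions (not the ATLAS combined-tracking code): fit z = a + b*y to hit positions by weighted least squares and compute chi^2/Ndof from assumed hit resolutions. This toy is 2D with 2 parameters; the slide's Ndof of 76 corresponds to 80 hits minus the 4 parameters of a 3D straight line.

```python
import numpy as np

def straight_line_fit(y, z, sigma):
    """Weighted least-squares straight-line fit z = a + b*y to hit positions
    with per-hit resolutions sigma.  Returns (a, b, chi2, ndof)."""
    y, z, sigma = map(np.asarray, (y, z, sigma))
    w = 1.0 / sigma**2
    A = np.vstack([np.ones_like(y), y]).T        # design matrix [1, y]
    W = np.diag(w)
    cov = np.linalg.inv(A.T @ W @ A)             # 2x2 parameter covariance
    a, b = cov @ (A.T @ W @ z)                   # weighted least-squares solution
    chi2 = float(np.sum(w * (z - (a + b * y)) ** 2))
    ndof = len(y) - 2                            # hits minus fitted parameters
    return a, b, chi2, ndof
```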

30 Future plans...
M6
 – Planned for 3-10 March 2008
 – Will (hopefully) include some SCT – but no pixel detectors
 – Still no magnetic field either...
After M6 the plan depends on the LHC schedule – the idea would be to go into 24/7 cosmic data-taking with the full detector & field ~1 month before the beam pipe closes
 – Need to migrate to TDAQ 1-9 and offline release 14 during this period!
Before M6 we will have system weeks in February to integrate more parts of each system (TDAQ / Muons / L1Calo / ID) (calorimeter electronics undergoing refurbishment)
Nothing happening in January, due to cooling maintenance at Point 1
The system will effectively be in continuous use from the beginning of February until after LHC running!

31 Conclusions
The M-X weeks have been very successful
Integrated nearly all systems (except Pixel, SCT & CSCs)
Many problems found and fixed
Commissioning / testing much more than just the detector systems:
 – Control room
 – Operations (shifts, e-log, ...)
 – DCS
 – DAQ / Trigger (LVL1 & HLT)
 – Online / offline software
 – Monitoring
 – Tier-0
 – Data distribution
Still much to do:
 – Run with more systems, more coverage, magnetic field
 – Exercise alignment, calibrations...
But these exercises have put us in an excellent position to get the most out of the system when we have first beam!

32 Useful information
Useful information about:
 – Running RecExCommission
 – Access to raw MX-week data
 – Access to (original & reprocessed) Tier-0 output
 – 'Good run' lists
 – Useful links to Tier-0 monitoring, the COOL run list and the e-log
Available through the Offline Commissioning wiki