1
The ATLAS Experiment Getting Ready for LHC
Physics at LHC, Cracow, 4 July 2006, P. Jenni (CERN)
2
ATLAS Collaboration (as of March 2006)
35 countries, 158 institutions, 1650 scientific authors in total (1300 with a PhD, for the M&O share)
New applications for CB decision in July: DESY and Humboldt U Berlin (Germany); SLAC and New York U (US)
Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP Annecy, Argonne NL, Arizona, UT Arlington, Athens, NTU Athens, Baku, IFAE Barcelona, Belgrade, Bergen, Berkeley LBL and UC, Bern, Birmingham, Bologna, Bonn, Boston, Brandeis, Bratislava/SAS Kosice, Brookhaven NL, Buenos Aires, Bucharest, Cambridge, Carleton, Casablanca/Rabat, CERN, Chinese Cluster, Chicago, Clermont-Ferrand, Columbia, NBI Copenhagen, Cosenza, AGH UST Cracow, IFJ PAN Cracow, Dortmund, TU Dresden, JINR Dubna, Duke, Frascati, Freiburg, Geneva, Genoa, Giessen, Glasgow, LPSC Grenoble, Technion Haifa, Hampton, Harvard, Heidelberg, Hiroshima, Hiroshima IT, Indiana, Innsbruck, Iowa SU, Irvine UC, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto UE, Lancaster, UN La Plata, Lecce, Lisbon LIP, Liverpool, Ljubljana, QMW London, RHBNC London, UC London, Lund, UA Madrid, Mainz, Manchester, Mannheim, CPPM Marseille, Massachusetts, MIT, Melbourne, Michigan, Michigan SU, Milano, Minsk NAS, Minsk NCPHEP, Montreal, McGill Montreal, FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU Moscow, Munich LMU, MPI Munich, Nagasaki IAS, Naples, Naruto UE, New Mexico, Nijmegen, BINP Novosibirsk, Ohio SU, Okayama, Oklahoma, Oklahoma SU, Oregon, LAL Orsay, Osaka, Oslo, Oxford, Paris VI and VII, Pavia, Pennsylvania, Pisa, Pittsburgh, CAS Prague, CU Prague, TU Prague, IHEP Protvino, Ritsumeikan, UFRJ Rio de Janeiro, Rochester, Rome I, Rome II, Rome III, Rutherford Appleton Laboratory, DAPNIA Saclay, Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon Fraser Burnaby, Southern Methodist Dallas, NPI Petersburg, Stockholm, KTH Stockholm, Stony Brook, Sydney, AS Taipei, Tbilisi, Tel Aviv, Thessaloniki, Tokyo ICEPP, Tokyo MU, Toronto, TRIUMF, Tsukuba, Tufts, Udine, Uppsala, Urbana UI, Valencia, UBC Vancouver, Victoria, Washington, Weizmann Rehovot, Wisconsin, Wuppertal, Yale, Yerevan
3
Construction, integration and installation progress of the ATLAS detector
ATLAS superimposed on the 5 floors of building 40. Diameter 25 m, barrel toroid length 26 m, end-cap end-wall chamber span 46 m, overall weight 7000 tons.
4
The Underground Cavern at Pit-1 for the ATLAS Detector
Length = 55 m, Width = 32 m, Height = 35 m
5
An Aerial View of Point-1
(Across the street from the CERN main entrance)
6
ATLAS overall: length ~46 m, radius ~12 m, weight ~7000 tons, ~10⁸ electronic channels, ~3000 km of cables
Tracking (|η| < 2.5, B = 2 T): Si pixels and strips; Transition Radiation Detector (e/π separation)
Calorimetry (|η| < 5): EM: Pb-LAr; HAD: Fe/scintillator (central), Cu/W-LAr (forward)
Muon Spectrometer (|η| < 2.7): air-core toroids with muon chambers
7
Physics example: H → ZZ(*) → 4ℓ (ℓ = e, μ)
The "gold-plated" channel for Higgs discovery at the LHC; a signal is expected in ATLAS after 'early' LHC operation.
(Production diagram: gg → H via a top-quark loop, with H → ZZ(*) → 4ℓ.) Picture: simulation of a H → eeμμ event in ATLAS.
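Not from the slides, but as a concrete illustration of what "4ℓ" means in practice: the signal is a narrow peak in the invariant mass of the four selected leptons, formed from their energy-momentum four-vectors. A minimal Python sketch, with made-up lepton four-vectors (all numbers are illustrative only):

```python
import math

def invariant_mass(leptons):
    """Invariant mass of a set of particles given as (E, px, py, pz) in GeV."""
    E  = sum(l[0] for l in leptons)
    px = sum(l[1] for l in leptons)
    py = sum(l[2] for l in leptons)
    pz = sum(l[3] for l in leptons)
    return math.sqrt(max(E*E - px*px - py*py - pz*pz, 0.0))

# Hypothetical lepton four-vectors (GeV), for illustration only
leptons = [
    (50.0,  20.0,  40.0,  22.4),
    (35.0, -10.0, -30.0,  15.0),
    (28.0,  15.0, -20.0, -10.0),
    (22.0, -18.0,  10.0,  -5.0),
]
print(f"m(4l) = {invariant_mass(leptons):.1f} GeV")  # ~133 GeV for these values
```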
8
Magnet System: Central Solenoid
2 T field with a stored energy of 38 MJ; integrated design within the barrel LAr cryostat.
The solenoid was inserted into the LAr cryostat at the end of February 2004 and tested at full current (8 kA) during July 2004. It is now cold and has been successfully tested in situ at a reduced current (awaiting the end-caps for field mapping at full current in August 2006).
9
Toroid system: Barrel Toroid with 8 separate coils; End-Cap Toroids with 8 coils each in a common cryostat.
Barrel Toroid parameters: 25.3 m length, 20.1 m outer diameter, 8 coils, 1.08 GJ stored energy, 370 tons cold mass, 830 tons weight, 4 T on superconductor, 56 km Al/NbTi/Cu conductor, 20.5 kA nominal current, 4.7 K working point.
End-Cap Toroid parameters: 5.0 m axial length, 10.7 m outer diameter, 2x8 coils, 2x0.25 GJ stored energy, 2x160 tons cold mass, 2x240 tons weight, 4 T on superconductor, 2x13 km Al/NbTi/Cu conductor, 20.5 kA nominal current, 4.7 K working point.
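As an aside (not on the slide), the quoted stored energies and nominal currents fix the effective inductance of each magnet through E = ½ L I². A quick sketch of that arithmetic in Python, using only the numbers above plus the solenoid figures from the previous slide:

```python
# Back-of-the-envelope check (not from the slide): the effective inductance
# implied by a stored energy E = 0.5 * L * I^2 at the nominal current I.
def implied_inductance(stored_energy_joule, current_ampere):
    return 2.0 * stored_energy_joule / current_ampere**2

print(f"Barrel toroid        : L ~ {implied_inductance(1.08e9, 20.5e3):.1f} H")  # ~5.1 H
print(f"One end-cap toroid   : L ~ {implied_inductance(0.25e9, 20.5e3):.1f} H")  # ~1.2 H
print(f"Central solenoid     : L ~ {implied_inductance(38e6,  8.0e3):.1f} H")    # ~1.2 H
```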
10
Barrel Toroid construction
Series integration and tests of the 8 coils at the surface were finished in June 2005 (BT test area; BT5 excitation tests up to 22 kA).
11
Barrel Toroid coil transport and lowering into the underground cavern
12
The first coil was installed in October 2004; the last coil was moved into position on 25 August 2005.
Latest news: the system is being cooled down in situ, and current tests will start in August 2006.
13
End-Cap Toroids
All components are fabricated, and the assembly is now ongoing at CERN. The ECTs will be tested at 80 K on the surface before installation and excitation tests in the cavern. The first ECT will move to the pit in October 2006, the second one in early 2007. Picture: the first of the two ECT cold masses inserted into the large vacuum vessel.
14
Inner Detector (ID)
The Inner Detector (ID) is organized into four sub-systems: Pixels (~0.8x10⁸ channels), Silicon Tracker (SCT) (6x10⁶ channels), Transition Radiation Tracker (TRT) (4x10⁵ channels), and common ID items.
15
Pixels
The Pixel project progresses well, but it was affected by two technical problems that require highest-priority recovery actions: corrosion leaks in the barrel cooling tubes (now under control, repair ongoing) and broken low-mass cables for the barrel services (repair/replacement strategy being put into place). At this stage the schedule is very tight and on the critical path, but all efforts are being made to have the full system ready for installation in time for May 2007. The installation schedule has been adapted so as to accommodate the late availability. (Note that the Pixel sub-system can be installed independently from the rest of the Inner Detector.) Picture: two bi-staves assembled (final) in the L1 half-shell; L1 and L2 assembly proceed in parallel.
16
The rate of assembled and fully qualified modules meets the needs for a 3-hit system in time (1744 modules needed). Assembly and integration, as well as preparation of the installation tools, proceed according to schedule. The first complete end-cap arrived at CERN a few weeks ago, and it will soon undergo cosmic-ray tests in the surface clean room SR1. Picture: three completed Pixel disks (one end-cap) with 6.6 M channels.
17
Silicon Tracker (SCT)
All four barrel cylinders are complete and at CERN. The pictures show different stages of the integration of the four barrel SCT cylinders. The cylinders were tested beforehand: 99.7% of all channels are fully functional.
18
End-cap SCT
All disks for the end-caps are finished as well. The first end-cap arrived at CERN at the end of February 2006, the second one in April 2006. Pictures: a completed end-cap SCT disk (taken by star photographer P. Ginter as art-work…) and integration work on one of the end-cap SCT cylinders.
19
Transition Radiation Tracker (TRT)
The module construction for the TRT is complete. The barrel integration has been finished for about a year. The first end-cap side (A and B wheels) has been stacked and the second side will be ready in June 2006; they are now being equipped with services. Picture: the first of the two end-cap TRTs (A- and B-type wheels) fully assembled.
20
Examples of cosmic rays registered in the barrel TRT in the Inner Detector surface clean-room facility SR1. Picture: the barrel TRT during insertion of the last modules (February 2005).
21
At the end of February 2006 the barrel SCT was inserted into the barrel TRT, and this component will be ready for the final installation in ATLAS in August 2006 after further commissioning at the surface with cosmics. Integration of the two end-caps (SCT and TRT) is ongoing for installation at the end of 2006.
22
Cosmics in the barrel TRT plus SCT
23
Inner Detector services (cables and pipes) installation in the inner bore of
the barrel cryostat
24
LAr and Tile Calorimeters
Tile barrel, Tile extended barrel, LAr hadronic end-cap (HEC), LAr EM end-cap (EMEC), LAr EM barrel, LAr forward calorimeter (FCAL)
25
LAr EM Barrel Calorimeter Commissioning at the Surface
After many years of module construction, the barrel EM calorimeter was installed in the cryostat; after insertion of the solenoid, the cold vessel was closed and welded in early 2004. A successful complete cold test (with LAr) was made during summer 2004 in hall 180 at CERN (dead channels much below 1%). Pictures: a LAr barrel EM calorimeter module at one of the assembly labs, and the LAr barrel EM calorimeter after insertion into the cryostat.
26
At the end of October 2004 the cryostat was transported to the pit and lowered into the cavern.
27
Barrel LAr and Tile Calorimeters
A cosmic-ray muon registered in the barrel Tile Calorimeter. The barrel LAr and scintillator tile calorimeters have been in the cavern in their 'garage position' (on one side, below the installation shaft) since January 2005.
28
November 4th 2005: the calorimeter barrel after its move into the centre of the ATLAS detector.
29
Latest news: cool-down completed, filled with LAr, in-situ commissioning started...
30
LAr End-Caps
The surface cold tests with LAr are finished, with very good results (dead channels well below 1%). Pictures: FCAL A before insertion; end-cap cryostat A before the insertion of the FCAL and closure; end-cap A during the surface cold tests.
31
End-Cap LAr and Tile Calorimeters
The end-cap calorimeters on side C were assembled in the cavern by the end of January 2006 and then moved partially into the ATLAS detector centre, in order to free the space under shaft C for the muon end-cap installation. The mechanical installation of the second end-cap is now finished as well. Pictures: the completed end-cap calorimeter side C, just before and after the partial insertion into the detector.
32
Calorimeter electronics
The installation of the LAr Front End (FE) electronics on the detector, as well as of the Back End (BE) read-out electronics in the control room, is proceeding according to plan. A particular concern is still the timely availability of the low- and high-voltage LAr power supplies. For the Tile Calorimeter, a control problem with the low-voltage supplies has been understood, and a corrective action is being implemented. Pictures: a LAr FE crate; the LAr barrel ROD system in USA15.
33
EM beam test results: Energy resolution
34
Impact on Higgs mass resolution
Simulations at mH = 130 GeV (plots: events vs four-lepton mass in GeV).
H → 4μ: resolution 1% (low lum.), 1.2% (high lum.); acceptance 80% within ±1.4σ.
H → 4e: resolution 1.2% (low lum.), 1.4% (high lum.); acceptance 84% within ±2σ.
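For scale (an illustration, not from the slide), the fractional resolutions translate into absolute mass widths at mH = 130 GeV as follows; the channel labels simply follow the reconstruction above, and the numbers are the quoted percentages times 130 GeV:

```python
# Illustration only: the quoted fractional mass resolutions converted to GeV
# at mH = 130 GeV (channel labels follow the slide text above).
mH = 130.0  # GeV

channels = [("H -> 4mu", 0.010, 0.012),   # (label, low-lum. fraction, high-lum. fraction)
            ("H -> 4e",  0.012, 0.014)]

for label, frac_low, frac_high in channels:
    print(f"{label:9s}: sigma ~ {frac_low * mH:.1f} GeV (low lum.), "
          f"~ {frac_high * mH:.1f} GeV (high lum.)")
```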
35
Muon Spectrometer Instrumentation
The Muon Spectrometer is instrumented with precision chambers and fast trigger chambers. A crucial component for reaching the required accuracy is the sophisticated alignment measurement and monitoring system.
Precision chambers:
- MDTs in the barrel and end-caps
- CSCs at large rapidity for the innermost end-cap stations
Trigger chambers:
- RPCs in the barrel
- TGCs in the end-caps
At the end of February 2006 the huge and long effort of series chamber production at many sites was completed for all chamber types.
36
Barrel MDTs: installation of the barrel muon stations (40% done)
A major effort is being spent on the preparation and testing of the barrel muon stations (MDTs and RPCs for the middle and outer stations) before their installation in situ. The electronics and alignment-system fabrications for all MDTs are on schedule.
37
First cosmics have been registered in situ for barrel chambers: in December 2005 in the MDTs and in June 2006 in the RPCs.
38
End-cap muon chamber sector preparations
72 TGC and 32 MDT 'Big Wheel' sectors have to be assembled. This work is in full swing in Hall 180, where previously the Barrel Toroid and LAr integration and tests were done. Pictures: a 'Big Wheel' end-cap MDT sector and 'Big Wheel' end-cap TGC sectors.
39
Preparations for the Big Wheels installation in the underground cavern have started
40
The large-scale system test facility for alignment, mechanical and many other system aspects, with sample series chamber stations in the SPS H8 beam. Shown in this picture is the end-cap set-up; it is preceded in the beam line by a barrel sector.
41
Example of tracking the sagitta measurements, following the day-night variation due to thermal effects in the chambers and structures, and two forced displacements of the middle chamber.
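Some context for why such small movements matter (an illustrative calculation, not from the slide): the muon momentum is measured from the sagitta of its track, s ≈ 0.3 B L² / (8 pT), so with representative values of B ≈ 0.5 T and a lever arm L ≈ 5 m a 1 TeV muon bends by only about half a millimetre, which is why chamber positions have to be monitored at the level of a few tens of microns:

```python
# Illustration only: sagitta of a muon track, s = 0.3 * B * L^2 / (8 * pT),
# with B in tesla, L in metres and pT in GeV, giving s in metres.
# B = 0.5 T and L = 5 m are representative values, not official ATLAS numbers.
def sagitta_mm(pT_GeV, B_T=0.5, L_m=5.0):
    return 0.3 * B_T * L_m**2 / (8.0 * pT_GeV) * 1e3  # convert m -> mm

for pT in (100.0, 1000.0):
    print(f"pT = {pT:6.0f} GeV  ->  sagitta ~ {sagitta_mm(pT):.2f} mm")
```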
42
Trigger, DAQ and Detector Control
(Trigger/DAQ architecture diagram: calorimeter, muon trigger chambers and the other detectors feed the trigger and read-out chain.) The 40 MHz bunch-crossing rate is reduced by LVL1 (hardware trigger with detector front-end pipelines, ~2.5 μs latency) to a 75 kHz accept rate into the Read-Out Drivers (RODs) and Read-Out Links (~120 GB/s into the Read-Out Buffers and Read-Out Sub-systems). The RoI Builder, LVL2 supervisor, LVL2 network and LVL2 processing units (~10 ms per event, requesting RoI data amounting to only 1-2% of each event) reduce this to ~2 kHz. The Event Builder (Sub-Farm Inputs, Dataflow Manager and event-building network, ~2+4 GB/s) and the Event Filter processors (~1 s per event) reduce the rate to ~200 Hz, with ~300 MB/s written via the Sub-Farm Output to storage.
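To make the rate numbers concrete, here is a small sketch (illustrative Python, not ATLAS software) of the rejection factor each trigger level must deliver and the resulting bandwidth to storage, assuming the ~1.5 MB average event size quoted later in the talk:

```python
# Illustrative sketch of the per-level rejection factors and storage bandwidth,
# using the rates quoted on this slide and a ~1.5 MB average event size.
bunch_crossing_rate = 40e6   # Hz
lvl1_accept         = 75e3   # Hz
lvl2_accept         = 2e3    # Hz
ef_accept           = 200.0  # Hz
event_size          = 1.5e6  # bytes

stages = [("LVL1", bunch_crossing_rate, lvl1_accept),
          ("LVL2", lvl1_accept, lvl2_accept),
          ("EF",   lvl2_accept, ef_accept)]

for name, rate_in, rate_out in stages:
    print(f"{name}: rejection ~ {rate_in / rate_out:.0f}x")

print(f"Storage bandwidth ~ {ef_accept * event_size / 1e6:.0f} MB/s")  # ~300 MB/s
```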
43
ATLAS Trigger / DAQ Data Flow
(Trigger/DAQ data-flow diagram.) Detector data from UX15 pass through the Read-Out Drivers (RODs) and ~1600 dedicated read-out links into the Read-Out Subsystems (ROSs, ~150 PCs in USA15), under control of the first-level trigger and the Timing, Trigger and Control (TTC) system. Over Gigabit Ethernet, the RoI Builder and LVL2 supervisor steer the second-level trigger, which pulls partial event data (Regions of Interest) at up to 100 kHz from the 1600 fragments of ~1 kB each; the pROS stores the LVL2 output. Accepted events are built by the Sub-Farm Inputs (SFIs) at ~3 kHz under the DataFlow Manager and passed to the HLT farms in SDX1 (of order 500 and 1600 dual-CPU nodes for LVL2 and the Event Filter, plus ~100 and ~30 further nodes and network switches); the Sub-Farm Outputs (SFOs) write the accepted ~200 Hz of events to local storage and on to the CERN computer centre.
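The bandwidth figures follow from simple arithmetic on the numbers in this diagram; a hedged sketch (illustrative values only, e.g. the 2% RoI fraction is an assumption at the upper end of the quoted 1-2% range):

```python
# Illustrative data-flow arithmetic (rounded values from this diagram; not ATLAS code).
n_links        = 1600      # read-out links
fragment_size  = 1.0e3     # bytes per fragment (~1 kB)
lvl1_rate      = 100e3     # Hz, upper limit for partial-event pulls
roi_fraction   = 0.02      # LVL2 requests only a few percent of each event (assumption)
build_rate     = 3e3       # Hz, full event building after LVL2

full_event     = n_links * fragment_size                 # ~1.6 MB
lvl2_bandwidth = roi_fraction * full_event * lvl1_rate   # partial-event pulls
eb_bandwidth   = full_event * build_rate                 # full-event building

print(f"Full event size     ~ {full_event / 1e6:.1f} MB")
print(f"LVL2 RoI bandwidth  ~ {lvl2_bandwidth / 1e9:.1f} GB/s")
print(f"Event building      ~ {eb_bandwidth / 1e9:.1f} GB/s")
```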
44
Level-1
The Level-1 system (calorimeter, muon and central trigger logics) is in the production and installation phase for both the hardware and the software. The muon trigger sub-system faces a very tight schedule for the on-chamber components, but is now proceeding satisfactorily.
45
HLT/DAQ/DCS
The High Level Trigger (HLT), Data Acquisition (DAQ) and Detector Control System (DCS) activities have continued to proceed according to plan. Large-scale system tests, involving up to 800 nodes, have further demonstrated the required system performance and scalability; scalability is particularly important for staging needs during the initial running of ATLAS. A major emphasis was put on all aspects of the HLT and DAQ software development (example: performance optimization). Components of the DCS are in fabrication or already finished (ELMB) and are already widely used, and the software components are available. The DCS is one of the first systems already in operation at Pit-1.
46
Almost 1/3 of the final Read Out System (ROS) has been installed and commissioned in the underground control room USA15. The infrastructure at Point-1 is now fully active (including a central file server and a number of local service machines with standard DAQ software, system administration and networking). The pre-series of the final system (10% of the final dataflow) is now in operation at Point-1. Picture: ROS racks for the LAr.
47
Read-out summary (ATLAS total event size = 1.5 MB; total no. of ROLs = 1600)

Inner detector   Channels    No. ROLs   Fragment size (kB)
  Pixels         0.8x10^8    120        0.5
  SCT            6.2x10^6     92        1.1
  TRT            3.7x10^5    232        1.2
Calorimetry
  LAr            1.8x10^5    764        0.75
  Tile           10^4         64        -
Muon system
  MDT            3.7x10^5    192        0.8
  CSC            6.7x10^4     32        0.2
  RPC            3.5x10^5     -         0.38
  TGC            4.4x10^5     16        -
Trigger
  LVL1           -            56        1.2
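As a cross-check (illustration only), summing links times fragment size over the rows that are complete in this table accounts for most of the quoted 1.5 MB average event size; cells missing from the source are simply left out:

```python
# Illustration: partial cross-check of the ~1.5 MB average event size by summing
# (no. of ROLs) x (fragment size) over the rows that are complete in the table;
# rows with missing cells are left out, so the sum comes out somewhat low.
readout = {
    # detector: (no. of read-out links, fragment size in kB)
    "Pixels": (120, 0.5),  "SCT": (92, 1.1),  "TRT": (232, 1.2),
    "LAr":    (764, 0.75),
    "MDT":    (192, 0.8),  "CSC": (32, 0.2),
    "LVL1":   (56, 1.2),
}

total_kB = sum(n_rols * kb for n_rols, kb in readout.values())
print(f"Sum over complete rows ~ {total_kB / 1e3:.2f} MB of the quoted ~1.5 MB")
```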
48
ATLAS Computing Timeline (2003 → 2007)
- POOL/SEAL release (done)
- ATLAS release 7 (with POOL persistency) (done)
- LCG-1 deployment (done)
- ATLAS complete Geant4 validation (done)
- ATLAS release 8 (done)
- DC2 Phase 1: simulation production (done)
- DC2 Phase 2: intensive reconstruction (re-scheduled)
- Combined test beams (barrel wedge) (done)
- Computing Model paper (done)
- Computing Memorandum of Understanding (done)
- ATLAS Computing TDR and LCG TDR (done)
- Start of Computing System Commissioning (in progress)
- Physics Readiness Documents (re-scheduled: early 2007)
- Start cosmic ray run: GO!
49
The computing and software suite has progressed on a very broad front, with a particular emphasis on making it as accessible as possible to the user community. Examples: GRID production tools; software infrastructure; detector description and graphics; framework and Event Data Model; simulation; tracking (ID and Muons) and calorimeters (LAr and Tiles); database and data management; reconstruction and physics analysis tools; distributed analysis.
Computing System Commissioning proceeds along sub-system tests with well-defined goals, preconditions, clients and quantifiable acceptance tests. Examples: full software chain from generators to physics analysis; Tier-0 scaling; calibration & alignment; trigger chain & monitoring; distributed data management; distributed production (simulation & re-processing); (distributed) physics analysis; a general 'rehearsal' of the TDAQ/Offline data flow and analysis.
ATLAS computing is fully embedded in, and committed to, the LCG framework.
50
Input from 350k jobs at 138 sites in 29 countries
51
ATLAS Installation Schedule Version 8.0
- Beam pipe in place at the end of August 2007
- Restricted access to complete the end-wall muon chambers, and global commissioning, until mid-October 2007
- Ready for collisions from mid-October 2007
52
Operation Model (Organization for LHC Exploitation) (details can be found at )
- CB, TMB, Executive Board
- ATLAS management (SP, Deputy SP, RC, TC): collaboration management, experiment execution, strategy, publications, resources, upgrades, etc.; Publication Committee, Speaker Committee
- Detector Operation (Run Coordinator): detector operation during data taking, online data quality, …
- Trigger (Trigger Coordinator): trigger data quality, performance, menu tables, new triggers, …
- Data Preparation (Data Preparation Coordinator): offline data quality, first reconstruction of physics objects, calibration, alignment (e.g. with Z→ll data)
- Computing (Computing Coordinator): core software, operation of offline computing, …
- Physics (Physics Coordinator): optimization of algorithms for physics objects, physics channels
- (Sub)-systems: responsible for the operation and calibration of their sub-detector and for sub-system-specific software
53
Conclusions
Many important milestones have been passed in the construction, pre-assembly, integration and installation of the ATLAS detector components.
Very major software, computing and physics preparation activities are underway as well, using the Worldwide LHC Computing Grid (WLCG) for distributed computing resources.
Commissioning and planning for the early physics phases have started strongly.
ATLAS is highly motivated, and on track, for first collisions in 2007.
(ATLAS expects to remain at the energy frontier of HEP for the next 10-15 years, and the Collaboration has already set in place a coherent organization to evaluate and plan for future upgrades, in order to exploit future LHC machine high-luminosity upgrades.)
(Informal news on ATLAS is available in the ATLAS eNews letter at )