Commissioning of the Silicon Strip Detector (SSD) of ALICE
Panos Christakoglou (a,b), for the ALICE Collaboration
(a) NIKHEF  (b) Utrecht University
The 2009 Europhysics Conference on High Energy Physics, Krakow
[Figure slide: "... the LHC"]
The Silicon Strip Detector (SSD)
- The SSD forms the two outer layers of the ITS.
- 72 ladders (22 or 25 modules each), 1698 modules in total (>2.5M output channels).
- Layer 5: r = 38.0 cm; Layer 6: r = 43.0 cm.
- Acceptance |η| < 1.0 (matches that of the TPC).
- z-overlap: L5: 22 modules, L6: 25 modules.
- rφ-overlap: L5: 34 ladders, L6: 38 ladders.
SSD contributions: ITS-TPC track matching
- The SSD provides dE/dx measurements and contributes significantly to the ITS-TPC track matching.
- PYTHIA simulated p+p events at √s = 10 TeV.
- Relevant radii: TPC inner radius R_in = 85 cm; SPD outer layer R_2 = 7.6 cm; SSD inner layer R_5 = 38.0 cm; SSD outer layer R_6 = 43.0 cm.
Commissioning – Milestones
- Assembly of the SSD in Utrecht and transport to CERN; the full detector was tested at CERN in the experimental area.
  o The tests showed that 16 modules of layer 5 and 9 of layer 6 were not working: 25 dead modules out of 1698 (~1.5%).
- SSD+SDD moved to the interaction point (March 2007).
  o The SSD was one of the first detectors to have all systems integrated.
- C-/A-side connection tests (July/September 2007).
- Cosmic run I (December 2007): partial cooling; only part of the C-side was powered.
- Cosmic run II (February 2008): partial cooling.
- Cooling plant upgrade (May 2008).
- Cosmic run III (June-August 2008).
  o Collected data for alignment and calibration.
- Ready for first collisions (September 2008).
  o The SSD was included in the initial detector cluster for the first LHC collisions.
- Mini-frame upgrade (April 2009).
  o Recabling of the SSD, with local tests to validate the connections.
- Cooling plant / firmware upgrades (May 2009).
- Re-commissioning phase (June 2009).
- Cosmic run IV (August 2009).
  o Collect more data for alignment and calibration.
  o Concentrate on data with magnetic field.
- First collisions at 900 GeV (?) (October 2009); first collisions at 10 TeV (?) (December 2009).
Detector Operation
Cooling system:
- Water supplied at 17 °C, so that there is no net heat transfer to the SDD and the TPC.
- The system was initially common between the SSD and the SDD (different set points, etc.), so it was decided to disentangle the two systems as much as possible.
- In April 2008 the first major upgrade resulted in a common water tank with separate control systems for the two sub-detectors.
  o The system can be controlled remotely from the ACR.
Detector Control System (DCS):
- The SSD was one of the first systems operated remotely via the DCS interface.
- Based on PVSS under a Windows platform.
- Tree-based structure that allows navigation through the different levels (sector, ladder, ...); a minimal illustration follows below.
- Automatic warnings and errors are implemented.
- Different settings (e.g. bias values) can be loaded besides the default configuration.
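The DCS itself is a PVSS application; the C++ sketch below is only meant to illustrate the tree-based navigation and the upward propagation of warnings described above. All names and values are invented for the illustration.

    #include <iostream>
    #include <memory>
    #include <string>
    #include <vector>

    // Hypothetical sketch of a DCS-like hierarchy (detector -> sector -> ladder).
    // The real SSD DCS is a PVSS application; this only shows how local status
    // flags can be aggregated up the tree to raise automatic warnings/errors.
    struct Node {
        std::string name;
        double bias = 40.0;   // example bias setting (V), purely illustrative
        bool ok = true;       // local hardware status
        std::vector<std::unique_ptr<Node>> children;

        // A node is in error if it or any descendant reports a problem.
        bool healthy() const {
            if (!ok) return false;
            for (const auto& c : children)
                if (!c->healthy()) return false;
            return true;
        }
    };

    int main() {
        Node ssd{"SSD"};
        for (int s = 0; s < 2; ++s) {                 // two example sectors
            auto sector = std::make_unique<Node>();
            sector->name = "sector" + std::to_string(s);
            for (int l = 0; l < 3; ++l) {             // a few ladders each
                auto ladder = std::make_unique<Node>();
                ladder->name = sector->name + "/ladder" + std::to_string(l);
                sector->children.push_back(std::move(ladder));
            }
            ssd.children.push_back(std::move(sector));
        }
        // Simulate a fault on one ladder and let it propagate upwards.
        ssd.children[1]->children[2]->ok = false;
        std::cout << "SSD healthy: " << std::boolalpha << ssd.healthy() << "\n";
    }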
Cosmic run statistics
- Global runs; trigger: SPD FastOR.
  o Coincidence (logical AND) between the top and the bottom half of the outer SPD layer.
  o Rate: 0.18 Hz.
- The SSD was part of the majority of the global runs (first detector in hours of data taking!), in the same cluster with the rest of the ITS detectors and the TPC.
Gain calibration
- Charge matching between p and n sides.
  o Relative calibration from 40k cosmic clusters.
  o Important to reduce noise and ghost clusters.
[Figure: p/n charge matching, Layer 5 (SSD); 11%]
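The slide does not spell out the procedure; as a rough illustration of such a relative p/n calibration, one could derive a per-module correction factor from the charge ratio of cosmic clusters. The module grouping, the use of a median ratio, and all names below are assumptions, not the AliRoot implementation.

    #include <algorithm>
    #include <cstdio>
    #include <map>
    #include <vector>

    // Illustrative sketch: derive a relative n/p gain factor per module from
    // cosmic clusters, so that after calibration Q_p ~= Q_n for the same hit.
    struct Cluster { int module; double qP, qN; };   // charge on p and n side

    // Median of the per-cluster charge ratios qN/qP for one module.
    double medianRatio(std::vector<double>& r) {
        std::nth_element(r.begin(), r.begin() + r.size() / 2, r.end());
        return r[r.size() / 2];
    }

    int main() {
        // Toy input: in real life ~40k cosmic clusters over all modules.
        std::vector<Cluster> clusters = {
            {500, 120., 132.}, {500, 80., 90.}, {500, 200., 216.},
            {501, 100., 95.},  {501, 150., 141.},
        };
        std::map<int, std::vector<double>> ratios;
        for (const auto& c : clusters)
            if (c.qP > 0.) ratios[c.module].push_back(c.qN / c.qP);

        for (auto& [mod, r] : ratios) {
            double k = medianRatio(r);   // correction: qN_corr = qN / k
            std::printf("module %d: n/p gain factor = %.3f\n", mod, k);
        }
    }

Requiring the corrected p and n charges of a cluster to match then keeps genuinely correlated strip pairs while rejecting noise and ghost combinations, which is the point made on the slide.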
Alignment
- Reasonable accuracy achieved; still some effects to explore (sag).
- Three methods to validate the SSD survey information:
  o Fit a track on one SSD layer (2 points) and look at the residuals on the other SSD layer.
  o Fit one track on the outer layer and one on the inner layer; compare the distance and the angles between the two tracks.
  o Use extra clusters from the acceptance overlaps: distance between two clusters attached to the same track on contiguous modules.
- Fit to layer 6, residuals on layer 5:
  o Data: σ_xy = 48 μm, σ_point = 48/√2 = 34 μm, σ_misal = 27 μm.
  o Ideal-geometry MC: σ_xy = 25 μm, σ_point = 25/√2 = 18 μm, σ_misal = 0 μm.
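The quoted widths follow the standard decomposition of two-layer residuals (this reasoning is implicit on the slide, not spelled out): the residual is shared equally between the two equivalent layers, and the misalignment contribution is estimated by quadratically subtracting the ideal-geometry resolution,

    σ_point = σ_xy / √2
    σ_misal ≈ √(σ_point,data² − σ_point,MC²) = √(34² − 18²) μm ≈ 29 μm

of the same size as the quoted σ_misal = 27 μm (the exact number depends on the fit).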
Quality assurance strategy
Online:
- Usage of MOOD (experts) and AMORE (shifters, fast feedback).
- Monitor the raw-data integrity as the data arrive, either from the LDC or from the GDC.
- To come: quasi-online reconstruction using the HLT cluster; monitoring of reconstructed points; implementation of automatic checks and issuing of warnings/errors (a sketch follows below).
- Noise ~ 2 keV; signal/noise ~ 40.
Offline:
- Monitor the simulation and reconstruction parts.
- During the cosmic data taking, feedback was provided on a regular basis (almost immediately after the end of each run).
- Cosmic runs turned out to be quite clean.
- Usage of pure noise runs to detect fake clusters:
  o ~10^-4 noise clusters per module per event.
  o Dependence on the trigger rate (at rates of ~MHz more noisy clusters appeared; fixed by the new firmware).
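The automatic checks are not detailed on the slide; purely as an illustration, a check against the quoted ~10^-4 noise-cluster rate could look like the sketch below. The threshold, counts, and names are hypothetical; in the real system this would live in the AMORE-based monitoring mentioned above.

    #include <cstdio>
    #include <vector>

    // Hypothetical QA check: flag modules whose noise-cluster rate exceeds a
    // threshold around the expected ~1e-4 clusters/module/event.
    int main() {
        const double kThreshold = 5e-4;    // illustrative warning level
        const long   nEvents    = 100000;  // events in the pure noise run
        // Toy counts of clusters seen per module in the noise run.
        std::vector<long> noiseClusters = {12, 9, 85, 11, 7};  // modules 0..4

        for (size_t m = 0; m < noiseClusters.size(); ++m) {
            double rate = static_cast<double>(noiseClusters[m]) / nEvents;
            if (rate > kThreshold)
                std::printf("WARNING module %zu: %.2g noise clusters/event\n",
                            m, rate);
        }
    }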
Offline commissioning
- Geometry: implemented within the ROOT TGeo framework; more than 5k lines of code.
- Detector Algorithm: used online to calculate the noise, which is subtracted inside the C.F., and also to detect bad/noisy channels, which are then treated in the reconstruction.
- Cluster Finder (C.F.): inclusion of the bad channels inside the cluster finder; correct treatment of one-side clusters, i.e. clusters formed by a combination of good/bad strips on either the p or the n side (see the sketch below).
- Simulation: pile-up treatment using the HAL25 shaping time (~2.2 μs).
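To make the bad-channel treatment concrete, here is a much-simplified one-dimensional cluster finder in the spirit described above (a sketch under stated assumptions, not the AliRoot C.F.): adjacent strips above threshold are grouped, and known bad strips are allowed to sit inside a cluster without splitting it.

    #include <cstdio>
    #include <set>
    #include <vector>

    // Simplified 1D cluster finder for one sensor side.
    // Bad strips (dead/noisy, from the Detector Algorithm) do not break a
    // cluster: a run of good strips above threshold may bridge across them.
    struct Cluster { int first, last; double charge; };

    std::vector<Cluster> findClusters(const std::vector<double>& adc,
                                      const std::set<int>& badStrips,
                                      double threshold) {
        std::vector<Cluster> out;
        int start = -1;
        double q = 0.;
        for (int s = 0; s <= static_cast<int>(adc.size()); ++s) {
            bool inRange = s < static_cast<int>(adc.size());
            bool isBad   = inRange && badStrips.count(s);
            bool fires   = inRange && (isBad || adc[s] > threshold);
            if (fires && !isBad && start < 0) start = s;  // open on a good strip
            if (fires) {
                if (start >= 0 && !isBad) q += adc[s];    // bad strips add no charge
            } else {
                if (start >= 0) out.push_back({start, s - 1, q});
                start = -1; q = 0.;
            }
        }
        return out;
    }

    int main() {
        std::vector<double> adc = {0, 0, 30, 45, 0, 0, 25, 0, 60, 0};
        std::set<int> bad = {7};   // strip 7 is known bad: it bridges 6 and 8
        for (const auto& c : findClusters(adc, bad, 10.))
            std::printf("cluster strips %d-%d, charge %.0f\n",
                        c.first, c.last, c.charge);
    }

Clusters for which only one of the two sides survives (because the other side's strips are bad) are then kept as one-side clusters, as stated above.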
SSD performance: energy-loss measurements
Simulations:
- The four outermost layers of the ITS (2 SDD + 2 SSD) contribute to the energy-loss measurement by providing dE/dx values.
- PYTHIA p+p events at √s = 10 TeV.
Cosmics:
- During the cosmic run campaigns of 2008 (field of 0.5 T), both layers of the SSD plus one SDD layer were active in the acquisition.
- Tracks reconstructed in TPC+ITS; muon expectation according to:
  o Atomic Data and Nuclear Data Tables 78 (2001) 183.
  o H. Bichsel, Rev. Mod. Phys. 60 (1988) 663.
  o H. Bichsel, NIM A 562 (2006) 154.
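How the (up to four) layer samples are combined into one dE/dx value is not shown on the slide; a common choice is a truncated mean that suppresses the Landau tail. Below is a minimal sketch under that assumption (a simple lowest-half truncation; the exact AliRoot prescription may differ).

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Truncated-mean dE/dx from the (up to 4) SDD+SSD samples of one track:
    // sort the samples and average only the lowest ones to tame the Landau tail.
    double truncatedMean(std::vector<double> dedx) {
        std::sort(dedx.begin(), dedx.end());
        size_t keep = std::max<size_t>(1, dedx.size() / 2);  // lowest half
        double sum = 0.;
        for (size_t i = 0; i < keep; ++i) sum += dedx[i];
        return sum / keep;
    }

    int main() {
        // Toy samples (arbitrary units): one layer caught an upward fluctuation.
        std::vector<double> samples = {1.1, 1.3, 1.2, 4.8};
        std::printf("truncated mean dE/dx = %.2f\n", truncatedMean(samples));
    }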
Summary - Conclusions
- The SSD was successfully commissioned during the cosmic run campaigns of 2008.
- Many GBs of cosmic data were recorded using the SPD FastOR trigger, which allowed us to perform the first part of the alignment and calibration of the detector.
- A new re-commissioning phase has just started, following the many upgrades that took place during the long shutdown.
- The new set of data to be collected in July-August will be used to further refine the detector's calibration and alignment.
- The SSD is ready for the first LHC collisions!!!
THANK YOU!!!
[Figure: 7-track 450 GeV/c event collected with circulating LHC beam 2 on Sept. 11th, 2008]
BACKUP
Inner Tracking System
Design goals:
- Optimal resolution for the primary vertex and the track impact parameter.
- Minimize the distance of the innermost layer from the beam axis (≈ 3.9 cm) and the material budget.
- Maximum occupancy (central Pb-Pb) < few %.
- 2D devices in all the layers.
- dE/dx information in the 4 outermost layers for particle ID in the 1/β² region.

Layer | Det. type | Radius (cm) | ±z (cm) | Resolution rφ (μm) | Resolution z (μm) | Pb-Pb dN/dy=6000: part./cm² | Occupancy (%)
1 | SPD | 3.9  | 14.1 | 12 | 100 | ... | ...
2 | SPD | 7.6  | 14.1 | 12 | 100 | ... | ...
3 | SDD | 15.0 | 22.2 | 35 | 25  | ... | ...
4 | SDD | 23.9 | 29.7 | 35 | 25  | ... | ...
5 | SSD | 38.0 | 43.1 | 20 | 830 | ... | ...
6 | SSD | 43.0 | 48.9 | 20 | 830 | ... | ...
SSD: Silicon Strip Detector
- Read-out chain: Sensor -> FEE -> EndCap -> FEROM -> DAQ; 8 SSD racks outside the magnet.
- ~1700 modules, ~20k chips, >5 m² of silicon, 2.6x10^6 analogue channels, 144 EndCaps.
- Water cooling; carbon-fibre supports ('cones').
- ITS-SSD: layer 5: 34 ladders x 22 modules = 748 modules; layer 6: 38 ladders x 25 modules = 950 modules.
SSD: Silicon Strip Detector
- rφ-overlap: L5: 34 ladders, L6: 38 ladders; z-overlap: L5: 22 modules, L6: 25 modules.
- Sensor: double-sided strip detector; 768 strips per side, 95 μm pitch; p-side strip orientation 7.5 mrad, n-side orientation 27.5 mrad.
- Hybrid: identical for p- and n-side; Al-on-polyimide connections; 6 front-end chips (HAL25).
- Ladder: water-cooled carbon-fibre support; module pitch 39.1 mm; Al-on-polyimide ladder cables; end-ladder electronics.
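The small stereo angle explains the very asymmetric point resolution quoted for the SSD (≈20 μm in rφ vs ≈830 μm in z). A back-of-the-envelope check, not taken from the slide: the single-strip precision is the pitch over √12, and the z coordinate is diluted by the total stereo angle of 7.5 + 27.5 = 35 mrad,

    σ_rφ ≲ p/√12 = 95 μm / √12 ≈ 27 μm
    σ_z  ≈ (p/√12) / sin(35 mrad) ≈ 27 μm / 0.035 ≈ 780 μm

in reasonable agreement with the quoted ≈830 μm; the measured rφ resolution (≈20 μm) is in fact better than p/√12 thanks to the analogue readout and charge sharing between strips.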
Module mass and radiation length
[Table: mass (g) and radiation length (% X0) at the levels: sensor, module, on ladder, as installed]
- Mass of the SSD on the TPC: 111 kg.
Measured dead time
- Read-out + conversion time: 176 μs; backpressure limits the trigger rate.
- Example: stand-alone run with 3 LDCs + 1 GDC, L2a BC trigger.
- Dead time with L1 reject: 7 μs; dead time with L2 reject: 101 μs.
- However, most ladders fail after many random triggers: LV overvoltage/overcurrent, to be investigated.
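For orientation (a back-of-the-envelope number, not from the slide), the 176 μs read-out + conversion time alone caps the sustainable trigger rate at

    f_max ≈ 1 / 176 μs ≈ 5.7 kHz

while the much shorter L1-reject (7 μs) and L2-reject (101 μs) dead times matter only for aborted trigger sequences.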
Status of the SSD around 10 Sept 2008
- 5 half-ladders not working, 8 switched off due to high current.
- 131 out of 144 half-ladders in use (~91%).
[Figure: half-ladder status map; legend: Good / High Current / Not Working / SINTEF]
Cooling performance
- Water supplied at maximum pressure (0.9 bar), at 17 °C from the cooling plant (~minimum).
- The SSD is cooling the environment; sector temperatures can be adjusted to equalize the heat flow to the ambient; the TPC and SDD to provide guidelines.
- Example: 9 March, C-side mostly off.
[Figure: sector temperature map; blocked cooling line and open connection marked]
SSD first-physics configuration
- Fraction of the surface not usable: 14%.
  o 13/144 half-ladders switched off.
  o 65/1698 bad modules on other ladders.
  o 1.5% bad strips.
[Figure: module map; white: no working module, light: one module, dark: two modules]
Dead strips and hybrids
- Dead strips are excluded from the data taking:
  o too noisy: noise > 20;
  o no signal: no noise, zero pedestal;
  o extreme pedestal: pedestal > 512.
- Depending on the cause, the strip charge may be lost or distributed over the neighbours; a dedicated treatment in the cluster finder is not implemented yet and is under study.
- Dead hybrids: the other hybrid may still be working, but then there is no cluster position, only a strip number, and no charge matching. Should the tracker use this information? At present these modules are declared fully dead.
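A minimal sketch of this classification logic, with the cut values as quoted above; the ADC-count units and all names are assumptions for the illustration.

    #include <cstdio>

    // Classify a strip from its calibration data, using the cuts quoted on the
    // slide: noise > 20 (too noisy), zero noise and zero pedestal (no signal),
    // pedestal > 512 (extreme pedestal). Units are assumed to be ADC counts.
    enum class StripStatus { Good, TooNoisy, NoSignal, ExtremePedestal };

    StripStatus classify(double noise, double pedestal) {
        if (noise > 20.)                   return StripStatus::TooNoisy;
        if (noise == 0. && pedestal == 0.) return StripStatus::NoSignal;
        if (pedestal > 512.)               return StripStatus::ExtremePedestal;
        return StripStatus::Good;
    }

    int main() {
        struct { double noise, ped; } strips[] =
            {{2.1, 30.}, {25., 40.}, {0., 0.}, {3., 600.}};
        const char* names[] = {"good", "too noisy", "no signal",
                               "extreme pedestal"};
        for (auto& s : strips)
            std::printf("noise=%5.1f ped=%5.1f -> %s\n", s.noise, s.ped,
                        names[static_cast<int>(classify(s.noise, s.ped))]);
    }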
Defects summary table (July '07)

SSD sector | # bad modules (at least 1 side not working) | % bad channels
Layer 5, Q side | 5 | 1.20%
Layer 5, V side | ... | ...%
Layer 6, Q side | 4 (+1) | 0.87%
Layer 6, V side | 6 (+5) (+1 semi-ladder) | 3.79%
SSD total | 27 (+18) | 2.01%
SSD at a glance (February '09)
- Total bad channels: 8% (211,256).
- ~4% of the total channels come from dead half-ladders. Any possibility to recover them?
- ~2.4% (62k) are the "real" bad channels.
- ~1.6% from "half-dead" ladders + dead modules + "half-dead" modules.
Alignment: Millepede
- SPD+SSD Millepede realignment at ladder level, plus survey data for the modules.
- Single-track impact parameter resolution ≈ 30/√2 ≈ 21 μm.
- Realigned (Millepede) data, B = 0: ~30 μm (ALICE Preliminary, A. Dainese).
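The √2 comes from comparing the two halves of a cosmic track (standard reasoning, implicit on the slide): the upper and lower half-tracks are independent measurements of the same impact parameter, so the spread of their track-to-track distance is √2 larger than the single-track resolution,

    σ_d0(single track) = σ_d0(track-to-track) / √2 ≈ 30 μm / √2 ≈ 21 μm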