Published by Kathlyn Hines; modified over 6 years ago
Physics at LHC: Extra dimensions, composite quarks & leptons, …?
LHC Collisions
Layout of LHC (Geneva, Switzerland)
Experiments at the LHC
pp collisions: √s = 14 TeV, L_design = 10^34 cm^-2 s^-1
Heavy ions (e.g. Pb-Pb at √s ~ 1000 TeV)
27 km ring; 1232 dipoles, B = 8.3 T (NbTi at 1.9 K)
First collisions 2007, physics in 2008
ATLAS and CMS: pp, general purpose
ALICE: heavy ions, p-ion
LHCb: pp, B physics
TOTEM
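The design luminosity above fixes the interaction rate via rate = σ × L. A back-of-envelope sketch (the ~100 mb inelastic pp cross section at 14 TeV is an assumed round number, not stated on the slide):

```python
# Rough pp interaction rate at LHC design luminosity.
DESIGN_LUMI = 1e34        # cm^-2 s^-1, from the slide
SIGMA_INEL = 100e-27      # cm^2 (~100 mb; 1 mb = 1e-27 cm^2) -- assumed value

rate = DESIGN_LUMI * SIGMA_INEL   # interactions per second
print(f"pp interaction rate ~ {rate:.0e} Hz")   # ~1e+09 Hz
```

This reproduces the 10^9 events/sec quoted for the Level-1 trigger input later in the deck.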
CMS Detector
TRACKER: pixels and silicon microstrips; 210 m^2 of silicon sensors, 9.6M channels
ECAL: 76k scintillating PbWO4 crystals
HCAL: plastic scintillator/brass sandwich
Superconducting coil, 4 Tesla
IRON YOKE
MUON BARREL: Drift Tube chambers (DT), Resistive Plate Chambers (RPC)
MUON ENDCAPS: Cathode Strip Chambers (CSC), Resistive Plate Chambers (RPC)
Wisconsin on CMS at LHC
Personnel:
Professors: D. Carlsmith, S. Dasu, M. Herndon, W. Smith
Senior Scientist: R. Loveless
Senior Electronics Engineer: T. Gorski
Scientists: P. Chumney, M. Grothe, A. Lanaro, A. Savin
Postdoc: J. Efron
Computing Professionals: D. Bradley, A. Mohapatra, S. Rader
Engineer: M. Jaworski; Technician: R. Fobes
Grad Students: J. Leonard, M. Anderson, K. Grogg, J. Klukas, C. Lazaridis, M. Weinberg, M. Bachtis, L. Gray (& you?)
PSL Engineers: F. Feyzi, J. Hoffman, P. Robl, D. Wahl, D. Wenman
PSL Draft/Tech: G. Gregerson, D. Grim, J. Pippin, R. Smith
Activities:
Calorimeter Trigger (W. Smith, CMS Trigger Coordinator)
Endcap Muon System (R. Loveless, CMS Endcap Muon Manager)
Computing/Physics (S. Dasu, US CMS Computing Adv. Bd. Chair)
Trigger & DAQ at the LHC
CMS Trigger & DAQ Systems
Dataflow: Detector Frontend → Readout Systems → Builder Networks → Filter Systems → Computing Services, steered by the Level-1 Trigger, Event Manager, and Run Control.
Level-1 Trigger requirements:
Input: 10^9 events/sec at 40 MHz at full L = 10^34
Output: 100 kHz (50 kHz for initial running)
Latency: 3 µs for collection, decision, propagation
Output goes directly to the computer filter farm (no physical Level-2)
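The requirements above imply a fixed rejection factor and a pipeline buffer depth; a quick sketch using only arithmetic on the slide's numbers:

```python
# Back-of-envelope Level-1 trigger budget from the quoted requirements.
BX_RATE = 40e6       # Hz, bunch-crossing rate
L1_OUTPUT = 100e3    # Hz, Level-1 accept rate
LATENCY = 3e-6       # s, decision latency
BX_PERIOD = 25e-9    # s, time between crossings (1 / 40 MHz)

rejection = BX_RATE / L1_OUTPUT        # = 400: crossings discarded per accept
pipeline_depth = LATENCY / BX_PERIOD   # ~120 crossings buffered while deciding
print(rejection, pipeline_depth)
```

The pipeline depth is why the frontends need ~120 crossings of on-detector buffering: every event must be held until the Level-1 decision propagates back.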
Calorimeter Trigger Crate
One of the 40 MHz pipelined systems, processing 0.4 Tbit/s.
Rear: Receiver Cards. Front: Electron, Jet, and Clock Cards.
Calorimeter Trigger Receiver Mezzanine Card (RMC) and Receiver Card (RC)
Front: receives 64 calorimeter energy sums every 25 ns over 8 Vitesse 1.2 Gbaud links on the Receiver Mezzanine link Cards; sends them to the Jet/Summary Card via adder mezzanine link cards.
Back: DC-DC converters, bar code, PHASE ASICs, BSCAN ASICs, MLUs.
Technician R. Fobes (l.) and Engineer T. Gorski (r.) test the links.
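The quoted link counts can be sanity-checked with quick arithmetic (a sketch; how sums are packed onto links is an assumption here, while the rates are from the slide):

```python
# Consistency sketch of the receiver-card input figures.
SUMS_PER_BX = 64        # calorimeter energy sums per crossing, from the slide
BX_PERIOD = 25e-9       # s, LHC bunch-crossing period
N_LINKS = 8             # Vitesse serial links per card, from the slide
LINK_RATE = 1.2e9       # baud per link

sums_per_second = SUMS_PER_BX / BX_PERIOD   # 2.56e9 energy sums/s
aggregate_baud = N_LINKS * LINK_RATE        # 9.6e9 symbols/s into the card
print(f"{sums_per_second:.2e} sums/s over {aggregate_baud:.1e} baud")
```

At roughly 10 Gbaud per card, a crate-level total of a few tenths of a Tbit/s (as quoted earlier in the deck) corresponds to a few tens of such cards.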
Work on Calorimeter Trigger
Wisconsin scientists, along with students, are involved in:
Testing electronics cards
Writing real-time control and diagnostic software
Writing detailed hardware emulation
Physics simulation of trigger signals
Left: Jet Summary Card, back and front sides, showing custom high-speed Wisconsin ASICs (Sort ASICs, Phase ASIC, BSCAN; Receiver Mezz. Card BSCAN ASICs).
Asst. Scientists P. Chumney (l.) and M. Grothe (r.) test the JSC.
UW built the endcap disks
UW installs the chambers
Dr. Dick Loveless is the Endcap Muon construction project manager. Dr. Armando Lanaro is the manager in charge of installation.
L to R: F. Feyzi, D. Wenman, A. Lanaro, J. Johnson, M. Giardoni, “Regis”, Y. Baek
UW aligns the chambers (ME+2)
Student Jeff Klukas; Prof. Duncan Carlsmith leads the Alignment Task Force.
Diagram labels: reference DCOPS, crosshair laser, Z-sensors, Z-tube, clinometer, chamber-back DCOPS, transfer plate, mounting tower, transfer-line DCOPS, R-sensor.
UW CMS Physics Activities
Wide-ranging physics interest and expertise in the group; analysis efforts are coming to the forefront as detector work finishes.
Ensure trigger systems operate at full performance at turn-on:
We studied simulated Higgs, SUSY, SM EW and QCD datasets
Designed trigger systems that capture all the important physics efficiently while satisfying DAQ bandwidth requirements
Design triggers for physics channels that are new to CMS: diffractive Higgs (H→bb), vector boson fusion Higgs (H→bb)
Full simulation and analysis for low-mass Higgs: the above channels and other VBF channels, H→ττ, H→WW*
Physics, Reconstruction & Software: Dasu is co-leader of the PRS Online Selection Group
Updating trigger emulation and developing trigger validation tools
Accumulating large physics samples with our computing effort
LHC Start: Higgs Production/Decay
Prefer to characterize decay and production in terms of a matrix rather than a list. Each YES indicates there is a study from CMS or ATLAS (or both), with a reference note, for that particular combination.
Low-mass search: H → γγ and H → ZZ* → 4ℓ are the only channels with a mass peak; very good mass resolution, ~1%. H → WW* → 2ℓ2ν: high rate, no mass peak; good understanding of the SM background is needed.
Decay \ Production:  Inclusive | VBF | WH/ZH | ttH
H → γγ:  YES
H → bb:
H → ττ:
H → WW*:
H → ZZ* → 4ℓ (Z → ℓ+ℓ-, ℓ = e, µ):
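The decay × production matrix can be mirrored as a small lookup table. A minimal sketch in Python (only the H→γγ inclusive YES is legible in this transcript, so the other cells are left as None placeholders, not physics claims):

```python
# Decay x production study matrix, as a dict keyed by (decay, production).
PRODUCTION = ("inclusive", "VBF", "WH/ZH", "ttH")
DECAYS = ("H->gammagamma", "H->bb", "H->tautau", "H->WW*", "H->ZZ*")

# None means "not recorded in this transcript", not "no study exists".
studies = {(d, p): None for d in DECAYS for p in PRODUCTION}
studies[("H->gammagamma", "inclusive")] = "YES"   # CMS/ATLAS study with note

def studied(decay, production):
    """Return the slide's mark for a channel combination, or None."""
    return studies[(decay, production)]

print(studied("H->gammagamma", "inclusive"))   # YES
```

The matrix view makes gaps explicit: every (decay, production) pair is a cell that either has a reference note or does not.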
UW Higgs Physics Studies
Invariant mass (GeV) of the highest-ET photon candidate pair, for a Higgs mass of 120 GeV, with -1.5 < η < 1.5. Study by summer student Mike Anderson.
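The diphoton spectrum is built from the standard invariant-mass formula for a pair of massless particles. A minimal sketch (the example kinematics below are illustrative, not taken from the study):

```python
import math

def diphoton_mass(et1, eta1, phi1, et2, eta2, phi2):
    """Invariant mass of two massless photons from (ET, eta, phi).
    Standard collider form: m^2 = 2 ET1 ET2 (cosh(deta) - cos(dphi))."""
    m2 = 2.0 * et1 * et2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

# Two back-to-back 60 GeV photons at eta = 0 reconstruct to 120 GeV:
print(diphoton_mass(60.0, 0.0, 0.0, 60.0, 0.0, math.pi))   # 120.0
```

With real candidates, resolution and mis-pairing smear this into the peak-plus-background shape shown on the slide.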
CMS Computing - Tier-2
CERN/outside resource ratio ~1:2; Tier-0 : (sum of Tier-1) : (sum of Tier-2) ~ 1:1:1
Online system: ~PByte/s off the detector; ~MBytes/s from the experiment into the CERN Tier-0 center (PBs of disk; tape robot)
Tier-1 centers (10 Gbps links): IN2P3, RAL, FNAL, INFN
Tier-2 centers: Wisconsin, Florida, Caltech, UCSD
Tier-2 resources in Open Science Grid:
Compute servers: ~500 CPUs per Tier-2; bulk of MC simulation
Physics user support: ~100 users per Tier-2
Physics data cache: ~200 TB storage per Tier-2
UW Computing Activities
Rader, Radtke [DOE Task T] & Bradley, Maier, and Mohapatra [NSF] support a CMS Tier-2 computer center from FY2005
Data-Intensive Science University Network (DISUN) institute
Leadership in the Grid community:
OSG - Open Science Grid (develop tools and provide cycles)
GLOW - Grid Laboratory of Wisconsin (lead role in development and implementation with the CS Condor group, to benefit the entire UW campus: IceCube, CS, Genomics, Chem. Eng., Med. Phys., etc.; 1400 CPUs)
Benefits both CMS and ATLAS, plus computing support for all DOE tasks
UW Tier-2 facility:
547 kSI2000 CPU (x2: 32 8-way new), doubling again in Fall
428 batch slots (+256 new), more again in Fall
90 TB raw storage (+90 TB new), more again in Fall
10 Gbps network bandwidth (via 100G to WAN)
24h/7d operation
~200 user accounts (CMS worldwide)
Rader and Radtke's support benefits all tasks, not just Task T
Helps everyone leverage the NSF-supported Tier-2, GLOW & OSG resources
CPU: Over 4.7 M hours served
Used by CMS and others, both locally and over the grid.
Chart legend: OSG (all), CMS analysis, CMS production, ATLAS, phenomenology, local users.
Opportunistic users get 25% of the CPU; efficient utilization by sharing with the world.
UW Computing Performance
UW continues to be the single largest provider of simulated events among all CMS institutions.
Key to success in this period was our middleware product Jug, developed by our NSF ITR-supported software engineer, Dan Bradley.
CMS moved to new software (CMSSW) in 2006. Developments for computing resource management:
Adaptation of ProdAgent to the Open Science Grid
New scheduler, computing-on-demand
Contributions to the Condor, OSG and ROOT projects
CMS production responsibility on OSG was assigned to UW Assoc. Res. Ajit Mohapatra, who had successfully run the previous production:
65 M fully simulated, 08/ /2005
86 M reconstructed, 08/ /2005
100 M fully simulated in the new CMSSW, 08/2006-now
Use all of OSG
Grad Student Timeline
2007-2009: Classes; qualifier; prelim on LHC physics topic
2009-2010: Move to CERN! Work on CMS trigger, muon or tracking systems; collect data for thesis; start thesis physics analysis
2011-2012: Complete thesis physics analysis on a new discovery! Write thesis & graduate