TileCalorimeter Software, Jet Reconstruction, and Jet Calibration in ATLAS
F. Merritt, NSF Review, 14-Jan-2005 (Version 3.0)
Frank Merritt, Ambreesh Gupta, Adam Aurisano, Amir Farbin, Zhifong Wu, Ed Frank, Rob Gardner
[ Richard Teuscher (UC), Peter Loch (Arizona) ]
[ Mark Oreglia (UC), Sasha Solodkov (Protvino) ]
[ Matt Woods (UC) ]
Outline
TileCal:
– Digitization of Tile signals
– Offline Optimal Filtering
– Calorimeter Objects: coordination with LAr
JetEtMiss work:
– First jet-finding algorithms
– Ringberg Calorimeter workshop
– Navigation of calorimeter objects
– Calibration using samples: comparisons
– Current status and work in progress
– Test-beam work using Athena
U.S.-Atlas Grid Activity:
– Rob Gardner: Grid3
– Tier-2 Proposal
Towards Physics Analysis of Atlas:
– MidWest Physics Analysis Group
– Susy work
– North American Physics Workshop
Early Involvement in ATLAS/Athena
Roles in Athena development and ATLAS reconstruction:
– T. LeCompte (ANL): ATLAS-wide Tile database coordinator
– F. Merritt (U.C.): ATLAS-wide Tile reconstruction coordinator
Tile software biweekly telephone conferences:
– Wednesday 10:00 am CST, every other week (organized by Chicago/Argonne)
Major Chicago/ANL Tile involvement in JetEtMiss group. Biweekly telephone conferences (M. Bosman, convener):
– Wednesday 10:00 am CST, every other week.
– Minutes and agenda on web (JetEtMiss web page).
Good working relationships with colleagues in Atlas:
– Primarily in Tile (esp. ANL) and JetEtMiss
– Also with BNL (LAr Calorimeter), Arizona (HEC, FD), and with colleagues in Spain, Italy, and Russia
Tile Cells and L1 Trigger Towers (total of 9856 signals in 4x64 modules)
Chicago Contributions to Tile Reconstruction Software
Development of new data classes corresponding to the flow of data through the electronics (EF, FM, AS):
– Includes objects corresponding to PMTs, cells, towers
– Also container objects, data structures, mapping
– Essential for providing mapping, data structures, resolution effects, and finally reconstructed cell and tower energies in the Atlas environment
Development of Optimal Filtering code for the high-rate Atlas environment (RT, FM, AA):
– Starting with code developed by R. Teuscher for the Tile test beam and electronics
– Uses the bunch structure of the beam to extract the energy deposition in each beam crossing
– Only in-time deposition is passed on for inclusion in cell energies
Calorimeter Navigation package (EF, AG):
– Allows decomposition of a Jet into cells, towers, clusters
– Allows access to characteristics of constituents, e.g. cell layer, type, status (for Tile and LAr): allows reweighting for calibration studies
– Separates navigable structure (representational domain) from behavior (OO domain)
Interface to Conditions DataBase (EF, TLC, FM):
– TileInfo class provides access to constants through a single interface (many accessor methods)
– Constants are set at initialization and stored in the Transient Detector Store (TDS)
– Parameters will be automatically updated when the time interval expires
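The navigation package described above suggests a simple composite structure. Here is a minimal Python sketch under invented names (these are not the actual Athena classes): a jet is unwound into its constituent cells, and per-cell detail (layer, detector) supports the reweighting studies mentioned above.

```python
from dataclasses import dataclass, field

@dataclass
class CaloCell:
    energy: float
    layer: str       # e.g. "EM1", "HAD2" -- illustrative labels
    detector: str    # "Tile" or "LAr"

@dataclass
class Jet:
    constituents: list = field(default_factory=list)   # cells, towers, or clusters

    def cells(self):
        """Flatten the navigable structure down to individual cells."""
        for c in self.constituents:
            if isinstance(c, CaloCell):
                yield c
            else:                        # a tower or cluster: recurse into it
                yield from c.cells()

    def reweighted_energy(self, weight_fn):
        """Recompute jet energy with per-cell weights (calibration studies)."""
        return sum(weight_fn(cell) * cell.energy for cell in self.cells())
```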
Tile Data Objects and Algorithms
The reconstruction chain alternates data objects and the algorithms that produce them:
TileDeposits (local energy deposit in scintillator)
→ TileOpticalSimAlg →
TileHit (signal seen by PMT)
→ TileElectronicsSimAlg →
TileDigits (with time structure and noise)
→ TileOptimalFilter →
TileRawChannel (after optimal filtering)
→ TileCellMaker →
TileCell (calibrated cell energy)
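A compact sketch of this chain as plain Python functions may help fix the flow. Each stage stands in for the Tile algorithm named above; all numerical details (light yield, noise level, calibration constant) are invented.

```python
import random

def tile_optical_sim_alg(deposits):
    """TileDeposits -> TileHits: scintillator energy to PMT signal."""
    return [0.5 * d for d in deposits]                  # invented light yield

def tile_electronics_sim_alg(hits):
    """TileHits -> TileDigits: 9 time samples plus electronics noise."""
    return [[h + random.gauss(0.0, 1.6) for _ in range(9)] for h in hits]

def tile_optimal_filter(digits):
    """TileDigits -> TileRawChannels: extract the in-time amplitude."""
    return [max(samples) - 50.0 for samples in digits]  # crude pedestal-subtracted peak

def tile_cell_maker(raw_channels):
    """TileRawChannels -> TileCells: apply the calibration constant."""
    return [2.0 * r for r in raw_channels]              # invented constant

cells = tile_cell_maker(tile_optimal_filter(
    tile_electronics_sim_alg(tile_optical_sim_alg([10.0, 2.5]))))
```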
Tile Shaping Function
Example of Optimal Filtering: reconstruction of an in-time signal with two pile-up background events
Optimal Filter Algorithm #3
This is a variation of Algorithm #2 in which the very first step is a 10-parameter fit to all 9 crossing amplitudes plus the pedestal. To make this fit well-defined, we add a constraint term to the chi-square: (P0 − PC)²/σ², where P0 is the first parameter (the pedestal level), PC is the nominal pedestal level (= 50), and σ ≈ 10 (6 times larger than the digit noise). This very loose constraint is enough to let the fit determine amplitudes for all 9 crossings.
1. Start with a crossing configuration of all Ndig amplitudes plus the pedestal (Ndig + 1 parameters).
2. Carry out a 10-parameter fit to the pedestal plus 9 crossings, with the Gaussian constraint on the pedestal. Go to step 4.
3. Apply the S matrix of this configuration to the digits vector to obtain a vector of fitted amplitudes and the error on each.
4. Find the amplitude with the lowest significance (minimum A/σ).
5. If the significance of this amplitude is less than a cut value, drop the amplitude and go to step 3.
The algorithm continues until all spurious amplitudes have been rejected and the remaining ones all have significance greater than the cut value.
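A minimal NumPy sketch of this procedure follows, assuming a `shapes` matrix whose column k holds the pulse shape (the "Tile Shaping Function" slide) evaluated at the 9 sample times for a deposit in crossing k. The pedestal constraint is implemented as an extra row in the least-squares system; names and noise values are illustrative, not the production code.

```python
import numpy as np

def constrained_fit(digits, shapes, ped_nominal=50.0, ped_sigma=10.0, dig_sigma=1.6):
    """Fit pedestal + one amplitude per active crossing, with the Gaussian
    pedestal constraint (P0 - PC)^2 / sigma^2 added as an extra row."""
    n_par = shapes.shape[1] + 1
    M = np.hstack([np.ones((len(digits), 1)), shapes]) / dig_sigma
    y = np.asarray(digits, dtype=float) / dig_sigma
    row = np.zeros(n_par)
    row[0] = 1.0 / ped_sigma                      # constraint acts on P0 only
    M = np.vstack([M, row])
    y = np.append(y, ped_nominal / ped_sigma)
    cov = np.linalg.inv(M.T @ M)                  # parameter covariance
    params = cov @ (M.T @ y)
    return params, np.sqrt(np.diag(cov))

def optimal_filter_algo3(digits, shapes, sig_cut=3.0):
    """Iteratively drop the least-significant crossing amplitude (steps 3-5
    above) until every remaining amplitude passes the significance cut."""
    active = list(range(shapes.shape[1]))         # surviving crossing indices
    while active:
        params, errors = constrained_fit(digits, shapes[:, active])
        signif = np.abs(params[1:]) / errors[1:]  # skip the pedestal
        worst = int(np.argmin(signif))
        if signif[worst] >= sig_cut:
            break
        del active[worst]                         # reject spurious amplitude
    return active, params
```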
Hadron Calibration Strategies for Atlas
from the Ringberg Castle Workshop, July 22-23, 2002
Frank Merritt, University of Chicago (with Peter Loch, University of Arizona)
September 17, 2002
Lessons from the Ringberg workshop (from the "other detector" talks)
H1: LAr/lead and LAr/steel, non-compensating: 50%/√E ⊕ 1.6%
Zeus: Coarser subsystems, but compensating: 35%/√E ⊕ 1%
Extensive test beam studies are a great advantage, especially in studying response near cracks or other difficult regions of the detector.
Careful monitoring of the detector is essential. This includes monitoring with sources, studying aging effects (including gas purity), and continual monitoring of energy profiles, track vs. cluster comparisons, etc. But this does not determine the overall energy scale (note D0 in particular).
It is absolutely essential to base the energy scale on clear in-situ physics measurements: e.g. "double-angle" methods at HERA, W decays or Z-jet events at D0.
Energy-flow corrections can give an enormous improvement in resolution -- on the order of 20% in the experiments presenting talks. This depends critically on the detector, and especially on calorimeter granularity.
Noise-reduction techniques in the calorimeter were important in all experiments.
Getting the best final resolution takes an enormous effort, and many years.
There were no great surprises here, but the reviews of the problems that others have faced and solved were stimulating, encouraging, and very useful.
Recent and Ongoing Chicago Projects in ATLAS Calorimetry (2003-5)
Development of JetRec package (A. Gupta):
– Development of new jet-finding algorithms for Atlas: cone algorithm, kt, seedless cone
– Associated structures and tools for split-merge, etc.
Reconstruction Task Force recommendations for changes in Athena structure:
– A series of meetings with calorimeter colleagues to reconsider the design: meetings in Tucson, BNL, Barcelona
– Common CaloCell objects with the same interface for all calorimeters
– Significant changes in Jet structure, with all jet objects inheriting from P4Mom and iNavigable (extends the navigation interface to essentially all objects that have energy and position)
Work on hadron energy calibration and determination of the hadron energy scale:
– Different calibration schemes developed: BNL, Chicago, Pisa
– Creation of Jet Calibration package (AG) for comparing different calibration approaches
Work in Atlas JetEtMiss Working Group:
– F. Merritt and A. Gupta became co-conveners of the group (with D. Cavalli, Milano)
– Organize biweekly phone conferences with participation from many Atlas colleagues in the U.S. and Europe
– Plan Combined Performance sessions for Atlas Software weeks (4 per year)
– Close contact with BNL, Pisa, many others
Extensive development of Atlas analysis capabilities [Atlas-wide]:
– Data Challenge 1
– Data Challenge 2 (2004-5)
Hadron Calorimeter Calibration: Three Weighting Schemes Being Studied
"Pseudo-H1 weighting" [Frank Paige (BNL)]:
– Estimates a weight for each CaloCell depending on the energy density in the cell. Independent of jet energy.
Weight by Sampling Layer [Ambreesh Gupta (U.C.)]:
– Estimates a weight for each sampling layer in the calorimeter depending on the jet energy (but not on the cell energy).
Pisa weights [C. Roda, I. Vivarelli (Pisa)]:
– Estimates a weight for each CaloCell depending on both cell energy and jet energy (and parameterized in terms of Et rather than E).
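As a concrete illustration of the first scheme, here is a hedged sketch of density-dependent cell weighting. The thresholds and weight values are invented stand-ins, not the actual BNL parameterization (which would be fit to simulation).

```python
def h1_style_weight(cell_energy, cell_volume):
    """Pseudo-H1 idea: dense (EM-like) deposits get weight ~1, diffuse
    (hadron-like) deposits get a larger weight to correct non-compensation."""
    density = cell_energy / cell_volume     # GeV per unit volume
    if density > 1.0:
        return 1.0
    elif density > 0.1:
        return 1.2
    return 1.4

def calibrated_jet_energy(cells):
    """Weighted sum of cell energies; note the weights never look at the
    jet energy, matching the slide's description."""
    return sum(h1_style_weight(c["e"], c["vol"]) * c["e"] for c in cells)
```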
Main problem areas
Calorimetry effects:
– Non-compensation of Atlas calorimeters
– Cracks and dead material
– Boundaries between calorimeters
Definition of "truth":
– Can apply reco algorithms to the MC particle list to obtain MC "jets". But is this truth? Clustering is different, propagation is different.
– Can sum all MC particles in a cone around the reco jet.
Noise:
– Want to reject cells with no real energy, but also need to avoid bias: rejecting cells with E < 0 gives a bias of roughly +300 GeV per event!
– ⇒ Use a cluster-finding algorithm to reduce noise.
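A tiny toy simulation (all numbers invented) shows why a one-sided noise cut biases the sum upward while a symmetric cut does not:

```python
import random

random.seed(1)
sigma = 0.05                                          # invented 50 MeV noise per cell
cells = [random.gauss(0.0, sigma) for _ in range(10_000)]   # pure noise, no signal

one_sided = sum(e for e in cells if e > 0)            # keep only positive cells
symmetric = sum(e for e in cells if abs(e) > 2 * sigma)     # |E| > 2 sigma

print(f"E > 0 cut:     {one_sided:+.1f} GeV bias")    # large positive bias
print(f"|E| > 2 sigma: {symmetric:+.1f} GeV bias")    # consistent with zero
```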
"Sampling Weights" (Ambreesh Gupta)
Sampling layers: EM Cal = LAr calorimeter; HAD Cal = Tile + HCAL + FCAL. No noise added.
Calibration weights derived in four eta regions: 0.0-0.7, 0.7-1.5, 1.5-2.5, 2.5-3.2.
The weights have reasonable behavior in all eta regions.
Scale & Resolution: Sampling Weights
σ/E = (97%/√E) ⊕ 4%
σ/E = (127%/√E) ⊕ 0%
σ/E = (114%/√E) ⊕ 8%
σ/E = (68%/√E) ⊕ 3%
[Resolution fits from the accompanying plots.]
Scale & Resolution: H1-Style Weights
σ/E = (75%/√E) ⊕ 1%
σ/E = (115%/√E) ⊕ 3%
σ/E = (138%/√E) ⊕ 0%
σ/E = (271%/√E) ⊕ 0%
[Resolution fits from the accompanying plots.]
Note: a different definition of truth was used here, compared to that used in deriving the weights.
Improving sampling weights (A. Gupta)
Using a sampling weight for each calorimeter layer is not very useful -- fluctuations in any single layer are large. But the fractions of energy deposited in the EM and HAD calorimeters do carry useful information on how the jet develops. The improved weights therefore use the energy-fraction information in the EM and HAD calorimeters (see the sketch below).
[Figure: fraction of jet energy in EM and HAD calorimeters for 25 GeV, 100 GeV, 400 GeV, and 1000 GeV jets.]
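A hedged sketch of the fraction-based scheme: the calibration weights depend on the EM/HAD energy split rather than on individual layers. The parameterization below is invented; in practice the weights would be fit to simulation in bins of jet energy.

```python
def fraction_weights(e_em, e_had):
    """Weights driven by the fraction of jet energy in the EM compartment."""
    f_em = e_em / (e_em + e_had)
    w_em = 1.0 + 0.05 * (1.0 - f_em)    # invented: small correction for EM part
    w_had = 1.3 - 0.2 * f_em            # invented: larger correction if hadronic
    return w_em, w_had

def calibrate(e_em, e_had):
    w_em, w_had = fraction_weights(e_em, e_had)
    return w_em * e_em + w_had * e_had

print(calibrate(40.0, 50.0))   # e.g. a jet with 40 GeV EM, 50 GeV HAD
```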
Ongoing work and plans for the next two months (in preparation for the Rome Physics Workshop)
1. Pisa weights are being put into JetRec for comparison with H1 and Sampling Weighting.
2. Will introduce a top-level calibration selector tool in JetRec that can be switched through jobOpt (a sketch of the pattern follows this list).
3. Will carry out comparisons in January, with the goal of establishing a benchmark calibration by early February.
4. Produce new DC2 weights by mid-February (already in progress; F.P. and S.P.).
5. Extend calibration to different cone sizes (R=0.4 and R=0.7).
6. Plan to write a few standard jet selections to ESD (e.g., R=0.7 cone, R=0.4 cone, Kt).
7. Investigate other improvements in jet-finding and jet calibration if time permits:
– improved definition of truth
– improved noise-suppression techniques
– more extensive studies of jet-finding with topological clusters
– additional parameters in sample weighting
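A sketch of the selector pattern referred to in item 2, in plain Python: a single top-level entry point dispatches to a weighting scheme chosen by a job-option-style string. The scheme names follow the talk; the stand-in weight functions are invented, not the JetRec interface.

```python
CALIBRATORS = {
    "SamplingWeights": lambda cells: sum(1.20 * e for e in cells),  # stand-in
    "H1Weights":       lambda cells: sum(1.30 * e for e in cells),  # stand-in
    "PisaWeights":     lambda cells: sum(1.25 * e for e in cells),  # stand-in
}

def calibrate_jet(cells, scheme="SamplingWeights"):
    """Top-level selector: the scheme string plays the role of the jobOpt switch."""
    try:
        return CALIBRATORS[scheme](cells)
    except KeyError:
        raise ValueError(f"unknown calibration scheme: {scheme!r}")
```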
Comparison with jet-finding applied to topological clusters
Study variations in calibration for different physics processes (F.P.)
Formation of U.S. Atlas Midwest Physics Group
Spearheaded and organized by A. Gupta (U.C.) and Jimmy Proudfoot (ANL):
– Emphasis on physics analysis rather than software development
– Provides mutual support and a common focus for midwest U.S. institutions
– Monthly meetings, useful website
Tutorials on Athena reconstruction (given by Ambreesh):
– compute environment, job setup, data access, histograms
– how to modify the code
– jet reconstruction, event analysis, ntuple production
Physics topics include:
– Susy (Chicago group)
– Higgs (Wisconsin)
– Z+jets (ANL)
– Top
– Jet cross-sections
– Di-boson production
– Triggering and fast tracker
US Atlas Mid-West Physics Group (http://hep.uchicago.edu/atlas/usatlasmidwest/)
Website sections: Interested Individuals; Meetings, Agenda, and Minutes; Tutorials on Running Athena Reconstruction; Analysis with Root; Useful Data Sets; Identified Analyses; Links
Page maintained by Ambreesh Gupta (agupta@hep.uchicago.edu) and Jimmy Proudfoot (proudfoot@anl.gov). Last update: 13 December 2004.
Plans for 2005 ... and Beyond
High level of current activity:
– North American Atlas Physics Workshop (Dec 21-22, 2004); 4 Chicago talks:
"Jet Calibration and Performance" -- F. Merritt
"Calorimeter Response to Hadrons from CTB" -- M. Hurwitz
"Early Commissioning of the Atlas Detector" -- J. Pilcher
"SUSY Studies in DC2" -- A. Farbin
– Workshop on calorimetry at BNL: Feb 2, 2005
Development of Chicago-based data processing:
– Further development of grid-based computing tools
– Can have a significant impact on Chicago physics capabilities:
Need extensive background studies for many searches
Need high-statistics analysis for many calibration studies
– Potentially very important for the U.S. Atlas role and for grid development
– Tutorial organized by Amir Farbin for the next Midwest Physics meeting (February 2005)
Preparations for the Physics Workshop in Rome, June 2005:
– Need to produce/choose the best hadron energy calibration constants by mid-February
And .....
Calorimetry in Atlas: 2004 Combined Test Beam (M. Hurwitz)
– Data-taking May-October 2004
– Pixel, SCT, TRT, LAr, TileCal, MDT, RPC integrated (not all at once)
– Integrated triggers, e.g. the full calo trigger chain used for the first time
– Mostly beam with no RF structure, except a few runs with a 25 ns bunched beam
– Electron and pion beams contaminated with muons
– Mostly 20-350 GeV, some Very Low Energy runs at 1-9 GeV
[Figure: test-beam layout, with the beam direction indicated.]
First correlation plot: 150 GeV pion beam contaminated with electrons and muons
[Figure: correlation plot with the muon, electron, and pion populations labeled.]
Standalone Resolution (1)
Parametrize the resolution: σ/E = a ⊕ b/√E
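As a worked example of this parameterization (reading the lost symbol as ⊕, addition in quadrature, as is standard for calorimeter resolutions), here is a small fit sketch; the data points are invented, not the test-beam results.

```python
import numpy as np
from scipy.optimize import curve_fit

def resolution(E, a, b):
    """sigma/E = a (+) b/sqrt(E), with (+) = addition in quadrature."""
    return np.sqrt(a**2 + (b / np.sqrt(E))**2)

E = np.array([20.0, 50.0, 100.0, 180.0, 350.0])     # beam energies, GeV (invented)
r = np.array([0.140, 0.095, 0.072, 0.058, 0.047])   # measured sigma/E (invented)

(a, b), _ = curve_fit(resolution, E, r, p0=[0.03, 0.5])
print(f"constant term a = {a:.3f}, stochastic term b = {b:.3f}")
```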
Grid Computing Input to NSF Review
Rob Gardner, UC
NSF Review, January 2005
Overview of Grid Computing at UC
US ATLAS Distributed Computing at Chicago -- personnel:
– R. Gardner: L3 project manager for Grid Tools and Services
– M. Mambelli: lead developer of the DC2 Capone execution service
– Y. Smirnov: DC2 production team and code testing
– A. Zahn: UC Tier2 systems administrator
Responsible for Grid execution software for ATLAS code:
– Data Challenge 2 (DC2) production software for Grid3
– User production and distributed analysis
– U.S. Grid Middleware contact to international ATLAS
U.S. Physics Grid Projects -- Chicago contributions:
– NSF GriPhyN, iVDGL, Grid3, Open Science Grid
– Coordination of Grid3; Grid3 metrics collection and analysis
– Leading the integration and validation activity of the OSG
– Integration of GriPhyN (Virtual Data) software with ATLAS
– Prototype Tier2 center for ATLAS DC2 and Grid3, OSG
Chicago Grid Infrastructure
Prototype Tier2 Linux Cluster:
– NSF iVDGL project funded; high performance / high availability
– 64 compute nodes (dual 3.0 GHz Xeon processors, 2 GB RAM)
– 3 gatekeepers and 3 interactive analysis systems, all RAID0
– 4 storage servers provide 16 TB of attached RAID storage
TeraPort Cluster:
– NSF MRI Grant, joint IBM project
– Integration and interoperability with the TeraGrid, OSG, and LCG
– 128 nodes with dual 2.2 GHz 64-bit AMD Opteron processors (256 total), 12 TB of fibre-channel RAID, all connected with Gigabit Ethernet
– Enterprise SUSE 8 with the high-performance GPFS file system
Contributions
UC made leading contributions to iVDGL/Grid3 and continues to work on its successor, OSG.
ATLAS Global Production System
[Figure: the USATLAS portion of the global production system.]
UC Tier2 Delivery to ATLAS DC2
– Online May 2004
– Performance comparable to BNL (Tier1) DC2 production
[Figure: fraction of completed DC2 jobs, USATLAS, 9/04.]
U.S. ATLAS Grid Production (G. Poulard, 9/21/04)
[Figure: total validated jobs per day.]
– UC developed the Grid3 production code for US ATLAS
– 3M Geant4 events of ATLAS, roughly 1/3 of International ATLAS, plus digitization, pileup, and recon jobs
– Over 150K jobs executed
– Competitive with the peer European Grid projects LCG and NorduGrid
Midwest Tier2 Proposal
– Joint proposal with Indiana University to US ATLAS
– Takes advantage of excellent Chicago networking (IWIRE, Starlight), ~10 Gbps
– Leverages resources from nearby projects (e.g., TeraGrid)
References
– US ATLAS Software and Computing: http://www.usatlas.bnl.gov/computing/
– US ATLAS Grid Tools and Services: http://grid.uchicago.edu/gts
– UC Prototype Tier 2: http://grid.uchicago.edu/tier2/
– iVDGL, "The International Virtual Data Grid Laboratory": http://www.ivdgl.org/
– Grid3, "Application Grid Laboratory for Science": http://www.ivdgl.org/grid3/
– OSG, Open Science Grid Consortium: http://www.opensciencegrid.org/
From Amir Farbin's talk at Tucson: The Atlas Computing Predicament
Situation for the past 6 months: you want to try an analysis... you'll soon discover:
Software problems:
– 9.0.x "reconstruction" release not quite ready
– ESD/AOD production has been unreliable until very recently
– No reconstruction output (everyone needs to do the reconstruction themselves)
Resource problems:
– Large pool of batch machines at CERN: overloaded... takes days until jobs start
– BNL has only 22 batch machines
– Resources busy with DC2 production and other users
– No place to run your jobs!
Possible reasons:
– Timing issues: hardware purchasing ramp-up? Tier 2 deployment?
– Conflicts with other important priorities:
DC2 is a GRID exercise. It will soon be replaced by "Rome Production".
Tier 0 reconstruction is a computing exercise. It will mostly produce mixed events (not very useful for studies).
Only 10% of DC2 will eventually be reconstructed.
– No large samples of reconstructed events available for analysis studies.
– Lots of important software developments in the past 6 months.
"How about the GRID3?"
– Up to 3000 processors available NOW in the US
– ATLAS is involved in DC2 "production" work (run by experts)
– Individual users are not explicitly supported:
Distributed analysis tools not yet implemented on the GRID
Existing tools have specific (and limited) functionality (i.e., production)
No concept of individual users...
Difficult to learn how the pieces fit together
– But with help from Rob Gardner and his group (Marco Mambelli & Yuri Smirnov) I was able to "hack" a working solution called UserJobManager.
--- Rob Gardner
UserJobManager
A collection of simple scripts which:
– Install user transforms on GRID3 sites (everything needs to be "pre-installed" on a site before job submission)
– Handle book-keeping of input/output (100,000's of input/output files)
– Submit/resubmit jobs... deciding:
what samples to run on
what sites have been reliable
what failed jobs are likely to succeed if resubmitted
In DC2 these tasks are handled by a production system (database, servers, clients, etc.), production staff, and shifters.
On a good GRID day (and there are many bad ones), I get 1000 reconstruction (ESD/AOD/CBNT) jobs done (100K events/day).
If interested (and adventurous) see: http://hep1.uchicago.edu/atlas07/atlas/UserJobManager/instructions.txt
This is a "hack"... if everyone starts using these tools the GRID will break.
A 0th step towards a bottom-up approach to ATLAS user GRID computing.
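A hedged sketch of the resubmission bookkeeping described above: track per-site success rates from past jobs, prefer reliable sites, and resubmit only to sites that clear a reliability cut. All names and thresholds are invented, not the UserJobManager scripts themselves.

```python
from collections import defaultdict

def site_reliability(history):
    """history: list of (site, succeeded) pairs from past submissions."""
    ok, total = defaultdict(int), defaultdict(int)
    for site, succeeded in history:
        total[site] += 1
        ok[site] += int(succeeded)
    return {site: ok[site] / total[site] for site in total}

def choose_site(history, min_reliability=0.7):
    """Pick the most reliable site that clears the cut, else None."""
    scores = site_reliability(history)
    good = [s for s, r in scores.items() if r >= min_reliability]
    return max(good, key=scores.get) if good else None

history = [("UC", True), ("UC", True), ("BNL", False), ("BNL", True), ("IU", False)]
print(choose_site(history))    # -> "UC"
```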
Datasets
– Processed 400K events in 8.8.1 (ESD/CBNT) and/or 9.0.2 (ESD/CBNT/AOD)
– Files sitting at UC, BU, IU, and BNL; registered in RLS (query example: "dc2test*A0*reco*aod.pool.root*")
– Need a GRID certificate to access with gsiftp or DQ
– CBNT ntuples available through http://hep1.uchicago.edu/atlas11/atlas/datasamples
– Main problem now is that the most interesting digitized datasets are in Europe:
Problems with gsiftp servers and castor make transfers from Europe difficult.
Yuri is trying a new (expanded) version of DQ which will make transfers easier.
Coordinating with people at BNL... they will begin copying files soon.
Meanwhile I can copy ~1000 files/day using scripts which prestage data from castor and scp to UC.
Problems with UC's Tier2 have stalled transfers in the past week.
[Figure: Missing E_T distributions (GeV) for top, dijet, W, Z(ll), Z(νν), QCD (b-jet), SUSY DC1, and SUSY DC2 samples.]
Summary
– DC2 + reco on GRID3 allowed us to begin examining backgrounds to SUSY in full simulation... (1st time?)
– Iowa State has developed an AOD analysis (recently added MC weights for top events); UC & Iowa will collaborate
– Understanding SUSY backgrounds will be difficult. Next steps:
Explore techniques for estimating backgrounds from data.
Look into clever filtering of MC.
Explore other topological variables.
Explore signal extraction strategies: optimized cuts? ML fit? MV analysis?
Try smearing...
– How well do we need to understand our detector before we can claim discovery?
Plans for 2005 ... and Beyond
There are still many, many things left to do before first collisions in 2007 (!)
Further development of hadron energy calibration:
– Improve noise suppression using clustering algorithms
– Extend and combine fitting approaches
– Implement H1-based parameterization
Improve and test hadronic calibration using various methods and benchmarks:
– Gamma+jet
– Z+jet
– Dijet energy balancing
– Isolated charged hadrons
– Study sensitivity of calibration to physics process
Many important tasks involved in commissioning studies with Tile at CERN:
– Complete checkout of the Tile calorimeter
– Devise high-statistics monitoring and validation procedures for jet calibration and monitoring
– Write and test online and offline monitoring software for Tile
– Will need a significant presence at CERN for parts of this program
Need to maintain and increase strong involvement in SUSY searches ... and a great many other physics topics still remain!