Slide 1
MET and Normalization. Sarah Eno.
Slide 2
I wanted to see whether we can learn anything about the MET normalization issue using a toy Monte Carlo. First, we need a toy MC that does a reasonable job of reproducing the MET distribution (I don't think a great job is needed). This attempt can be found at: http://home.fnal.gov/~sceno/forjpg/normalize.for
Slide 3
First, I need a model for generating the Pthat of the event. I use qt = 0.016E10*x/((10+x)**7.).
x-axis: Pthat. y-axis: integral cross section above this Pthat, in mb.
Solid line: my model. Dots: from the JetMET QCD sample.
The agreement is reasonable (I don't think I need great agreement for what I'm doing).
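The spectrum above can be sampled numerically. A minimal sketch, assuming the slide's formula gives the integrated cross section above Pthat (it is plotted that way on the slide); the function names and the bisection inversion are my own, not from the talk:

```python
import math
import random

def sigma_above(x):
    """Toy model for the integrated QCD cross section above Pthat (mb),
    using the formula quoted on the slide: 0.016E10*x/((10+x)**7)."""
    return 0.016e10 * x / (10.0 + x) ** 7

def sample_pthat(xmin=10.0, xmax=1000.0):
    """Draw a Pthat by inverting the monotonically falling integral
    spectrum with bisection (hypothetical helper, not from the slides)."""
    u = random.uniform(sigma_above(xmax), sigma_above(xmin))
    lo, hi = xmin, xmax
    for _ in range(60):              # bisect until lo and hi converge
        mid = 0.5 * (lo + hi)
        if sigma_above(mid) > u:     # spectrum falls with x: move right
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Inverse-transform sampling via the integral spectrum avoids having to normalize the differential form.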
Slide 4
Resolution per min bias event: 2.4 1.1*sqrt(2.*pthat).
Make 1 event from a Pthat bin, then overlay it with 17 min bias events drawn from a sample of 10000 min bias events (i.e., with random Pthat according to the distribution on the previous slide), recycling events as in ORCA4. Shown here: 10 < pthat < 15.
x-axis: MET. y-axis: number of events. Black: my simulation. Red: HLT data (towmet, threshold 0.01 GeV).
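The overlay step can be sketched as follows. Note the slide quotes two resolution terms, "2.4" and "1.1*sqrt(2.*pthat)", without the operator surviving the scrape; combining them in quadrature is my assumption, and the function name is hypothetical:

```python
import math
import random

def met_for_crossing(hard_pthat, pileup_pthats, a=2.4, b=1.1):
    """Toy MET for one crossing: the hard-scatter event plus the
    overlaid min bias events each smear the x and y missing-momentum
    components with a per-event resolution sigma.  Quadrature sum of
    the slide's two terms is an assumption, not stated on the slide."""
    mex = mey = 0.0
    for pthat in [hard_pthat] + list(pileup_pthats):
        sigma = math.hypot(a, b * math.sqrt(2.0 * pthat))
        mex += random.gauss(0.0, sigma)
        mey += random.gauss(0.0, sigma)
    return math.hypot(mex, mey)  # magnitude of the 2D vector sum
```

Because the x and y components add vectorially, the MET of the crossing grows roughly like the quadrature sum of the per-interaction resolutions, not linearly with the number of pileup events.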
Slide 5
[Same simulation-vs-HLT-data comparison for the Pthat bins 30-50 and 80-120.]
Slide 6
[Same comparison for the Pthat bins 300-380 and 170-230.]
Slide 7
Data vs. toy MC. (In the data, the jump at 100 from the 20-30 bin comes from a flat tail beyond 100 in that bin, which is not present in the surrounding bins; the bin has twice the statistics of the other bins, and 7 events above 100.)
x-axis: MET cut. y-axis: rate in kHz for events that pass that cut.
Slide 8
Toy MC, 1,000,000,000 min bias events. Let's see how the various normalizations of the binned data do...
Slide 9
Well, first let's take a detour (you'll see why...). Consider MET in the case where Npile = 5. Here you can see that above a MET of about 80, the bins below 80-120 are negligible.
Slide 10
"Paris" normalization: rate = 32*(sigma_i/sigma_tot), overlaying 17-1 min bias events.
This method has a problem: for crossings containing a high-Pthat event, (Npile-1)/Npile of the rate comes from high-Pthat events in the min bias overlay sample landing on top of the lowest-Pthat bin (assuming the bulk of the cross section is in the lowest-Pthat bin), and only 1/Npile comes from simulating the high-Pthat bin directly. So, where the rate is dominated by high-Pthat events, this method tends to underestimate the rate by a factor of Npile (for Pthats high enough that the min bias sample used for the overlay has run out of statistics).
Because of the limited statistics of the overlay sample, this normalization systematically underestimates the rate at high MET (above about 50).
Legend: binned data, min bias data.
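The counting argument above can be made concrete with a little arithmetic. A minimal sketch (the function name is mine, not from the talk): of the crossings whose trigger decision is driven by a high-Pthat interaction, only the 1/Npile where the hard scatter itself is the binned event are simulated correctly once the overlay sample's high-Pthat tail is exhausted.

```python
def paris_simulated_fraction(npile):
    """Fraction of high-Pthat-dominated crossings the Paris method can
    still simulate after the overlay sample runs out of high-Pthat
    events: the 1/npile where the hard scatter is the binned event.
    The remaining (npile-1)/npile would have to come from the overlay
    and are lost, giving the factor-of-npile underestimate."""
    simulated = 1.0 / npile
    lost_to_overlay = (npile - 1.0) / npile
    return simulated, lost_to_overlay
```

With npile = 17 this gives a simulated fraction of 1/17, i.e. the rate is underestimated by a factor of 17 in that regime; with the npile = 5 detour of the previous slide, a factor of 5, as seen in the ratio plots that follow.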
Slide 11
Ratio of the two previous plots.
Slide 12
"Normal" normalization: r = L*sigma_i (excluding the 0-10 bin), overlaying 17 min bias events.
The ratio of the two plots shows a factor of Npile (= 5) where the statistics in the 10-15 bin run out.
Legend: binned data.
Slide 13
"Dasu" normalization (= the normal normalization, except that min bias events with Pthat larger than that of the hard scattering are rejected), overlaying 17 min bias events.
Not bad in this region.
Legend: binned data.
Slide 14
17 interactions, "Paris" normalization: does pretty well until a MET of 80 or so.
Slide 15
Just for fun, try the same thing for the jet trigger. Smeared = Pthat smeared with an RMS of 1.1*sqrt(pthat), 1 jet per event. (The rate for a zero cut is 17 times the crossing rate, since there is 1 "jet" per event.)
Jet spectra from a few of the bins. Here it is easy to see that certain bins dominate at certain Pthats.
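The jet-side smearing described above is a one-liner. A minimal sketch, assuming a Gaussian smearing with the RMS quoted on the slide (the function name is mine):

```python
import math
import random

def smeared_jet_pt(pthat):
    """Smear the hard-scatter Pthat with a Gaussian of RMS
    1.1*sqrt(pthat), one 'jet' per event, as described on the slide."""
    return random.gauss(pthat, 1.1 * math.sqrt(pthat))
```

Because the relative resolution 1.1/sqrt(pthat) shrinks with Pthat, bin-to-bin migration is largest at low Pthat, which is why certain bins dominate the smeared spectrum at certain values.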
Slide 16
[Jet-trigger rate plots: min bias, Paris normalization, normal normalization.]
Slide 17
bin     X sect (mb)   number of events   single event rate (kHz)
10-15   7.17           94000             0.3
15-20   1.54          128331             0.09
20-30   0.657         205520             0.03
30-50   0.226          99241             0.02
50-80   0.00314        32043             0.001
The roughly 7 events above 100 in the 20-30 bin represent 0.2 kHz => the ~3 events expected in 30-50 fluctuating to 0 has a probability of about 5% => for 30-50 + 50-80 combined, 4.5 events would have to fluctuate to 0.
Plot: HLT data, 20-30 and 30-50 bins.
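The fluctuation probabilities quoted above follow from Poisson counting: the chance of observing zero events when mu are expected is exp(-mu).

```python
import math

# P(0 observed | mean mu) = exp(-mu) for a Poisson count.
p_30_50 = math.exp(-3.0)        # ~3 expected in 30-50 fluctuating to 0
p_combined = math.exp(-4.5)     # ~4.5 expected in 30-50 + 50-80
```

exp(-3) is about 0.05, matching the 5% on the slide, and exp(-4.5) is about 0.01, so a downward fluctuation to zero in both bins combined is considerably less likely.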
Slide 18
x-axis: MET. y-axis: rate in kHz above the cut. Red: 20-30. Blue: 30-50.
Just when it looks like red should cross blue, it kicks out a tail...
HLT data.