MET and Normalization (Sarah Eno)
I wanted to see if we can learn anything about the MET normalization issue using a toy Monte Carlo. First, we need a toy MC that does a reasonable job of reproducing the MET distribution (I don't think a great job is needed). This attempt can be found at:
First, I need a model for generating the Pthat of the event. I use qt = 0.016E10*x/((10+x)**7). [Figure: x-axis: Pthat; y-axis: integrated cross section above this Pthat in mb; solid line: my model; dots: from the JetMET QCD sample.] The agreement is reasonable (I don't think I need great agreement for what I'm doing).
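The model above can be sampled directly. A minimal sketch in Python: I treat the formula as an unnormalized Pthat spectrum and draw from it by rejection sampling. Note this is an assumption (the slide labels the y-axis as the integrated cross section above Pthat), and the 10-200 range is an arbitrary choice for illustration, not a number from the slide.

```python
import random


def model(pthat):
    """The slide's Pthat model: 0.016E10 * x / (10 + x)**7 (mb on the slide)."""
    return 0.016e10 * pthat / (10.0 + pthat) ** 7


def sample_pthat(lo=10.0, hi=200.0, rng=random):
    """Draw one Pthat by rejection sampling, treating model() as an
    unnormalized density on [lo, hi].  The model is decreasing above
    Pthat ~ 1.7, so model(lo) bounds it on this range."""
    fmax = model(lo)
    while True:
        x = rng.uniform(lo, hi)
        if rng.uniform(0.0, fmax) < model(x):
            return x
```

The steep (10+x)**-7 falloff means the acceptance is low for a wide range, so for a billion-event toy one would want an inverse-CDF table instead; rejection keeps the sketch short.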
Resolution per min-bias event: 2.4, 1.1*sqrt(2.*pthat). Make one event from a Pthat bin, then overlay it with 17 min-bias events drawn from a min-bias sample (i.e., random Pthat according to the distribution on the previous slide), using ORCA4-type recycling. Events with 10 < pthat < 15. [Figure: x-axis: MET; y-axis: number of events; black: my simulation; red: HLT data (towmet, threshold 0.01 GeV).]
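The overlay step can be sketched as follows: each interaction in the crossing contributes a 2-D Gaussian mismeasurement vector, and the crossing MET is the magnitude of the vector sum. Combining the slide's two resolution terms (2.4 and 1.1*sqrt(2*pthat)) additively is my assumption; the slide does not say how they combine.

```python
import math
import random


def met_resolution(pthat):
    # Per-interaction MET resolution.  The slide lists 2.4 and
    # 1.1*sqrt(2*pthat); adding them linearly is an assumption.
    return 2.4 + 1.1 * math.sqrt(2.0 * pthat)


def crossing_met(hard_pthat, minbias_pthats, rng=random):
    """Toy MET for one crossing: the hard interaction plus each
    overlaid min-bias interaction contributes a 2-D Gaussian smear;
    return the magnitude of the vector sum."""
    mex = mey = 0.0
    for pt in [hard_pthat] + list(minbias_pthats):
        sigma = met_resolution(pt)
        mex += rng.gauss(0.0, sigma)
        mey += rng.gauss(0.0, sigma)
    return math.hypot(mex, mey)
```

For the configuration on this slide one would call `crossing_met` with a hard-scatter Pthat drawn from the 10-15 bin and a list of 17 min-bias Pthats drawn from the spectrum model.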
Data: the jump at 100 comes from one bin with a flat tail beyond 100 that is not present in the surrounding bins; that bin has twice the statistics of the other bins, and 7 events above 100. [Figure: x-axis: MET cut; y-axis: rate in kHz for events that pass that cut; curves: data and toy MC.]
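A rate-versus-cut curve of this kind is just the surviving fraction of toy crossings scaled by the total crossing rate. A minimal sketch (the `total_rate_khz` argument is a placeholder, not a number from the slide):

```python
def rate_above_cut(mets, cut, total_rate_khz):
    """Rate (kHz) of crossings whose MET exceeds the cut, given the
    total crossing rate and a list of toy-MC MET values."""
    passing = sum(1 for m in mets if m > cut)
    return total_rate_khz * passing / len(mets)
```

Scanning `cut` over a grid of MET values reproduces the plotted curve.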
Toy MC, 1,000,000,000 min-bias events. Let's see how the various normalizations of the binned data do...
Well, first let's take a detour (you'll see why...). Consider MET in the case where Npile = 5. Here you can see that above a MET of about 80, the contributions from the bins below are negligible.
"Paris" normalization: rate = 32*(sigma_i/sigma_tot), overlaying. This method has the problem that, for crossings containing a high-Pthat event, (Npile-1)/Npile of the rate comes from high-Pthat events in the min-bias overlay sample overlaid onto the lowest-Pthat bin (assuming the bulk of the cross section is in the lowest-Pthat bin), and only 1/Npile comes from simulating the high-Pthat bin itself. So, where the rate is dominated by events with high Pthat, this method will tend to underestimate the rate by a factor of Npile (for Pthats high enough that the min-bias sample used for the overlay has run out of statistics). Because of the limited statistics of the overlay sample, this normalization systematically underestimates the rate at high MET (above about 50). [Figure: binned data; min-bias data.]
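As I read it, the "Paris" prescription gives each Pthat bin a share of a fixed total rate of 32 (units not stated on the slide) in proportion to its cross section. A sketch:

```python
def paris_rate(sigma_i, sigma_tot, total_rate=32.0):
    """'Paris' normalization from the slide: rate = 32 * (sigma_i / sigma_tot).
    The units of the 32 are not given on the slide."""
    return total_rate * sigma_i / sigma_tot
```

The factor-of-Npile undercount described above follows directly: once the overlay sample runs out of high-Pthat statistics, only the 1/Npile of the high-MET rate that comes from directly simulating the high-Pthat bin survives.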
[Figure: ratio of the two previous plots.]
"Normal" normalization: r = L*sigma_i (excluding the 0-10 bin), overlaying 17. [Figure: ratio of the two plots; binned data; a factor of Npile (= 5) appears where the statistics in the bin run out.]
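The "normal" normalization above can be written out as a sum of r = L*sigma_i over bins, skipping the 0-10 bin as the slide says. A minimal sketch; keeping L and sigma_i in mutually consistent units is left to the caller:

```python
def total_normal_rate(lumi, bins):
    """'Normal' normalization: sum r = L * sigma_i over Pthat bins,
    excluding the 0-10 bin as on the slide.
    bins: list of (pthat_lo, pthat_hi, sigma_i) tuples."""
    return sum(lumi * sigma for lo, hi, sigma in bins if lo >= 10.0)
```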
"Dasu" normalization (= normal normalization, except reject min-bias events with Pthat larger than that of the hard scattering), overlaying 17. [Figure: binned data.] Not bad in this region.
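The "Dasu" veto can be sketched as a filter on the overlay list. Whether rejected min-bias interactions are redrawn or simply dropped is not stated on the slide; this sketch drops them.

```python
def dasu_overlay(hard_pthat, minbias_pthats):
    """'Dasu' normalization overlay: same as the normal normalization,
    but reject overlaid min-bias interactions whose Pthat exceeds
    that of the hard scattering."""
    return [pt for pt in minbias_pthats if pt <= hard_pthat]
```

This prevents the high-MET tail of a low-Pthat crossing from being dominated by a rare hard min-bias interaction that is already covered by a higher-Pthat bin, which is what double-counts (or, with limited overlay statistics, undercounts) the rate in the other schemes.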
17 interactions, "Paris" normalization: does pretty well until a MET of 80 or so.
Just for fun, try the same thing for a jet trigger: smeared = Pthat smeared with RMS = 1.1*sqrt(pthat), one jet per event. (The rate for a zero cut is 17 times the crossing rate, since there is one "jet" per event.) [Figure: jet spectra from a few of the bins.] For this, it's easy to see that certain bins dominate at certain Pthats.
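The jet-trigger toy is one line per event: smear Pthat with a Gaussian of the stated RMS. A sketch:

```python
import math
import random


def smeared_jet_pt(pthat, rng=random):
    """One 'jet' per event: Pthat smeared with a Gaussian of
    RMS = 1.1 * sqrt(pthat), as on the slide."""
    return pthat + rng.gauss(0.0, 1.1 * math.sqrt(pthat))
```

Histogramming `smeared_jet_pt` per Pthat bin reproduces the overlapping jet spectra; since every event yields a jet, any cut at zero passes 17 jets per crossing, hence the 17x crossing rate at zero cut.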
[Figure: min bias; Paris normalization; normal normalization.]
[Table: bin | cross section (mb) | number of events | single-event rate (kHz); contents not recovered.] About 7 events above 100, representing 0.2 kHz => 3 events fluctuating to 0 (prob is 5%). [Figure: HLT data.]
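The 5% quoted above is consistent with a Poisson fluctuation either way I read it (my reading of which fluctuation is meant is an assumption): 3 expected events fluctuating to 0 has probability e^-3 ~ 0.050, and 7 expected events fluctuating down to exactly 3 has probability ~ 0.052. A quick check:

```python
import math


def poisson_pmf(k, mu):
    """P(k | mu) for a Poisson-distributed count."""
    return math.exp(-mu) * mu ** k / math.factorial(k)


p_zero_of_three = poisson_pmf(0, 3.0)   # e^-3, about 0.050
p_three_of_seven = poisson_pmf(3, 7.0)  # about 0.052
```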
[Figure: x-axis: MET; y-axis: rate in kHz above cut; red and blue curves; HLT data.] Just when it looks like red should cross blue, it kicks out a tail...