1 Progress report on Calorimeter design comparison simulations
MICE detector phone conference, 2006-01-27
Rikard Sandström

2 Before I begin: scraping in trackers
At 6 pi mm there is partial scraping in the trackers.
–Particles still make it through the experiment.
Events with more than 7 MeV energy loss in a tracker are filtered out manually, as sketched below.
–Done using MC truth values.
The tracker people will have to deal with this.
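A minimal sketch of the MC-truth filter described above, assuming each event is a simple dictionary; only the 7 MeV cut and the use of MC truth values come from the slide, the field names are hypothetical.

```python
# Sketch: drop events with more than 7 MeV MC-truth energy loss in a tracker.
# The per-event dictionary layout and field names are hypothetical; the 7 MeV
# threshold and the use of MC truth values are taken from the slide.

MAX_TRACKER_ELOSS_MEV = 7.0

def passes_scraping_cut(event):
    """True if the MC-truth energy loss in both trackers is below the cut."""
    return all(event[key] <= MAX_TRACKER_ELOSS_MEV
               for key in ("eloss_tracker1_mev", "eloss_tracker2_mev"))

def filter_scraping(events):
    """Keep only events that did not scrape noticeably in a tracker."""
    return [ev for ev in events if passes_scraping_cut(ev)]
```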

3 Outline
The alternative detector geometry
Techniques and methods used
–PID simulations in 14 steps
Present status
Results so far

4 The alternative calorimeter
The alternative calorimeter consists of one KLOE Light layer in front, followed by ten plastic layers.
–We used to call this design smörgås, and “sandwich” meant a variant with two KLOE layers. The latter is not a good idea, so sandwich now means the variant with only one KL layer.
The plastic layers contain 9 cells each, of increasing thickness (1 cm to 12 cm).
–Increasing thickness gives the best range(p) resolution for the money.
The total number of channels is the same for both designs (checked in the sketch below).
–KL: 4x30x2 = 240 channels.
–SW: (30+10x9)x2 = 240 channels.
Abbreviations:
–KL = KLOE Light (4 KLOE Light layers)
–SW = Sandwich (1 KLOE Light layer, then plastic)
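A quick check of the channel bookkeeping above; the geometry numbers are copied from the slide, nothing else is assumed.

```python
# KL: 4 KLOE Light layers x 30 slabs x 2 readout ends
# SW: (30 slabs in the KL layer + 10 plastic layers x 9 cells) x 2 readout ends
kl_channels = 4 * 30 * 2
sw_channels = (30 + 10 * 9) * 2
assert kl_channels == sw_channels == 240
print(kl_channels, sw_channels)  # 240 240
```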

5 Reminder of the run plan
Stage 1
–Pi & Mu, 100 < pz < 300 MeV/c
Stage 6
–Mu & mu-decay at 140 MeV/c, 170 MeV/c, 200 MeV/c, 240 MeV/c
–Tilley’s TURTLE beam, with diffuser

6 Method #1 (examples follow)
1. Write a document explaining what to do and why. Not in the document = not on the table.
2. Simulate beams of 10k events with wide distributions.
3. Use those to find useful variables for PID.
4. Find combinations of detectors such that, given A, we expect B.
5. Make fits for all expected values and create “discrepancy variables”, 1 - expected/measured. Zero means very muon like (see the sketch after this list).
6. Run 120k muon events per experimental scenario (~2 GB of data per file).
7. For every such scenario, also run 120k muons with a 40 ns lifetime to generate background. Muons that have not decayed by TOF2 are filtered out of the analysis.
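A minimal sketch of the “discrepancy variable” from step 5; only the formula 1 - expected/measured and its interpretation come from the slide, the guard against zero and the example numbers are assumptions.

```python
def discrepancy(expected, measured):
    """Step 5 discrepancy variable: 1 - expected/measured.
    Values near zero mean the measurement agrees with the (muon) expectation."""
    if measured == 0:
        return float("inf")  # guard against division by zero (an assumption)
    return 1.0 - expected / measured

# Hypothetical usage: compare a measured calorimeter quantity with the value
# expected from one of the fits (numbers below are purely illustrative).
print(discrepancy(expected=35.0, measured=33.0))  # about -0.06, fairly muon like
```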

7 Method #2 (examples follow)
8. Digitize every simulated beam.
9. Convert to ROOT trees and tag good/bad events.
10. For every scenario, merge the muon sample with the background sample.
11. Filter out events while trying not to lose any muons.
12. Train a Neural Net on half of the merged & filtered sample (the training sample).
13. Using the weights acquired by the Neural Net, assign a weight to every other event (the test sample).
14. Evaluate the PID capabilities by looking at the weights of the test sample (a sketch of steps 12-14 follows).
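A minimal sketch of steps 12-14, assuming the merged & filtered sample is available as feature vectors (the discrepancy variables) with label 1 = muon and 0 = background. The slide does not name the neural-net package used; the single-layer logistic model below is only a stand-in for it.

```python
import math
import random

def split_half(features, labels, seed=1):
    """Step 12: use half of the merged & filtered sample for training,
    the other half as the test sample."""
    idx = list(range(len(features)))
    random.Random(seed).shuffle(idx)
    cut = len(idx) // 2
    train_idx, test_idx = idx[:cut], idx[cut:]
    return ([features[i] for i in train_idx], [labels[i] for i in train_idx],
            [features[i] for i in test_idx], [labels[i] for i in test_idx])

def train(features, labels, epochs=200, lr=0.1):
    """Stand-in for the neural net: a single-layer logistic model trained by
    gradient descent on the discrepancy variables."""
    w, b = [0.0] * len(features[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            w = [wi - lr * (p - y) * xi for wi, xi in zip(w, x)]
            b -= lr * (p - y)
    return w, b

def event_weight(x, w, b):
    """Step 13: weight assigned to a test event (close to 1 = muon like)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```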

8 100 < pz < 300 MeV/c

9 Sandwich

10 Example of a fit

11 Example of a “discrepancy variable” used for the Neural Net: Discrepancy = 1 - expected/measured

12 Discrepancy = 1 - expected/measured

13 Stage 1, 100 < pz < 300 MeV/c
Status (samples: Mu KL, Pi KL, Mu SW, Pi SW):
–Simulation: 100%
–Digitisation: 100%
–Fits: 100%
–RootEvent: 100%
–Neural Net: 100%

14 Stage 6, 140±14 MeV/c
Status (samples: Mu KL, BG KL, Mu SW, BG SW):
–Simulation: 100%
–Digitisation: 100%
–Fits: 100%
–RootEvent: 100% (KL); Problem! (bug 107) (SW)
–Neural Net: 100% (KL); 0% (SW)

15 Stage 6, 170±17 MeV/c
Status (samples: Mu KL, BG KL, Mu SW, BG SW):
–Simulation: 100%
–Digitisation: 100%; Problem! (bug 107)
–Fits: 100%
–RootEvent: 100% / 0% / Problem! (bug 107) / 0%
–Neural Net: 0%

16 Stage 6, 200±20 MeV/c
Status (samples: Mu KL, BG KL, Mu SW, BG SW):
–Simulation: 100%
–Digitisation: 100%; Problem! (bug 107)
–Fits: 100%
–RootEvent: Problem! (bug 107) / 0% / Problem! (bug 107)
–Neural Net: 0%

17 Stage 6, 240±24 MeV/c
Status (samples: Mu KL, BG KL, Mu SW, BG SW):
–Simulation: 0% / 30% / 100%
–Digitisation: 0% / 100% / Problem! (bug 107)
–Fits: 100%
–RootEvent: 0%
–Neural Net: 0%

18 Stage 6, Tilley’s TURTLE beam
A problem with the diffuser prevents it from being placed. Without the diffuser, the emittance is too low.
If I have time I will try to solve the problem before Japan.

19 Bug 107
A vector in EmCalHit holding pointers to EmCalDigits seems to be corrupt.
It occurs very rarely, which makes it hard to debug.
–Why is it rare?
Since the RootEvent converter uses the same class, both Digitization and RootEvent suffer.
It could be a compiler- or machine-specific problem.
–If so, move all files to another computer, but we are talking about ~50 GB of data.

20 Results - Stage 1
Neural Net
–For training, used only muons which stayed muons until the downstream TOF or beyond. Same for pions.
–For testing, pions decaying to muons between the TOFs were either
1. treated as background, or
2. omitted from the analysis.
(These correspond to the two background-rejection columns below.)
–KLOE Light: the strongest variables are based on tof, barycenter, and the fraction of energy in the first layer.
–Sandwich: the strongest variables are based on tof, barycenter, and the total energy in the calorimeter.

KLOE Light:
Signal acc.   BG rej. (with π->µ)   BG rej. (no π->µ)
99.5%         49.2%                 54.4%
99.0%         56.3%                 62.2%
90.0%         80.5%                 86.5%

Sandwich:
Signal acc.   BG rej. (with π->µ)   BG rej. (no π->µ)
99.5%         61.0%                 68.1%
99.0%         68.2%                 75.3%
90.0%         79.1%                 84.1%
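The numbers in the tables above come from cutting on the test-sample weights; a minimal sketch of how such a working point can be evaluated is given below (the weight lists and the cut value are assumptions, only the acceptance/rejection definitions follow the tables).

```python
def acceptance_and_rejection(muon_weights, background_weights, cut):
    """Signal acceptance = fraction of muons kept above the cut;
    background rejection = fraction of background events below the cut."""
    acceptance = sum(w >= cut for w in muon_weights) / len(muon_weights)
    rejection = sum(w < cut for w in background_weights) / len(background_weights)
    return acceptance, rejection

# Hypothetical usage: scan the cut until the signal acceptance reaches e.g. 99%
# and quote the corresponding background rejection, as in the tables above.
```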

21 Results - Stage 6
Only 140 MeV/c, KLOE Light is finished.
–The results are very promising, but I will wait with presenting them until I can compare the different detectors.

22 Comments
All momentum and tof measurements are MC truth.
–Still waiting for the tracker reconstruction to come back online.
–For tof, we might simply add a Gaussian smearing (a sketch follows).
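A minimal sketch of the simple Gaussian option mentioned for tof; the 70 ps resolution is purely illustrative, not a number from the talk.

```python
import random

def smear_tof(tof_truth_ns, sigma_ns=0.07):
    """Emulate a reconstructed time of flight by adding Gaussian noise to the
    MC-truth value; sigma_ns = 0.07 ns (70 ps) is an illustrative resolution."""
    return random.gauss(tof_truth_ns, sigma_ns)
```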

23 Summary
Stage 1 is finished.
–It is only a matter of how to present it.
Most of stage 6 is simulated, but only partly digitized.
–A bug must be fixed to continue.
The first stage 6 beam that could be analyzed looks promising.

