
1 Development of New Kaon Selectors. Kalanand Mishra, University of Cincinnati. BaBar Collaboration Meeting, February 2007.

2 Overview of Neural Net Training

The input variables for the neural net are: likelihoods from the SVT, DCH, and DRC (both global and track-based), plus the momentum and polar angle (θ) of the track. Separate neural net trainings for "Good Quality" and "Poor Quality"* tracks give two families of selectors: "KNNGoodQual" and "KNNNoQual". A training sketch follows below.

* Poor Quality tracks are defined as belonging to one of the following categories:
- outside DIRC acceptance
- passing through the cracks between DIRC bars
- no DCH hits in layers > 35
- EMC energy < 0.15 GeV
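A minimal sketch of such a two-family training, assuming hypothetical feature names (svt_lh, dch_lh, drc_glb_lh, drc_trk_lh, p, theta) and using scikit-learn's MLPClassifier as a stand-in for whatever neural net package was actually used:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical feature names; the actual BaBar ntuple variables are not shown.
FEATURES = ["svt_lh", "dch_lh", "drc_glb_lh", "drc_trk_lh", "p", "theta"]

def train_selector(tracks, labels):
    """tracks: dict of per-feature arrays; labels: 1 = kaon, 0 = background."""
    X = np.column_stack([tracks[f] for f in FEATURES])
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
    net.fit(X, labels)
    return net

# Separate trainings give the two selector families described above:
# knn_good = train_selector(good_quality_tracks, good_labels)  # "KNNGoodQual"
# knn_noq  = train_selector(poor_quality_tracks, poor_labels)  # "KNNNoQual"
```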

3 Performance of the "KNNxQual" selectors

[Plots: background rejection vs. signal efficiency for the GoodQual and NoQual selectors, including a 0.3 < P < 0.5 GeV/c bin; the higher curve/point represents better performance. Annotations mark "an overall improvement", "an absolute improvement", and "deterioration".]

Most of the tracks used in B-tagging have low momenta, and the B-tagging group is the biggest consumer of such a selector.
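Curves like these are traced by scanning a cut on the classifier output and recording, at each cut, the fraction of signal kept and the fraction of background removed; a minimal numpy sketch (names hypothetical):

```python
import numpy as np

def eff_rej_curve(sig_scores, bkg_scores, n_cuts=100):
    """Signal efficiency vs. background rejection as the cut on the
    classifier output is scanned; higher curves are better."""
    cuts = np.linspace(0.0, 1.0, n_cuts)
    eff = np.array([(sig_scores > c).mean() for c in cuts])        # signal kept
    rej = np.array([1.0 - (bkg_scores > c).mean() for c in cuts])  # bkgd removed
    return eff, rej
```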

4 Tried different algorithms ....

- Fisher
- Binary AdaBoost (simple binary splits)
- Bagger Decision Tree
- AdaBoost Decision Tree (provides the best separation)

[Plots: events vs. classifier output for each algorithm.]
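A sketch of how such a comparison could be scripted with scikit-learn stand-ins (Fisher ≈ linear discriminant analysis, Bagger ≈ bagged trees); the study itself used the software cited on the next slide, so these are substitutes, not the actual tools:

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

# Rough stand-ins for the classifiers listed on this slide.
candidates = {
    "Fisher": LinearDiscriminantAnalysis(),
    "Binary AdaBoost (binary splits)": AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=1)),   # depth-1 stumps
    "Bagger Decision Tree": BaggingClassifier(DecisionTreeClassifier()),
    "AdaBoost Decision Tree": AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=3)),
}

# Hypothetical usage, given training/test arrays X_*, y_*:
# for name, clf in candidates.items():
#     clf.fit(X_train, y_train)
#     print(name, clf.score(X_test, y_test))
```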

5 AdaBoost Decision Tree

A decision tree splits nodes recursively until a stopping criterion is satisfied. AdaBoost combines weak classifiers by applying them sequentially: at each step it enhances the weights of misclassified events and reduces the weights of correctly classified events.

[Plot: signal and background distributions of the classifier output (events vs. classifier output).]

For details on the algorithms and software used, see arXiv:physics/0507143 (by Ilya Narsky).
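The weight-update rule just described can be spelled out directly; a minimal from-scratch sketch of discrete AdaBoost over tree stumps (an illustration, not the Narsky implementation), with labels y in {-1, +1}:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=50):
    """y in {-1, +1}. Returns (trees, alphas) for a weighted vote."""
    w = np.full(len(y), 1.0 / len(y))            # start from uniform weights
    trees, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()                 # weighted misclassification rate
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        # Enhance weights of misclassified events, reduce the rest:
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        trees.append(stump)
        alphas.append(alpha)
    return trees, alphas

def predict(trees, alphas, X):
    """Weighted vote of the sequentially trained weak classifiers."""
    return np.sign(sum(a * t.predict(X) for t, a in zip(trees, alphas)))
```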

6 AdaBoost Decision Tree: training on "real data"

Visual inspection shows a significant improvement over the neural network performance. Need to retrain after randomizing the momentum distributions and with additional input variables.

7 Performance in selected momentum bins

- Low momentum: 0.3 < P < 0.5 GeV/c
- dE/dx - DRC transition region: 0.8 < P < 1.0 GeV/c
- Intermediate range: 1.9 < P < 2.1 GeV/c
- High momentum: 3.0 < P < 3.2 GeV/c
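Per-bin numbers like these follow from slicing on track momentum before computing the efficiency; a small sketch using the bin edges above (array names hypothetical):

```python
import numpy as np

BINS = [(0.3, 0.5), (0.8, 1.0), (1.9, 2.1), (3.0, 3.2)]  # GeV/c, from this slide

def efficiency_by_bin(p, is_kaon, passed):
    """p: track momenta; is_kaon: truth flags; passed: selector decisions."""
    for lo, hi in BINS:
        sel = (p > lo) & (p < hi) & is_kaon
        eff = passed[sel].mean() if sel.any() else float("nan")
        print(f"{lo:.1f} < P < {hi:.1f} GeV/c: efficiency = {eff:.3f}")
```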

8 Things to do ....

- Randomize the momentum distributions of signal and background events before training (see the sketch after this list).
- Add additional discriminating input variables:
  - # signal and bkgd. Cherenkov photons in the ring
  - # total drift chamber hits and hits in the last 5 layers
  - # hits in the silicon detector
  - ...... other suggestions!
- Add other background categories: proton, .....
- Finalize the cuts and implement the selectors:
  - it will be a single family of selectors: no separate selectors for "good" and "poor" quality tracks
  - should we still call it a KNN selector?
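One common way to realize the first item is to reweight one sample so its momentum spectrum matches the other's, preventing the classifier from keying on momentum alone; a sketch under that assumption (the procedure actually intended was not specified, and the binning range here is made up):

```python
import numpy as np

def momentum_weights(p_bkg, p_sig, n_bins=40, p_range=(0.2, 4.0)):
    """Per-event weights that reshape the background momentum spectrum
    to match the signal one before training."""
    hs, edges = np.histogram(p_sig, bins=n_bins, range=p_range, density=True)
    hb, _ = np.histogram(p_bkg, bins=n_bins, range=p_range, density=True)
    ratio = np.divide(hs, hb, out=np.zeros_like(hs), where=hb > 0)
    idx = np.clip(np.digitize(p_bkg, edges) - 1, 0, n_bins - 1)
    return ratio[idx]
```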

9 Summary

Significant efforts are underway to develop a "new version" of the KNN selectors. The goal is to develop a powerful non-LH kaon selector using the best-performing classifier (or a combination of classifiers). Such a selector is expected to replace the current KNN selectors for B-tagging purposes and should be a meaningful alternative to the LH selectors for physics analyses.

