Analysis Meeting, 24 Oct 05, T. Burnett: UW classification: new background rejection trees

Presentation transcript:

Slide 1: UW classification: new background rejection trees

Slide 2: The eight selections (from Bill)

Slide 3: The prefilter cuts

Gamma classification category: prefilter (remove the event if true)
  vertex-high    AcdActiveDist > -10 | CalTrackAngle > 0.5 | CalTrackDoca > 40
  vertex-med     AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 200
  vertex-thin    AcdActiveDist > -199 | AcdRibbonActDist >
  vertex-thick   AcdUpperTileCount > 0 | AcdLowerTileCount > 1 | AcdRibbonActDist >
  track-high     CalTrackDoca > 30 | CalTrackAngle > 0.3
  track-med      AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 40 | CalTrackAngle > 0.5 | CalXtalRatio > 0.85
  track-thin     AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 200 | EvtECalTransRms < 0.8
  track-thick    AcdActiveDist > -199 | AcdRibbonActDist > | AcdDoca 200 | EvtECalTransRms > 2.5 | CalMaxXtalRatio > 0.8 | Tkr1FirstChisq > 2.5 | Tkr1ToTTrAve > 2
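As a concrete illustration, here is a minimal C++ sketch of how one of these prefilters could be evaluated on an event's tuple variables, using the vertex-high row of the table; the struct and function names are invented for the example, and only the variable names and cut values come from the slide.

// Minimal sketch (not the actual merit code): apply the vertex-high
// prefilter from the table above to one event's tuple variables.
// Struct and function names are illustrative.
struct TupleRow {
    double AcdActiveDist;
    double CalTrackAngle;
    double CalTrackDoca;
};

// Return true if the event should be removed before tree evaluation.
bool vertexHighPrefilter(const TupleRow& row) {
    return row.AcdActiveDist > -10.0
        || row.CalTrackAngle  > 0.5
        || row.CalTrackDoca   > 40.0;
}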

Slide 4: Implementation in merit

Each tree is described by two files:
  - dtree.txt: an ASCII file with a list of weighted trees and their nodes:
      tree:   the weight to assign to the tree
      branch: variable index and cut value
      leaf:   purity
  - variables.txt: a list of the corresponding tuple variables
Evaluation is done by passing a vector of floats, ordered according to the variable list (a sketch follows below).
Proposal: incorporate the prefilter cut into the tree description file structure.
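The slide only names what the two files contain; the following is a hedged sketch of how such a description could be held in memory and evaluated, assuming a binary layout in which a value at or below the cut goes to the left daughter and the combined score is a weighted average of the leaf purities (neither convention is stated on the slide). All type and function names are invented for the example.

#include <vector>

// Illustrative in-memory layout for one tree read from dtree.txt:
// an internal node carries a variable index and cut value, a leaf
// carries a purity. The left/right convention is an assumption.
struct Node {
    bool   isLeaf;
    int    varIndex;    // index into the variables.txt ordering
    double cutValue;    // branch: cut on vars[varIndex]
    double purity;      // leaf: purity
    int    left, right; // indices of the daughter nodes
};

struct Tree {
    double weight;            // "tree: specify the weight to assign to the tree"
    std::vector<Node> nodes;  // node 0 is the root
};

// Walk one tree for a vector of floats ordered as in variables.txt.
double evaluateTree(const Tree& tree, const std::vector<float>& vars) {
    int i = 0;  // start at the root
    while (!tree.nodes[i].isLeaf) {
        const Node& n = tree.nodes[i];
        i = (vars[n.varIndex] <= n.cutValue) ? n.left : n.right;
    }
    return tree.nodes[i].purity;
}

// Combine the weighted trees into a single score (assumed weighted average).
double evaluate(const std::vector<Tree>& trees, const std::vector<float>& vars) {
    double sum = 0, wsum = 0;
    for (const Tree& t : trees) {
        sum  += t.weight * evaluateTree(t, vars);
        wsum += t.weight;
    }
    return wsum > 0 ? sum / wsum : 0;
}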

Slide 5: Training details

  - Weight signal and background so that their totals are the same
  - Train on the EVEN events, with optional boosting (see the sketch below)
  - Test with the ODD events
  - Save the training and testing efficiency curves
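A minimal sketch of the reweighting and even/odd split described above; the Event struct, its fields, and the use of event-number parity are illustrative stand-ins for the actual tuple interface.

#include <vector>

// Hedged sketch of the training setup: equalize the total signal and
// background weights, then split into EVEN (training) and ODD (testing)
// events. The Event struct is illustrative, not the real merit tuple.
struct Event { long id; bool signal; double weight; };

void prepareSamples(std::vector<Event>& events,
                    std::vector<Event>& train,
                    std::vector<Event>& test) {
    double wSig = 0, wBkg = 0;
    for (const Event& e : events) (e.signal ? wSig : wBkg) += e.weight;

    // Scale the background weights so signal and background totals match.
    const double scale = (wBkg > 0) ? wSig / wBkg : 1.0;
    for (Event& e : events) {
        if (!e.signal) e.weight *= scale;
        ((e.id % 2 == 0) ? train : test).push_back(e);
    }
}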

Slide 6: Boosting: what does it do?

  - Nice interpolation for low background
  - Not much improvement in the actual separation (so far)
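The slides do not say which boosting prescription is used; for reference only, the standard discrete AdaBoost scheme reweights the training events after each tree and combines the trees with weights derived from their error rates:

\begin{align*}
  \varepsilon_k &= \frac{\sum_{i\,\text{misclassified by tree }k} w_i}{\sum_i w_i}
     && \text{weighted error of tree } k \\
  \alpha_k &= \tfrac{1}{2}\ln\frac{1-\varepsilon_k}{\varepsilon_k}
     && \text{weight given to tree } k \\
  w_i &\to w_i\,e^{+\alpha_k}\ \text{(misclassified)},\quad
     w_i \to w_i\,e^{-\alpha_k}\ \text{(correct)},\ \text{then renormalize} \\
  F(x) &= \sum_k \alpha_k\,T_k(x)
     && \text{boosted score from the tree outputs } T_k(x) = \pm 1
\end{align*}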

Slide 7: Preliminary single-tree background

Slide 8: What about the energy resolution? Validity fractions

Slide 9: The fraction of time each estimate is best