Analysis Meeting, 31 Oct 05, T. Burnett
Classification, etc. status: implement pre-filters; energy estimation

Presentation transcript:

Slide 1: Classification, etc. status
- Implement pre-filters
- Energy estimation

Slide 2: Prefilters
All of the Atwood background rejection trees require prefilters:
- simplifies the trees
- needed to implement the IM trees
Reject if true: AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 40 | CalTrackAngle > .5 | CalXtalRatio > .85
(shown: one of eight, the track-med case)
Apply the classification tree only to events passing the filter.
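The "reject if true" logic above is an OR of cuts. A minimal sketch, using the cut values that are legible on the slide (the event-dict layout is an assumption):

```python
# Sketch of the slide's "reject if true" prefilter as a boolean OR of cuts.
# Variable names and values are the legible ones from the track-med case;
# the event representation (a dict) is an assumption for illustration.

def reject_event(evt):
    """Return True if any prefilter cut fires (event is rejected)."""
    return (evt["AcdActiveDist"] > -199
            or evt["CalTrackDoca"] > 40
            or evt["CalTrackAngle"] > 0.5
            or evt["CalXtalRatio"] > 0.85)

# Only events passing the prefilter reach the classification tree.
events = [
    {"AcdActiveDist": -300, "CalTrackDoca": 10, "CalTrackAngle": 0.2, "CalXtalRatio": 0.5},
    {"AcdActiveDist": -300, "CalTrackDoca": 90, "CalTrackAngle": 0.2, "CalXtalRatio": 0.5},
]
passing = [e for e in events if not reject_event(e)]
```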

Slide 3: Implemented by a simple file
filter.txt has this text (each statement must be true):
  AcdActiveDist <
  AcdRibbonActDist <
  CalTrackDoca < 40
  CalTrackAngle < .5
  CalXtalRatio < .85
This is converted to a tree, and the same filter is applied in training, testing, and the final analysis.
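A hedged sketch of how such a filter.txt of "variable op value" statements (each of which must be true) could be parsed and applied; any file-format detail beyond the slide's description is an assumption:

```python
import operator

# Parse one "var op value" cut per line; an event passes only if every
# statement is true. The exact file grammar is an assumption.

OPS = {"<": operator.lt, ">": operator.gt}

def parse_filter(text):
    """Return a list of (variable, comparison, value) cuts."""
    cuts = []
    for line in text.strip().splitlines():
        var, op, val = line.split()
        cuts.append((var, OPS[op], float(val)))
    return cuts

def passes(evt, cuts):
    """True only if all statements hold for this event."""
    return all(op(evt[var], val) for var, op, val in cuts)

filter_txt = """\
CalTrackDoca < 40
CalTrackAngle < 0.5
CalXtalRatio < 0.85"""
cuts = parse_filter(filter_txt)
```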

Slide 4: The gamma pre-filter cuts
Gamma classification category / prefilter (remove if true):
  vertex-high:  AcdActiveDist > -10 | CalTrackAngle > .5 | CalTrackDoca > 40
  vertex-med:   AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 200
  vertex-thin:  AcdActiveDist > -199 | AcdRibbonActDist >
  vertex-thick: AcdUpperTileCount > 0 | AcdLowerTileCount > 1 | AcdRibbonActDist >
  track-high:   CalTrackDoca > 30 | CalTrackAngle > .3
  track-med:    AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 40 | CalTrackAngle > .5 | CalXtalRatio > .85
  track-thin:   AcdActiveDist > -199 | AcdRibbonActDist > | CalTrackDoca > 200 | EvtECalTransRms < .8
  track-thick:  AcdActiveDist > -199 | AcdRibbonActDist > | AcdDoca 200 | EvtECalTransRms > 2.5 | CalMaxXtalRatio > .8 | Tkr1FirstChisq > 2.5 | Tkr1ToTTrAve > 2
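The per-category cuts above can be held as a lookup of predicates. A sketch, shown only for the two categories whose cut values are fully legible on the slide (the event-dict layout is an assumption):

```python
# Per-category "remove if true" prefilters as predicates. Only the two
# fully-legible rows of the slide's table are encoded here; the event
# representation is an assumption for illustration.

PREFILTERS = {
    "vertex-high": lambda e: (e["AcdActiveDist"] > -10
                              or e["CalTrackAngle"] > 0.5
                              or e["CalTrackDoca"] > 40),
    "track-high": lambda e: (e["CalTrackDoca"] > 30
                             or e["CalTrackAngle"] > 0.3),
}

def keep(category, evt):
    """Keep the event only if its category's prefilter does not fire."""
    return not PREFILTERS[category](evt)
```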

Slide 5: New energy trees
Need to implement this sequence, described in detail by Bill last August:
- need 4 trees, each evaluating the likelihood that one of the energy estimation methods is best
- the method with the highest resulting probability is chosen, and the energy estimate is taken from that method
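The selection step amounts to an argmax over the four tree outputs. A sketch, where the method names are illustrative placeholders, not the actual LAT method names:

```python
# Each of the 4 trees returns the probability that its energy method is best;
# the estimate from the highest-probability method is used. Method names
# below are hypothetical placeholders.

def best_energy(estimates, probabilities):
    """Pick the method with the highest tree probability; return (name, energy)."""
    best = max(probabilities, key=probabilities.get)
    return best, estimates[best]

estimates = {"param": 980.0, "profile": 1020.0, "lastlayer": 940.0, "tracker": 1100.0}
probabilities = {"param": 0.2, "profile": 0.6, "lastlayer": 0.1, "tracker": 0.1}
```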

Slide 6: New folder setup
Each tree is described by three files in its folder:
- dtree.txt: ASCII file with a list of weighted trees and nodes:
  - tree: the weight to assign to the tree
  - branch: variable index, cut value
  - leaf: purity
- variables.txt: list of the corresponding tuple variables
- filter.txt: optional filter
Evaluation is done by passing a vector of floats, ordered according to the variable list.
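The evaluation described above can be sketched as follows. Branches carry a variable index and cut value, leaves carry a purity, and each tree has a weight; the nested-dict node layout and the weighted-mean combination are assumptions, only the file content is from the slide:

```python
# Walk each tree with a float vector ordered as in variables.txt and combine
# the leaf purities with the per-tree weights. The node layout (nested dicts)
# is an assumption for illustration.

def eval_tree(node, x):
    """Descend one tree: go left if x[var] < cut, else right; return leaf purity."""
    while "purity" not in node:
        node = node["left"] if x[node["var"]] < node["cut"] else node["right"]
    return node["purity"]

def eval_forest(trees, x):
    """trees: list of (weight, root) pairs; return the weighted mean leaf purity."""
    total = sum(w for w, _ in trees)
    return sum(w * eval_tree(root, x) for w, root in trees) / total

tree = {"var": 0, "cut": 40.0,
        "left": {"purity": 0.9},
        "right": {"purity": 0.2}}
```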

Slide 7: Preliminary look at UW energy analysis trees
Single trees, same variables, same "good" criterion as Bill.

Slide 8: Dispersion study, 0.25 cut

Slide 9: Dispersion study, 0.50 cut

Slide 10: Dispersion study, 0.75 cut

Slide 11: Conclusion
The best cut is probably energy-dependent.
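If the best cut does turn out to be energy-dependent, one simple scheme is a per-energy-bin lookup. An illustrative sketch only; the bin edges and cut values below are hypothetical, not taken from the slides:

```python
import bisect

# Hypothetical energy-binned cut table. Neither the edges nor the cut
# values come from the slides; they only illustrate the lookup scheme.
EDGES = [100.0, 1000.0, 10000.0]   # MeV, hypothetical bin edges
CUTS = [0.25, 0.50, 0.75, 0.75]    # one probability cut per energy bin

def cut_for_energy(e_mev):
    """Return the probability cut for the bin containing e_mev."""
    return CUTS[bisect.bisect_right(EDGES, e_mev)]
```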