Bill Atwood, August, 2003 GLAST 1
Covariance & GLAST

Agenda:
Review of Covariance
Application to GLAST
Kalman Covariance
Present Status

Bill Atwood, August, 2003 GLAST 2
Review of Covariance

Take a circle and scale the x & y axes by the semi-axes a and b to get an ellipse: $x^2/a^2 + y^2/b^2 = 1$. Then rotate by $\phi$. Results: rotations mix x & y. The major & minor axes plus the rotation angle $\phi$ give a complete description.

The error ellipse is described by the covariance matrix

  $C = \begin{pmatrix} \sigma_x^2 & \sigma_{xy} \\ \sigma_{xy} & \sigma_y^2 \end{pmatrix}$

The distance between a point with an error and another point, measured in $\sigma$'s:

  $n_\sigma^2 = \Delta x^T C^{-1} \Delta x$, where $\Delta x = x - x_0$ and $C$ is the covariance matrix above.

This is simply weighting the distance by $1/\sigma^2$.
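
A minimal numerical sketch of this $\sigma$-weighted distance (the Mahalanobis distance), assuming NumPy; the covariance values are invented for illustration:

```python
import numpy as np

def n_sigma(x, x0, C):
    """Distance between points x and x0 in units of sigma:
    n_sigma^2 = (x - x0)^T C^{-1} (x - x0)."""
    dx = np.asarray(x, dtype=float) - np.asarray(x0, dtype=float)
    return np.sqrt(dx @ np.linalg.solve(C, dx))

# Illustrative 2-D covariance matrix (sigma_x^2, sigma_xy, sigma_y^2)
C = np.array([[0.25, 0.15],
              [0.15, 0.36]])
print(n_sigma([1.0, 0.5], [0.0, 0.0], C))
```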

Bill Atwood, August, 2003 GLAST 3
Review 2

Multiplying it out gives

  $n_\sigma^2 = \dfrac{\sigma_y^2 x^2 - 2\sigma_{xy}\,x y + \sigma_x^2 y^2}{\sigma_x^2 \sigma_y^2 - \sigma_{xy}^2}$

where I take $x_0 = 0$ without loss of generality. This is the equation of an ellipse! Specifically, for the $1\sigma$ error ellipse ($n_\sigma = 1$) we identify

  $\sigma_x^2 = a^2\cos^2\phi + b^2\sin^2\phi$ and $\sigma_y^2 = a^2\sin^2\phi + b^2\cos^2\phi$,

where $\sigma_{xy} = (a^2 - b^2)\sin\phi\cos\phi$.

Summary: the inverse of the covariance matrix describes an ellipse whose major and minor axes and rotation angle map directly onto its components! And the correlation coefficient is defined as $\rho = \sigma_{xy}/(\sigma_x \sigma_y)$.
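
Going the other way, the eigen-decomposition of $C$ recovers $(a, b, \phi)$: the eigenvalues are $a^2, b^2$ and the major-axis eigenvector gives $\phi$. A sketch with invented numbers:

```python
import numpy as np

def ellipse_params(C):
    """Semi-axes (a, b) and rotation angle phi of the 1-sigma error
    ellipse: the eigenvalues of C are a^2, b^2 and the major-axis
    eigenvector gives phi."""
    vals, vecs = np.linalg.eigh(C)             # eigenvalues in ascending order
    b, a = np.sqrt(vals)
    phi = np.arctan2(vecs[1, 1], vecs[0, 1])   # major-axis eigenvector
    return a, b, phi

C = np.array([[0.25, 0.15],
              [0.15, 0.36]])
a, b, phi = ellipse_params(C)
rho = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])     # correlation coefficient
print(a, b, phi, rho)
```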

Bill Atwood, August, 2003 GLAST 4
Review 3

Let the fun begin! To disentangle the two descriptions, consider the axis asymmetry

  $\dfrac{\sigma_x^2 - \sigma_y^2}{\sigma_x^2 + \sigma_y^2} = \dfrac{r^2 - 1}{r^2 + 1}\cos 2\phi$, where $r = a/b$

(illustrated for $\phi = 0,\ \pi/4,\ \pi/2,\ 3\pi/4$). Also, $\det(C)$ yields (with a little algebra & trig.):

  $\det(C) = \sigma_x^2\sigma_y^2 - \sigma_{xy}^2 = a^2 b^2$, thus $ab = \sqrt{\det(C)}$.

Now we're ready to look at results from GLAST!
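
A quick numerical check of these identities, building $C = R(\phi)\,\mathrm{diag}(a^2, b^2)\,R(\phi)^T$ and verifying $\det(C) = a^2 b^2$ and the asymmetry relation:

```python
import numpy as np

def cov_from_ellipse(a, b, phi):
    """Covariance matrix of an ellipse with semi-axes a, b rotated by phi:
    C = R(phi) diag(a^2, b^2) R(phi)^T."""
    c, s = np.cos(phi), np.sin(phi)
    R = np.array([[c, -s], [s, c]])
    return R @ np.diag([a**2, b**2]) @ R.T

a, b, phi = 2.0, 1.0, np.pi / 6
r = a / b
C = cov_from_ellipse(a, b, phi)

# det(C) = a^2 b^2, independent of the rotation angle
assert np.isclose(np.linalg.det(C), a**2 * b**2)

# axis asymmetry: (Cxx - Cyy)/(Cxx + Cyy) = (r^2 - 1)/(r^2 + 1) cos(2 phi)
asym = (C[0, 0] - C[1, 1]) / (C[0, 0] + C[1, 1])
assert np.isclose(asym, (r**2 - 1) / (r**2 + 1) * np.cos(2 * phi))
```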

Bill Atwood, August, 2003 GLAST 5
Covariance Matrix from Kalman Filter

Results shown binned in $\cos(\theta)$ and $\log_{10}(E_{MC})$. Recall, however, that the Kalman Filter gives us C in terms of the track slopes Sx and Sy.

[Plots: AxisAsym grows like $1/\cos^2(\theta)$; peak amplitude ~0.4]

Bill Atwood, August, 2003 GLAST 6
Relationship between Slopes and Angles

For functions of the estimated variables the usual prescription is

  $\sigma_f^2 = \sum_i \left(\partial f/\partial x_i\right)^2 \sigma_i^2$

when the errors are uncorrelated. For correlated errors this becomes

  $\sigma_f^2 = \sum_{i,j} \dfrac{\partial f}{\partial x_i}\, C_{ij}\, \dfrac{\partial f}{\partial x_j}$

and reduces to the uncorrelated case when $C_{ij} = \sigma_i^2 \delta_{ij}$.

The functions of interest here are

  $\tan\phi = S_y/S_x$ and $\tan\theta = \sqrt{S_x^2 + S_y^2}$.

A bit of math then shows that

  $\sigma_\theta^2 = \dfrac{\cos^4\theta}{S_x^2 + S_y^2}\left(S_x^2 C_{xx} + 2 S_x S_y C_{xy} + S_y^2 C_{yy}\right)$

and

  $\sigma_\phi^2 = \dfrac{S_y^2 C_{xx} - 2 S_x S_y C_{xy} + S_x^2 C_{yy}}{(S_x^2 + S_y^2)^2}$,

where $C_{xx} \equiv C_{S_x S_x}$, etc.
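
A sketch of this propagation in code: form the Jacobian of $(\theta, \phi)$ with respect to $(S_x, S_y)$ and compute $J C J^T$; the slope values and covariance are invented for illustration:

```python
import numpy as np

def angle_errors(Sx, Sy, C_slopes):
    """Propagate a 2x2 slope covariance to (sigma_theta, sigma_phi)
    via sigma_f^2 = J C J^T, for theta = atan(sqrt(Sx^2 + Sy^2)) and
    phi = atan2(Sy, Sx)."""
    t2 = Sx**2 + Sy**2
    t = np.sqrt(t2)
    cos2 = 1.0 / (1.0 + t2)                           # cos^2(theta)
    J = np.array([[Sx * cos2 / t, Sy * cos2 / t],     # d(theta)/d(Sx, Sy)
                  [-Sy / t2,      Sx / t2]])          # d(phi)/d(Sx, Sy)
    cov = J @ C_slopes @ J.T
    return np.sqrt(cov[0, 0]), np.sqrt(cov[1, 1])

C_slopes = np.array([[1.0e-4, 2.0e-5],
                     [2.0e-5, 1.5e-4]])               # illustrative numbers
sigma_theta, sigma_phi = angle_errors(0.3, 0.2, C_slopes)
print(sigma_theta, sigma_phi)
```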

Bill Atwood, August, 2003 GLAST 7
Angle Errors from GLAST

$\sigma_\phi$ has a divergence at $\theta = 0$; however, $\sigma_\phi \sin(\theta)$ cures this. $\sigma_\theta$ decreases as $\cos^2(\theta)$, while $\sigma_\phi \sin(\theta)$ increases. We also expect the components of the covariance matrix to increase as $1/E^2$ due to the dominance of multiple scattering ($\theta_{MS} \propto 1/(p\beta)$).

Plot measured residuals in terms of fit $\sigma$'s (e.g. $\Delta\theta/\sigma_\theta$), binned in $\log_{10}(E_{MC})$ and $\cos(\theta)$.
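
One way to read "residuals in terms of fit $\sigma$'s" is as a pull distribution: divide each residual by its per-event fit error and examine the width. A hypothetical sketch, with made-up arrays standing in for the real event data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_events = 10000

# Stand-ins for real per-event quantities: the Kalman-fit error estimate
# and the measured residual (true minus reconstructed theta).
sigma_fit = np.full(n_events, 0.010)
theta_residual = rng.normal(0.0, 0.023, size=n_events)

pulls = theta_residual / sigma_fit
print(np.std(pulls))   # a width near 2.3 means sigma_fit is underestimated
```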

Bill Atwood, August, 2003 GLAST 8
Angle Errors 2

What's RIGHT:
1) The cos(θ) dependence
2) The energy dependence in the multiple-scattering dominated range

What's WRONG:
1) The overall normalization of the estimated errors (σ_FIT): off by a factor of ~2.3!
2) The energy dependence as measurement errors begin to dominate; the discrepancy goes away(?)

Both of these correlate with the fact that the fitted χ²'s are much larger than 1 at low energy (expected?). (If the errors are underestimated by a factor k, χ² scales as k²; 2.3² ≈ 5.3.)

How well does the Kalman Fit PSF model the event-to-event PSF?

Bill Atwood, August, 2003 GLAST 9
Angle Errors 3

Comparison of the event-by-event PSF vs. the fit-parameter PSF (both energy compensated). Difficult to assess the level of correlation: probably not zero, and approximately the same factor of 2.3.

Bill Atwood, August, 2003 GLAST 10
Angle Errors: Conclusions

1) Analysis of the covariance matrix gives a format for modeling the instrument response.
2) Predictive power of the Kalman Fit? Off by a factor of ~2.3; may prove a good handle for CT-tree determination of the "Best PSF".

Bill Atwood, August, 2003 GLAST 11
Present Analysis Status

Bill Atwood, August, 2003 GLAST 12
Present Status 2: BGE Rejection

Events left after Good-Energy selection: 1904
Events left in VTX classes: 26
Events left in 1Tkr classes: 1878

CT BGE rejection factors obtained: 20:1 (1Tkr), 2:1 (VTX)

NEED A FACTOR OF 10X MORE EVENTS BEFORE PROGRESS CAN BE MADE!