Physics 114: Exam 2 Review Material from Weeks 7-11


John Federici, NJIT Physics Department

Concepts Covered on the Exam
The concepts covered on the exam are:
- Central Limit Theorem, averages of averages
- Combining data with different standard deviations
- Confidence intervals, confidence level
- SIGNIFICANT DIGITS in a quoted error
- Chi-squared test, goodness of fit
- Weighted mean and error
- Least-squares fitting: minimizing chi-square, linear least-squares fitting
- Degrees of freedom
- Fitting a polynomial
- Linearization of the fitting equation

Suggested Materials to Review
Review the lecture notes for weeks 7-11. Pay attention to CONCEPTS and to the specific examples given in class. There are no questions on Matlab code; HOWEVER, you should be able to interpret the "results" window of the Curve Fitting App.
Review HW#7, Problems 1, 2, and 3. NOTE: if you are given a problem similar to Problem 3, the number of data points will be small enough that you can use the EQUATIONS at the end of the exam to calculate your answers.
Review HW#8, Problems 1 and 3.
Review HW#9, Problems 1 and 2.
Review the concepts of HW#11, Problem 2.

Mean of Means and Standard Error
One can join multiple sets of measurements to refine both the estimated value (mean of means) and the standard error (standard deviation of the mean). The mean of means is given by

\mu = \frac{\sum_i (x_i / \sigma_i^2)}{\sum_i (1 / \sigma_i^2)},

where the x_i are the individual measurements of the mean. If the standard deviations of the measurements are all the same (σ_i = σ), they cancel and we have the usual

\mu = \frac{1}{N} \sum_i x_i.

Likewise, the rule for combining data sets with different errors is

\frac{1}{\sigma_\mu^2} = \sum_i \frac{1}{\sigma_i^2},

and for equal errors this reduces to

\sigma_\mu = \frac{\sigma}{\sqrt{N}}.

This last is a key result to remember: combining measurements reduces the standard deviation by the square root of the number of measurements. Example: x = [10.7, 7.2, 11.2, 9.9, 11.3], σ = [2, 2, 2, 1.5, 2.5]. Ans: 10.0±1.4.
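As a numerical check of the example, here is a minimal Matlab sketch (the variable names are mine). The weighted mean indeed comes out near 10.0; note that the quoted ±1.4 matches the weighted standard deviation of the five values, while the standard error of the weighted mean itself is smaller, about 0.86:

    x = [10.7, 7.2, 11.2, 9.9, 11.3];
    s = [2, 2, 2, 1.5, 2.5];
    w = 1 ./ s.^2;                   % weights = inverse variances
    mu = sum(w .* x) / sum(w)        % weighted mean: ~10.0
    sig_spread = sqrt(sum(w .* (x - mu).^2) / sum(w))  % weighted std of the values: ~1.4
    sig_mu = 1 / sqrt(sum(w))        % standard error of the weighted mean: ~0.86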

Weighted Mean and Error
Perhaps the errors themselves are not known, but the relative weighting of the measurements is known. For example, say you want to combine means taken with different numbers of measurements (or different integration times). Defining the weights as inversely proportional to the variances, w_i = k / \sigma_i^2, the proportionality constant k cancels and we have

\bar{x} = \frac{\sum_i w_i x_i}{\sum_i w_i}.

We can then define an average standard deviation from the scatter of the measurements:

\bar{\sigma}^2 = \frac{\sum_i w_i (x_i - \bar{x})^2}{\sum_i w_i}.

After obtaining that average standard deviation, the standard error (standard deviation of the mean) is, as before, decreased by the square root of the number of measurements:

\sigma_\mu = \frac{\bar{\sigma}}{\sqrt{N}}.
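For instance, if three means were taken with 5, 10, and 20 measurements respectively, the weights can be taken proportional to those counts. A hypothetical illustration in Matlab (the data values are made up):

    x = [4.8, 5.1, 5.0];             % three means (made-up values)
    n = [5, 10, 20];                 % measurements behind each mean
    w = n;                           % weights proportional to n; the scale k cancels
    xbar = sum(w .* x) / sum(w)      % weighted mean
    sigbar = sqrt(sum(w .* (x - xbar).^2) / sum(w))   % average standard deviation
    sig_mu = sigbar / sqrt(numel(x))                  % standard error of the mean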

Probability Distribution
The Gaussian distribution (bell curve) gives the expected distribution of measurements about the mean, which can be interpreted as a probability. Thus, ~68% of measurements should fall within 1σ of the mean, i.e. in the interval \bar{x} - \sigma < x < \bar{x} + \sigma. Likewise, ~95% of measurements should fall within 2σ of the mean. In science, it is expected that errors are given in terms of ±1σ. Thus, stating a result as 3.4±0.2 means that 68% of values fall between 3.2 and 3.6. In some disciplines it is common instead to state 90% or 95% confidence intervals (1.64σ or 2σ). At the 90% confidence level, the same measurement would be stated as 3.4±0.33 (since 1.64 × 0.2 ≈ 0.33). To avoid confusion, one should write 3.4±0.33 (90% confidence level).
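The confidence multipliers come from the inverse of the Gaussian cumulative distribution; a quick Matlab check using only the base function erfinv:

    CL = 0.90;                   % desired two-sided confidence level
    z = sqrt(2) * erfinv(CL)     % multiplier: 1.645 for 90%, 1.960 for 95%
    halfwidth = z * 0.2          % ~0.33 for the 3.4 +/- 0.2 example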

Chi-Square Probability
Chi-square is a criterion for the goodness of fit of a function y(x), and is defined as

\chi^2 = \sum_{i=1}^{N} \left( \frac{y_i - y(x_i)}{\sigma_i} \right)^2.

In other words, it is just the sum of the squared deviations of the points from the function, normalized by the variances. When the fit is good, we normally expect the squared deviations to average around σ², so each term is about 1 and the total chi-square is about equal to ν, the number of degrees of freedom. For the special case of a linear fit to a set of points, y(x) = a + b x,

\chi^2 = \sum_{i=1}^{N} \left( \frac{y_i - a - b x_i}{\sigma_i} \right)^2.

We can find the best-fit straight line by minimizing chi-square. Generally, we can find the best fit of any function by replacing y(x) with the equation representing that function.
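To make the definition concrete, a short Matlab sketch that evaluates χ² and the reduced χ² for a trial straight line (the data values are made up):

    x = (1:5)';
    y = [2.1; 3.9; 6.2; 7.8; 10.1];   % hypothetical measurements
    sig = 0.3 * ones(size(y));        % assumed equal errors
    a = 0; b = 2;                     % trial line y(x) = a + b*x
    chi2 = sum(((y - (a + b*x)) ./ sig).^2)
    nu = numel(y) - 2;                % degrees of freedom for a two-parameter fit
    chi2_nu = chi2 / nu               % reduced chi-square, ~1 for a good fit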

Linear Least Squares Fitting
Minimizing chi-square, we found that we could solve for the parameters a and b that minimize the difference between the fitted line and the data (with errors σ_i) as:

a = \frac{1}{\Delta} \left( \sum \frac{x_i^2}{\sigma_i^2} \sum \frac{y_i}{\sigma_i^2} - \sum \frac{x_i}{\sigma_i^2} \sum \frac{x_i y_i}{\sigma_i^2} \right), \qquad
b = \frac{1}{\Delta} \left( \sum \frac{1}{\sigma_i^2} \sum \frac{x_i y_i}{\sigma_i^2} - \sum \frac{x_i}{\sigma_i^2} \sum \frac{y_i}{\sigma_i^2} \right),

where

\Delta = \sum \frac{1}{\sigma_i^2} \sum \frac{x_i^2}{\sigma_i^2} - \left( \sum \frac{x_i}{\sigma_i^2} \right)^2.

In the case of equal errors the σ's cancel: drop them and replace each \sum 1/\sigma_i^2 with N. The uncertainties in the parameters are:

\sigma_a^2 = \frac{1}{\Delta} \sum \frac{x_i^2}{\sigma_i^2}, \qquad
\sigma_b^2 = \frac{1}{\Delta} \sum \frac{1}{\sigma_i^2}.
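These sums translate directly into Matlab; a minimal sketch with made-up data and per-point errors:

    x = [1; 2; 3; 4; 5];
    y = [2.2; 3.8; 6.1; 8.0; 9.9];    % hypothetical data
    sig = [0.3; 0.3; 0.3; 0.2; 0.4];  % per-point errors
    w = 1 ./ sig.^2;
    Delta = sum(w) * sum(w .* x.^2) - sum(w .* x)^2;
    a = (sum(w .* x.^2) * sum(w .* y) - sum(w .* x) * sum(w .* x .* y)) / Delta
    b = (sum(w) * sum(w .* x .* y) - sum(w .* x) * sum(w .* y)) / Delta
    sig_a = sqrt(sum(w .* x.^2) / Delta)   % uncertainty in the intercept a
    sig_b = sqrt(sum(w) / Delta)           % uncertainty in the slope b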

Matlab Commands
Remember, we also used the CURVE FITTING APP to fit data.
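As a reminder of the command-line route (a sketch, not necessarily the exact commands shown in lecture), an unweighted straight-line fit can be done with base Matlab's polyfit:

    x = [1 2 3 4 5];
    y = [2.2 3.8 6.1 8.0 9.9];     % hypothetical data
    p = polyfit(x, y, 1);          % straight-line fit: p(1) = slope, p(2) = intercept
    yfit = polyval(p, x);          % evaluate the fitted line at the data points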

Reduced Chi-Square
Recall that the value of χ² is

\chi^2 = \sum_{i=1}^{N} \left( \frac{y_i - y(x_i)}{\sigma_i} \right)^2.

It is often easier to consider the reduced chi-square, \chi_\nu^2 = \chi^2 / \nu, which is about unity for a good fit. If we compare points to a fit of a sine function, changing the parameters changes χ², and the minimum chi-square marks the best fit.

[Figures: χ² for the sine fit as the amplitude is changed, and as the frequency is changed.]
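The parameter scan can be sketched in a few lines of Matlab (all numbers here are hypothetical): compute χ² at each trial amplitude, holding the frequency fixed, and take the minimum.

    x = linspace(0, 2*pi, 50);
    rng(1);                                    % reproducible noise
    y = 3 * sin(2*x) + 0.5 * randn(size(x));   % hypothetical data, sigma = 0.5

    A = 2:0.05:4;                              % trial amplitudes, frequency fixed at 2
    chi2 = zeros(size(A));
    for k = 1:numel(A)
        chi2(k) = sum(((y - A(k) * sin(2*x)) / 0.5).^2);
    end
    [~, imin] = min(chi2);
    bestA = A(imin)                            % minimum-chi-square amplitude, near 3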

Degrees of Freedom ν
The number of degrees of freedom ν represents the number of ways in which things can be varied independently. It is generally the number of independent data points (the number of measurements), reduced by the number of parameters deduced from those measurements. Thus, if the data points are used to determine a mean, the data points can be varied, but they are constrained to have the given mean. This constraint must be subtracted from the number of points, so in this case the number of degrees of freedom is ν = N − 1. If we use the data points to define a line (i.e., solve for the two parameters a and b of the line), then ν = N − 2. You should learn to recognize the number of parameters needed to fully describe a function. The sine wave of the previous example can be adjusted in three ways (it has three parameters); we showed two (amplitude and frequency). Can you guess the third? For that fit, we have ν = N − 3. We need to know ν in order to use the reduced chi-square as a measure of when we have an acceptable fit.

Practice Problems
[The deck closes with several practice-problem slides: a practice problem worked across several slides (with continuations labeled "Sample Problem ... Cont.") and a second sample problem. The problem content was image-based and is not recoverable from this transcript.]