Modeling of Data

Basic Bayes theorem

Bayes theorem relates the conditional probabilities of two events A and B:

    P(A|B) = P(B|A) P(A) / P(B)

A might be a hypothesis and B might be some data, so that P(A|B) expresses the probability of a hypothesis, given the data.
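
A quick numeric illustration may help. This is a minimal Python sketch; the prior and likelihood values are made up for the example, not taken from the slides:

```python
# Worked instance of Bayes theorem: P(A|B) = P(B|A) P(A) / P(B),
# with A = "hypothesis is true" and B = "data were observed".
# All numbers are hypothetical, chosen only for illustration.
p_A = 0.01             # prior probability of the hypothesis
p_B_given_A = 0.95     # probability of the data if the hypothesis is true
p_B_given_notA = 0.05  # probability of the data if the hypothesis is false

# Law of total probability: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Posterior: probability of the hypothesis given the data
p_A_given_B = p_B_given_A * p_A / p_B
print(f"P(A|B) = {p_A_given_B:.3f}")  # ~0.161
```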

Least Squares as a Maximum Likelihood Estimator

Suppose we have N data points (x_i, y_i) and a model y(x) = y(x; a_1 … a_M) with M adjustable parameters. The goal is to fit the data points to the model.

Least Squares as a Maximum Likelihood Estimator

The model predicts a functional relationship between the measured independent and dependent variables. We want to find values of the parameters such that

    Σ_{i=1}^{N} [y_i − y(x_i; a_1 … a_M)]²

is minimized.

Chi Square fitting

Suppose that each data point y_i has a measurement error that is independently random and distributed as a normal (Gaussian) distribution around the true model y(x), and suppose that the standard deviations σ of these normal distributions are the same for all points. Then the probability of the data set is the product of the probabilities of each point:

    P ∝ ∏_{i=1}^{N} exp[ −(y_i − y(x_i))² / (2σ²) ] Δy     …(1)

Chi Square fitting

The most probable model, then, is the one that maximizes equation (1) or, equivalently, minimizes the negative of its logarithm:

    −ln P = Σ_{i=1}^{N} (y_i − y(x_i))² / (2σ²) − N ln Δy

Chi Square fitting

Since N, σ, and Δy are all constants, minimizing this expression is equivalent to minimizing

    Σ_{i=1}^{N} (y_i − y(x_i))²
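
To make the connection concrete, here is a minimal numpy/scipy sketch that minimizes the chi-square numerically, assuming a straight-line model and equal measurement errors; the data arrays and the choice of scipy.optimize.minimize are illustrative assumptions, not part of the slides:

```python
import numpy as np
from scipy.optimize import minimize

def model(x, params):
    # Example model y(x; a, b) = a + b*x; any y(x; a_1..a_M) works here.
    a, b = params
    return a + b * x

def chi_square(params, x, y, sigma):
    # chi^2 = sum_i ((y_i - y(x_i; params)) / sigma_i)^2
    return np.sum(((y - model(x, params)) / sigma) ** 2)

# Hypothetical data, for illustration only
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])
sigma = np.full_like(y, 0.3)  # equal errors, as assumed above

result = minimize(chi_square, x0=[0.0, 1.0], args=(x, y, sigma))
print(result.x)  # best-fit parameters (a, b)
```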

Fitting Data to a Straight Line

We fit a set of N data points (x_i, y_i) to a straight-line model

    y(x) = y(x; a, b) = a + b x

using the chi-square measure. Assume that the uncertainty σ_i associated with each measurement y_i is known. The quantity to minimize is

    χ²(a, b) = Σ_{i=1}^{N} ( (y_i − a − b x_i) / σ_i )²

Fitting Data to a Straight Line

Taking derivatives of χ²(a, b) with respect to a and b, and setting them to zero:

    ∂χ²/∂a = −2 Σ_{i=1}^{N} (y_i − a − b x_i) / σ_i² = 0
    ∂χ²/∂b = −2 Σ_{i=1}^{N} x_i (y_i − a − b x_i) / σ_i² = 0

Fitting Data to a Straight Line

Define these sums:

    S = Σ 1/σ_i²,   S_x = Σ x_i/σ_i²,   S_y = Σ y_i/σ_i²,
    S_xx = Σ x_i²/σ_i²,   S_xy = Σ x_i y_i/σ_i²

to simplify the equations to:

    a S + b S_x = S_y
    a S_x + b S_xx = S_xy

Fitting Data to a Straight Line

The solution of these two equations in the two unknowns a and b is:

    Δ = S S_xx − (S_x)²
    a = (S_xx S_y − S_x S_xy) / Δ
    b = (S S_xy − S_x S_y) / Δ
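
The closed-form solution translates directly into code. A sketch using the same sums; the function name and example data are mine, for illustration:

```python
import numpy as np

def fit_straight_line(x, y, sigma):
    """Weighted least-squares fit of y = a + b*x with known errors sigma."""
    w = 1.0 / sigma**2                            # weights 1/sigma_i^2
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x**2).sum(), (w * x * y).sum()
    delta = S * Sxx - Sx**2
    a = (Sxx * Sy - Sx * Sxy) / delta             # intercept
    b = (S * Sxy - Sx * Sy) / delta               # slope
    return a, b

# Hypothetical data, for illustration only
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])
sigma = np.full_like(y, 0.3)
print(fit_straight_line(x, y, sigma))             # best-fit (a, b)
```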

Estimate the probable uncertainties

Each data point contributes a bit of uncertainty to the parameters. By propagation of errors, the variance σ_f² in the value of any function f(y_1, …, y_N) is

    σ_f² = Σ_{i=1}^{N} σ_i² ( ∂f/∂y_i )²

Estimate the probable uncertainties

For the straight line, the derivatives of a and b with respect to y_i can be evaluated directly from the solution:

    ∂a/∂y_i = (S_xx − S_x x_i) / (σ_i² Δ)
    ∂b/∂y_i = (S x_i − S_x) / (σ_i² Δ)

Summing over the points gives:

    σ_a² = S_xx / Δ
    σ_b² = S / Δ
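
Extending the sketch above, the parameter uncertainties follow from the same sums; again a sketch under the slide's notation, not a library API:

```python
import numpy as np

def fit_uncertainties(x, sigma):
    """Standard errors of the fitted intercept a and slope b."""
    w = 1.0 / sigma**2
    S, Sx, Sxx = w.sum(), (w * x).sum(), (w * x**2).sum()
    delta = S * Sxx - Sx**2
    sigma_a = np.sqrt(Sxx / delta)   # uncertainty of the intercept a
    sigma_b = np.sqrt(S / delta)     # uncertainty of the slope b
    return sigma_a, sigma_b
```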