Lecture 5: Curve fitting by iterative approaches. MARINE QB III – Modelling Aquatic Rates In Natural Ecosystems, BIOL471. © 2001 School of Biological Sciences, University of Liverpool.

Curve fitting: the iterative approach. [Figure: ingestion rate plotted against prey density (H), fitted by the hyperbola c = H_max·H / (k + H).]

Linear Functions: general linear equations. Any straight line can be represented by the general linear equation y = mx + c, where the slope m = Δy/Δx = (y2 − y1)/(x2 − x1) (with x2 = x1 + Δx and y2 = y1 + Δy) and c is the intercept, the value of y where the line crosses the y-axis. In regression notation the same line is written Y = a + bX, with intercept a and slope b.

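A small worked example of these formulas may help; the Python snippet below uses two made-up points (not from the lecture) to compute the slope and intercept.

```python
# Hypothetical points (x1, y1) and (x2, y2), chosen only to illustrate the formulas.
x1, y1 = 2.0, 5.0
x2, y2 = 6.0, 13.0

m = (y2 - y1) / (x2 - x1)   # slope = rise over run = 8 / 4 = 2
c = y1 - m * x1             # intercept: rearrange y1 = m*x1 + c, giving 5 - 2*2 = 1

print(f"y = {m}x + {c}")    # y = 2.0x + 1.0
```
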
Regression Analysis: Y = a + bX. X is the independent variable, since its value is freely chosen; Y is the dependent variable, since its value depends on X.

Regression Analysis. Often X may be thought of as the cause (C) and Y as the effect (E) of that cause, so the model becomes E = a + bC. [Figure example: height climbed.]

Some basic algebra. Remember the equation of a straight line: Y = a + bX. When doing regression analysis, however, this becomes a statistical model: a way of estimating values of Y, given a value of X and the constants a and b. Because Y is estimated, we write Ŷ = a + bX, where a is the Y-intercept and b is the slope; b is also called the regression coefficient.

Model 1 Regression Analysis. In Model 1 regression:
1. X is measured without error
2. X measurements are independent
3. X is under the control of the investigator
4. For each value of X there is a population of Y-values, which are normally distributed
5. There is equal variance of Y at each X value
Note: in Model 2 regression both X and Y are random variables; we will not be discussing this.

The line of best fit. The figure now shows the line of best fit. The line is a model of the relationship between X and Y: we have selected our subjects with known values of X and then measured Y.

How do we select the line of best fit? We expect it to pass through the mean point (X̄, Ȳ). For any line, we could calculate the vertical deviations (d) of each point from that line. Squaring the deviations makes them positive, and summing them gives the sum of the squares of the deviations. The line of best fit is the one that minimises Σd².

Calculating values of a and b. Now we can write the regression equation. By definition, each deviation is d = Y − Ŷ. The regression equation is Ŷ = a + bX, so by substitution Σd² = Σ(Y − a − bX)². We need to obtain the values of a and b that give the minimum of this expression for the sum of squares of deviations from the fitted line. We solve this with differential calculus, to obtain b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)², and then a = Ȳ − bX̄.

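As an illustration of these formulas (not part of the original lecture), here is a minimal Python/NumPy sketch using made-up data:

```python
import numpy as np

# Made-up paired observations; any (X, Y) data set would do.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# b = sum((X - Xbar)(Y - Ybar)) / sum((X - Xbar)^2), then a = Ybar - b*Xbar
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

ss = np.sum((y - (a + b * x)) ** 2)   # the minimised sum of squared deviations
print(f"a = {a:.3f}, b = {b:.3f}, sum of squares = {ss:.3f}")
```
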
Calculating the line of best fit. That was one method of finding the line of best fit, called least-squares regression. It works because, using calculus, we can solve for b. However, there are some equations (non-linear ones) that we cannot solve this way. Instead we use another method: iterative fitting.

The line of best fit by iteration. Here are the steps:
1. Make an estimate of the parameters, in this case the slope (b) and the intercept (a)
2. Calculate the sum of squares of deviations from the fitted line
3. Record this value, and then try another pair of estimates of a and b
4. Calculate the sum of squares again, and repeat until you obtain the smallest sum of squares you can get
5. When the sum of squares is minimal, this is the best fit

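A crude Python sketch of this iterative idea, using a simple grid search over candidate values of a and b (the data and search ranges below are invented for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def sum_sq(a, b):
    # Step 2: sum of squared vertical deviations from the line Y = a + bX
    return np.sum((y - (a + b * x)) ** 2)

# Steps 1, 3 and 4: try many candidate (a, b) pairs and keep the best so far.
best_a, best_b, best_ss = 0.0, 0.0, np.inf
for a in np.linspace(-2, 2, 81):       # candidate intercepts
    for b in np.linspace(0, 4, 81):    # candidate slopes
        ss = sum_sq(a, b)
        if ss < best_ss:
            best_a, best_b, best_ss = a, b, ss

# Step 5: the pair with the smallest sum of squares is the best fit found.
print(f"a = {best_a:.2f}, b = {best_b:.2f}, sum of squares = {best_ss:.3f}")
```

Real curve-fitting software replaces the exhaustive grid with a smarter search (for example the Marquardt–Levenberg algorithm), but the principle of repeatedly evaluating the sum of squares until it stops improving is the same.
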
The line of best fit by iteration. This process may seem very laborious, but computers make it possible. Steps:
1. Look at the data and think about it
2. Decide if you need non-linear regression
3. Pick a mathematical model
4. Choose initial parameter values (although some programs do this for you)
5. Fit the curve to the data

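These steps can also be sketched in Python with scipy.optimize.curve_fit (an alternative to a dedicated package, not the lecture's own method); the functional-response data below are invented for illustration, and the model is the two-parameter hyperbola from the first slide:

```python
import numpy as np
from scipy.optimize import curve_fit

# Step 3: the mathematical model -- a 2-parameter hyperbola (functional response).
def hyperbola(H, H_max, k):
    return H_max * H / (k + H)

# Made-up data: prey density (x) and ingestion rate (y).
prey = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
ingestion = np.array([1.8, 3.1, 4.6, 6.0, 6.9, 7.4])

# Step 4: initial parameter guesses; Step 5: let the routine iterate to the best fit.
p0 = [ingestion.max(), np.median(prey)]
params, pcov = curve_fit(hyperbola, prey, ingestion, p0=p0)
print("H_max, k =", params)
```
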
You must satisfy these assumptions for iterative fitting:
1. X is measured without error
2. X is under the control of the investigator
3. X values are independent of each other
4. For each value of X there is a population of Y-values, which are normally distributed
5. There is equal variance of Y at each X value

The line of best fit by iteration. Next, ask yourself the following questions:
1. Does the curve go through the data (if you pick the wrong initial parameters it can all go pear-shaped)?
2. Are the best-fit parameters plausible (see above)?
3. How precise are the best-fit parameters (we will learn how to calculate precision in a minute)?
4. Would another model be more appropriate?
5. Have you violated any of the assumptions for iterative-fit regressions?

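Continuing the curve_fit sketch above (so params and pcov already exist), one way to address question 3 is to take the square roots of the diagonal of the covariance matrix as approximate standard errors of the best-fit parameters:

```python
import numpy as np

# Each diagonal element of pcov is the variance of the corresponding parameter,
# so its square root is an approximate standard error -- a measure of precision.
perr = np.sqrt(np.diag(pcov))
print(f"H_max = {params[0]:.2f} +/- {perr[0]:.2f}")
print(f"k     = {params[1]:.2f} +/- {perr[1]:.2f}")
```
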
Curve fitting using SigmaPlot 8.0. Follow these steps:
1. Open SigmaPlot 8.0
2. Enter the data into the spreadsheet (our data set will be a functional response)
3. Make a graph
4. Click on the data
5. In the "Statistics" drop-down menu, choose "Regression Wizard"
6. Choose "hyperbola" in the "equation category"
7. Choose "2-parameter" in the "equation name"