Regression Models - Introduction


Regression Models - Introduction

In regression models, two types of variables are studied:
- A dependent variable, Y, also called the response variable. It is modeled as random.
- An independent variable, X, also called the predictor variable or explanatory variable. It is sometimes modeled as random and sometimes has a fixed value for each observation.

In regression models we fit a statistical model to data. We generally use regression to predict the value of one variable given the values of others. STA302/1001 week 1

Simple Linear Regression - Introduction

Simple linear regression studies the relationship between a quantitative response variable Y and a single explanatory variable X. The idea of a statistical model:

Actual observed value of Y = deterministic component + random error.

Box (a well-known statistician) claimed: "All models are wrong, but some are useful." 'Useful' means that they describe the data well and can be used for predictions and inferences. Recall: parameters are constants in a statistical model which we usually do not know but will use data to estimate.

Simple Linear Regression Models

The statistical model for simple linear regression is a straight-line model of the form

Yi = β0 + β1Xi + εi,  i = 1, …, n,

where β0 is the intercept, β1 is the slope, and εi is a random error term. We expect different values of X to produce different mean responses. In particular, for each value of X, the possible values of Y follow a distribution whose mean is β0 + β1X. Formally, E(Y | X = x) = β0 + β1x.
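The model above can be illustrated by simulation. This is a minimal sketch with made-up parameter values (β0 = 2, β1 = 0.5, σ = 1), not values from the course:

```python
# Simulate the SLR model Y_i = beta0 + beta1*X_i + eps_i with illustrative
# (made-up) parameter values. For each X, Y varies around the mean response.
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 0.5, 1.0

x = np.linspace(0.0, 10.0, 200)
eps = rng.normal(0.0, sigma, size=x.size)   # random errors with mean 0, variance sigma^2
y = beta0 + beta1 * x + eps                 # observed responses

# The deviations of Y from the mean response beta0 + beta1*X average out near 0.
mean_response = beta0 + beta1 * x
print(np.mean(y - mean_response))           # close to 0 for large n
```

Each observed Y is its mean response plus a random error, which is exactly the "deterministic component + random error" decomposition of the earlier slide.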

Estimation – Least Squares Method

Estimates of the unknown parameters β0 and β1 based on our observed data are usually denoted by b0 and b1. For each observed value xi of X, the fitted value of Y is ŷi = b0 + b1xi. This is the equation of a straight line. The deviations from the line in the vertical direction are the errors in predicting Y and are called "residuals". They are defined as ei = yi − ŷi. The estimates b0 and b1 are found by the method of least squares, which is based on minimizing the sum of squared residuals. Note, the least-squares estimates are found without making any statistical assumptions about the data.
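The least-squares computation can be sketched directly from the closed-form solution b1 = Sxy/Sxx, b0 = ȳ − b1x̄ (derived on the next slide). The data values here are made up for illustration:

```python
# Least-squares estimates b0, b1 from toy data via the closed-form solution.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.4, 5.2])

xbar, ybar = x.mean(), y.mean()
Sxy = np.sum((x - xbar) * (y - ybar))   # sum of cross-products about the means
Sxx = np.sum((x - xbar) ** 2)           # sum of squares of x about its mean

b1 = Sxy / Sxx          # slope estimate, b1 ≈ 0.77 for this toy data
b0 = ybar - b1 * xbar   # intercept estimate, b0 ≈ 1.33

y_hat = b0 + b1 * x     # fitted values
e = y - y_hat           # residuals
print(b0, b1)
```

Note that nothing in this computation uses a statistical assumption: it is pure minimization over the observed data, matching the remark above.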

Derivation of Least-Squares Estimates

Let RSS = Σ (yi − b0 − b1xi)² be the residual sum of squares. We want to find b0 and b1 that minimize RSS. Use calculus: take partial derivatives of RSS with respect to b0 and b1, set them to zero, and solve the resulting normal equations.
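Carried out in full, the standard derivation runs as follows:

```latex
\begin{aligned}
\mathrm{RSS}(b_0, b_1) &= \sum_{i=1}^{n} (y_i - b_0 - b_1 x_i)^2, \\
\frac{\partial\,\mathrm{RSS}}{\partial b_0} &= -2\sum_{i=1}^{n} (y_i - b_0 - b_1 x_i) = 0, \\
\frac{\partial\,\mathrm{RSS}}{\partial b_1} &= -2\sum_{i=1}^{n} x_i (y_i - b_0 - b_1 x_i) = 0.
\end{aligned}
```

Solving the two normal equations gives

```latex
b_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2},
\qquad
b_0 = \bar{y} - b_1 \bar{x}.
```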

Properties of Fitted Line

The least-squares fitted line satisfies the following standard properties:
1) The residuals sum to zero: Σ ei = 0.
2) The residuals are orthogonal to the predictor values: Σ xiei = 0.
3) The residuals are orthogonal to the fitted values: Σ ŷiei = 0.
4) The fitted line passes through the point (x̄, ȳ).
Note: you need to know how to prove the above properties.
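The properties can be checked numerically. This sketch uses made-up data; each printed value should be True up to floating-point tolerance:

```python
# Numerically verify the standard properties of the least-squares fitted line.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.8, 3.1, 2.9, 4.5, 5.0])

xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar
y_hat = b0 + b1 * x
e = y - y_hat

print(np.isclose(e.sum(), 0.0))               # residuals sum to zero
print(np.isclose((x * e).sum(), 0.0))         # residuals orthogonal to x
print(np.isclose((y_hat * e).sum(), 0.0))     # residuals orthogonal to fitted values
print(np.isclose(b0 + b1 * xbar, ybar))       # line passes through (xbar, ybar)
```

A numerical check like this is not a proof, but it is a quick sanity check of the algebra when practicing the derivations.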

Statistical Assumptions for SLR

Recall, the simple linear regression model is Yi = β0 + β1Xi + εi, where i = 1, …, n. The assumptions for the simple linear regression model are:
1) E(εi) = 0
2) Var(εi) = σ²
3) The εi's are uncorrelated.
These assumptions are also called the Gauss-Markov conditions. The above assumptions can be stated in terms of the Y's: E(Yi) = β0 + β1Xi, Var(Yi) = σ², and the Yi's are uncorrelated.

Gauss-Markov Theorem

The least-squares estimates are BLUE (Best Linear Unbiased Estimators). The least-squares estimates are linear in the y's; for example, b1 = Σ ki yi with ki = (xi − x̄) / Σ (xj − x̄)². Of all the possible linear, unbiased estimators of β0 and β1, the least-squares estimates have the smallest variance.
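The "smallest variance" claim can be illustrated by simulation. This sketch, with made-up parameter values, compares the least-squares slope against another linear unbiased estimator of β1 (the slope through the first and last data points):

```python
# Illustrate the Gauss-Markov theorem by simulation: the least-squares slope
# has smaller variance than a competing linear unbiased estimator of beta1.
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = np.linspace(0.0, 10.0, 20)          # fixed design points
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

b1_ls, b1_alt = [], []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, x.size)
    b1_ls.append(np.sum((x - xbar) * (y - y.mean())) / Sxx)   # least squares
    b1_alt.append((y[-1] - y[0]) / (x[-1] - x[0]))            # endpoint slope

b1_ls, b1_alt = np.array(b1_ls), np.array(b1_alt)
print(np.var(b1_ls) < np.var(b1_alt))   # least squares has smaller variance: True
```

Both estimators are linear in the y's and unbiased under the Gauss-Markov conditions, yet the least-squares slope is visibly less variable, which is exactly what the theorem guarantees.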