5.1 Basic Estimation Techniques  The relationships we develop theoretically in the text can be estimated statistically using regression analysis.  Regression analysis is a method used to determine the coefficients of a functional relationship.  For example, if demand is P = a + bQ, we need to estimate a and b.

5.2 Ordinary Least Squares (OLS)  A means of determining the regression equation that "best" fits the data  Goal is to select the line (proper intercept and slope) that minimizes the sum of the squared vertical deviations  Minimize Σe_i², which is equivalent to minimizing Σ(Y_i − Ŷ_i)²
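
A minimal sketch of the OLS calculation in Python; the price/quantity data, variable names, and numbers below are illustrative assumptions, not from the text:

```python
# Sketch: OLS intercept and slope for P = a + bQ with made-up data
import numpy as np

Q = np.array([10, 20, 30, 40, 50], dtype=float)   # quantity (independent variable)
P = np.array([95, 82, 71, 57, 44], dtype=float)   # price (dependent variable)

# Slope and intercept that minimize the sum of squared residuals
b_hat = np.sum((Q - Q.mean()) * (P - P.mean())) / np.sum((Q - Q.mean()) ** 2)
a_hat = P.mean() - b_hat * Q.mean()

P_hat = a_hat + b_hat * Q          # fitted values
residuals = P - P_hat              # e_i = Y_i - Y_hat_i
print(a_hat, b_hat, np.sum(residuals ** 2))
```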

5.3 Standard Error of the Estimate  Measures the variability of the data about the regression equation  Labeled SEE  If SEE = 0, all points lie on the line and the fit is perfect
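
The slide itself gives no formula; the usual definition, with n observations and K slope coefficients (a degrees-of-freedom convention assumed here to match slide 5.23), is:

```latex
SEE = \sqrt{\frac{\sum_{i=1}^{n} e_i^2}{n - K - 1}}
    = \sqrt{\frac{\sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2}{n - K - 1}}
```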

5.4 Standard Error of the Slope  Measures the theoretical variability in the estimated slope - different datasets (samples) would yield different slopes
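
For the simple one-regressor case (an assumption; the slide does not specify the model), a standard expression for this standard error in terms of the SEE above is:

```latex
s_{\hat{b}} = \frac{SEE}{\sqrt{\sum_{i=1}^{n} (X_i - \bar{X})^2}}
```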

5.5 Variability in the Dependent Variable  The sum of squared deviations of Y about its mean value represents the total variation in Y - the total sum of squares (TSS)

5.6 Variability in the Dependent Variable  The sum of squared deviations of Y about the regression line (Ŷ) represents the "unexplained" or residual variation in Y - the residual sum of squares (RSS)

5.7 Variability in the Dependent Variable  The sum of squared deviations of Ŷ about Ȳ represents the "explained" variation in Y - the explained sum of squares (ESS)

5.8 Variability in the Dependent Variable  Note that TSS = ESS + RSS  If all data points lie on the regression line, RSS = 0 and TSS = ESS  If the regression line is horizontal (slope = 0), ESS = 0 and TSS = RSS  The better the regression line fits the data, the smaller RSS is
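
In symbols, following slides 5.5-5.7 (with Ȳ the sample mean of Y and Ŷ_i the fitted value):

```latex
TSS = \sum_{i}(Y_i - \bar{Y})^2, \qquad
RSS = \sum_{i}(Y_i - \hat{Y}_i)^2, \qquad
ESS = \sum_{i}(\hat{Y}_i - \bar{Y})^2, \qquad
TSS = ESS + RSS
```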

5.9 Describing Overall Fit - R²  The coefficient of determination is the ratio of the "explained" sum of squares to the total sum of squares

5.10 Coefficient of Determination  R² gives the proportion of the variability in Y that is explained by the regression equation  It ranges between 0 and 1  What is true if R² = 1?  What is true if R² = 0?
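
In terms of the sums of squares defined above:

```latex
R^2 = \frac{ESS}{TSS} = 1 - \frac{RSS}{TSS}
```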

5.11 Statistical Inference  Drawing conclusions about the population based on sample information.  Hypothesis testing – Which independent variables are significant? – Is the model significant?  Estimation - point versus interval – What is the rate of change in Y per unit change in X? – What is the expected value of Y given X?

5.12 Errors in Hypothesis Testing  Type I error - rejecting the null hypothesis when it is true  Type II error - accepting the null hypothesis when it is false  We can never eliminate the possibility of these errors, but we can control their likelihood

5.13 Structuring the Null and Alternative Hypotheses  The null hypothesis is often the reverse of what theory or logic suggests the researcher believes; it is structured to allow the data to contradict it. In the model of the effect of price on quantity demanded, the researcher would expect price to inversely affect the amount purchased. Thus, the null might be that price does not affect quantity demanded or affects it in a positive direction.

5.14 Structuring the Null and Alternative Hypotheses  Model: Q_A = B_0 + B_1 P_A + B_2 Inc + B_3 P_B + ε  – Q_A = quantity demanded of good A  – P_A = price of good A  – Inc = income  – P_B = price of good B  H0: B_1 ≥ 0  HA: B_1 < 0 (the Law of Demand expectation)
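
A sketch of how a model of this form could be estimated and the one-tailed test on B_1 carried out in Python; the data file, column names, and the use of statsmodels are assumptions for illustration, not part of the text:

```python
# Hypothetical sketch: estimate Q_A = B0 + B1*P_A + B2*Inc + B3*P_B + e
# and test H0: B1 >= 0 against HA: B1 < 0 (one-tailed, 5% level)
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("demand_data.csv")          # hypothetical file with columns Q_A, P_A, Inc, P_B
X = sm.add_constant(df[["P_A", "Inc", "P_B"]])
model = sm.OLS(df["Q_A"], X).fit()

t_b1 = model.tvalues["P_A"]                  # t-statistic for the price coefficient
df_resid = int(model.df_resid)               # n - K - 1
t_crit = stats.t.ppf(0.05, df_resid)         # lower-tail critical value (negative)

# Reject H0: B1 >= 0 only if the t-statistic is sufficiently negative
reject = t_b1 < t_crit
print(model.params, t_b1, t_crit, reject)
```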

5.15 H0: β_1 = 0  [Figure: two-tailed test - rejection regions of area α/2 in each tail of the t-distribution, do-not-reject region in the middle]

5.16 H0: β_1 ≥ 0  [Figure: one-tailed test - rejection region of area α in the lower (left) tail]

5.17 H0: β_1 ≤ 0  [Figure: one-tailed test - rejection region of area α in the upper (right) tail]

5.18 The t-Test for the Slope  We can test the significance of an independent variable by testing the following: H0: β_k = 0, for k = 1, 2, …, K; HA: β_k ≠ 0  Note that if β_k = 0, a change in the kth independent variable has no impact on Y

5.19 The t-Test for the Slope  The test statistic is the estimated coefficient divided by its standard error: t_k = β̂_k / s(β̂_k)

5.20 t-Test Decision Rule  The critical t-value, t_c, defines the boundary separating the rejection region from the do-not-reject region.  For a two-tailed test: if |t_k| > t_c, reject the null; otherwise do not reject  For a one-tailed test: if |t_k| > t_c and t_k has the sign implied by HA, reject the null; otherwise do not reject
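
A minimal sketch of the two-tailed decision rule; the t-statistic, significance level, and degrees of freedom are illustrative assumptions:

```python
# Two-tailed t-test decision rule at significance level alpha
from scipy import stats

t_k = 2.45        # hypothetical t-statistic for the k-th slope
alpha = 0.05      # significance level
df_resid = 26     # hypothetical residual degrees of freedom (n - K - 1)

t_c = stats.t.ppf(1 - alpha / 2, df_resid)   # critical value, upper alpha/2 tail
reject_null = abs(t_k) > t_c
print(t_c, reject_null)
```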

5.21 F-Test and ANOVA  The F-test is used to test the overall significance of the regression (model)  ANOVA = analysis of variance  ANOVA is based on the components of the variation in Y previously discussed - TSS, ESS, and RSS

5.22 ANOVA Table  The ANOVA table organizes these components (with n observations and K slope coefficients): – Explained (regression): sum of squares ESS, degrees of freedom K, mean square ESS/K – Residual (error): sum of squares RSS, degrees of freedom n - K - 1, mean square RSS/(n - K - 1) – Total: sum of squares TSS, degrees of freedom n - 1

5.23 F-Statistic  The F-statistic is the ratio of the explained mean square to the residual mean square: F = (ESS / K) / (RSS / (n - K - 1))

5.24 Hypotheses for the F-Test  H0: β_1 = β_2 = … = β_K = 0  HA: H0 is not true  Note that the null states that all slopes are simultaneously zero, i.e., that no independent variable is significant and the model would NOT be significant

5.25 Decision Rule for the F-Test  If F > F_c, reject the null that the model is insignificant - note that this is likely to be good news, since your model appears "good"  Otherwise, do not reject
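
A small sketch of the F-test decision rule built from the sums of squares; all numbers are hypothetical:

```python
# F-test for overall significance, using hypothetical sums of squares
from scipy import stats

ESS, RSS = 480.0, 120.0   # hypothetical explained and residual sums of squares
K = 3                     # number of slope coefficients
n = 30                    # number of observations

F = (ESS / K) / (RSS / (n - K - 1))
F_c = stats.f.ppf(0.95, K, n - K - 1)   # 5% critical value with (K, n-K-1) df
print(F, F_c, F > F_c)
```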

5.26 Illustration 5.3, pages 174-75

5.27 [Exhibit from Illustration 5.3: results for San Mateo and Santa Barbara]

5.28 Log-Linear Model  Constant percentage change in the dependent variable in response to a 1 percent change in an independent variable  The direction of the relationship does not change

5.29 Double-Log Model  Taking logs of the exponential equation yields an equation that is linear in the logs
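
The exponential form and its logged version appeared as images in the original slide; a reconstruction consistent with slide 5.30 (which refers to regressors X and Z and coefficients b̂ and ĉ) would be:

```latex
Y = a X^{b} Z^{c}
\quad\Longrightarrow\quad
\ln Y = \ln a + b \ln X + c \ln Z
```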

5.30 Elasticity for the Double-Log Model  The elasticity of Y with respect to X or Z for a double-log model is simply the corresponding regression coefficient, b̂ or ĉ  Thus, in a double-log model the elasticities are constant and equal to the estimated regression coefficients (partial slopes).
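
A sketch of estimating a double-log demand equation so the slope coefficients can be read directly as elasticities; the data file and column names are hypothetical assumptions:

```python
# Double-log (constant-elasticity) demand: ln Q = ln a + b*ln P + c*ln Inc
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("demand_data.csv")           # hypothetical columns: Q, P, Inc
logs = np.log(df[["Q", "P", "Inc"]])          # take logs of all variables

X = sm.add_constant(logs[["P", "Inc"]])
model = sm.OLS(logs["Q"], X).fit()

# In the double-log form, each slope is itself an elasticity
price_elasticity = model.params["P"]
income_elasticity = model.params["Inc"]
print(price_elasticity, income_elasticity)
```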