Lecture 5 HSPM J716

Assignment 4: answer checker. Go over results.

Regression results AFDCUP% is the dependent variable. The table reports a coefficient, standard error, t-statistic, and p-value for the Intercept and for UE_RATE, UI_AVG, INCOME, HIGH%, NEED, and PAYMENT.

Simple correlations Correlation matrix of AFDCUP%, UE_RATE, UI_AVG, INCOME, HIGH%, NEED, and PAYMENT (each variable's correlation with itself is 1.0).

Multiple regression coefficients and multicollinearity If Z = aX + b, so that Z is an exact linear function of X, the numerator and the denominator of the coefficient formula both become 0.

If Z = aX + b, the standard error of the coefficient → ∞ and the t-value → 0.
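A minimal numerical sketch of this point (the data and variable names below are made up for illustration): when Z is an exact linear function of X, the normal-equations matrix X'X is singular, so least squares cannot separate the two coefficients.

```python
import numpy as np

# Made-up data: Z is an exact linear function of X (Z = 2*X + 3)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
z = 2.0 * x + 3.0
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix: intercept column, X, and the perfectly collinear Z
X = np.column_stack([np.ones_like(x), x, z])

# The normal-equations matrix X'X is singular (rank-deficient),
# so the ordinary least-squares coefficients are undefined.
print(np.linalg.det(X.T @ X))    # ~0, up to rounding error
print(np.linalg.matrix_rank(X))  # 2, not 3
```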

F-test
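The slide gives only the heading; for reference, one common form of the statistic (stated here as background, not necessarily the exact notation used in the course) tests whether q coefficients are jointly zero by comparing the restricted and unrestricted sums of squared residuals, with n observations and k slope coefficients in the unrestricted model:

```latex
F = \frac{(SSR_{r} - SSR_{ur}) / q}{SSR_{ur} / (n - k - 1)}
```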

Prediction Total is the prediction; the 90% confidence interval runs from the lower to the upper limit. The predicted number of families in SC = 1,700,000 × (predicted value) × 0.01. The top end of the 90% confidence interval = 1,700,000 × (upper confidence limit) × 0.01.
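A sketch of the arithmetic (the predicted value and upper confidence limit below are hypothetical placeholders, not the slide's numbers):

```python
SC_FAMILIES = 1_700_000   # number of families in SC (from the slide)

predicted_pct = 5.0       # hypothetical predicted AFDCUP% value
upper_90_pct = 6.5        # hypothetical upper end of the 90% confidence interval

# Multiply by 0.01 to convert a percentage of families into a count of families
predicted_families = SC_FAMILIES * predicted_pct * 0.01
upper_90_families = SC_FAMILIES * upper_90_pct * 0.01

print(predicted_families, upper_90_families)
```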

Heteroskedasticity (heh'-teh-ro-ske-das'-ti-si-tee) Violates assumption 3, that the errors all have the same variance. Ordinary least squares is then not your best choice: some observations deserve more weight than others, or you need a non-linear model.
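One standard way to give some observations more weight is weighted least squares, which weights each observation inversely to its error standard deviation. A minimal sketch under made-up data where the error spread grows with X (numpy only; not the course's own example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: the error spread grows with X (heteroskedasticity)
x = np.linspace(1.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3 * x)  # error sd proportional to x

X = np.column_stack([np.ones_like(x), x])

# Ordinary least squares: every observation weighted equally
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares: divide each observation by its error standard
# deviation (here 0.3*x), then run ordinary least squares on the result
w = 1.0 / (0.3 * x)
b_wls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)

print("OLS:", b_ols)  # unbiased but not the most precise
print("WLS:", b_wls)  # puts more weight on the low-variance observations
```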

Nonlinear models: adapt linear least squares by transforming variables. Example: Y = e^X is made linear by making the new Y the logarithm of the old Y.

e ≈ 2.71828…, the limit of (1 + 1/n)^n as n gets larger and larger.
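A quick numerical check of that limit (a minimal sketch, nothing course-specific):

```python
import math

for n in (1, 10, 100, 10_000, 1_000_000):
    print(n, (1 + 1 / n) ** n)  # approaches e = 2.71828... as n grows

print(math.e)
```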

Logarithms and e: e^0 = 1 and ln(1) = 0; e^a · e^b = e^(a+b) and ln(ab) = ln(a) + ln(b); (e^b)^a = e^(ba) and ln(b^a) = a·ln(b).

Transform variables to make the equation linear:
Y = A e^(bX) u
ln(Y) = ln(A e^(bX) u)
ln(Y) = ln(A) + ln(e^(bX)) + ln(u)
ln(Y) = ln(A) + b ln(e^X) + ln(u)
ln(Y) = ln(A) + bX + ln(u)

Transform variables to make the equation linear:
Y = A X^b u
ln(Y) = ln(A X^b u)
ln(Y) = ln(A) + ln(X^b) + ln(u)
ln(Y) = ln(A) + b ln(X) + ln(u)

Logarithmic models

Constant growth rate model (continuous compounding): Y = A e^(bX) u. Y is growing (or shrinking) at a constant relative rate of b. Linear form: ln(Y) = ln(A) + bX + ln(u), where u is random error such that ln(u) conforms to the assumptions.

Constant elasticity model: Y = A X^b u. The elasticity of Y with respect to X is a constant, b: when X changes by 1%, Y changes by b%. Linear form: ln(Y) = ln(A) + b ln(X) + ln(u), where u is random error such that ln(u) conforms to the assumptions.
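A minimal sketch of how such a model is estimated in practice (made-up data, numpy only): transform the variables, run ordinary least squares on the transformed equation, read b directly, and recover A as e raised to the estimated intercept.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(1.0, 20.0, 100)

# Made-up data from a constant-elasticity model: Y = A * X**b * u
A_true, b_true = 3.0, 0.7
y = A_true * x**b_true * np.exp(rng.normal(scale=0.1, size=x.size))

# Linear form: ln(Y) = ln(A) + b*ln(X) + ln(u)
X_design = np.column_stack([np.ones_like(x), np.log(x)])
coef, *_ = np.linalg.lstsq(X_design, np.log(y), rcond=None)

A_hat = np.exp(coef[0])  # recover A from the estimated intercept ln(A)
b_hat = coef[1]          # elasticity: a 1% change in X gives about a b% change in Y
print(A_hat, b_hat)

# The constant-growth-rate model Y = A * e**(b*X) * u is handled the same way,
# except that ln(Y) is regressed on X itself rather than on ln(X).
```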

In these models the error term multiplies rather than adds, so we must assume that the errors' mean is 1.

Demos: linear function demo and power function demo.

Assignment 5