Lecture 5 HSPM J716

Assignment 4: answer checker; go over results.

Multiple regression coefficients and multicollinearity: if Z = aX + b exactly, the denominator in the coefficient formula becomes 0, so separate coefficients for X and Z cannot be calculated.
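As a concrete illustration (my own, with made-up data, not from the slides), the numpy sketch below builds a regressor Z that is an exact linear function of X and shows that the determinant in the least-squares formula, which plays the role of the "denominator" above, collapses to zero.

```python
# Minimal sketch: perfect collinearity makes the OLS "denominator" zero.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=20)
Z = 3.0 * X + 2.0                        # Z is an exact linear function of X
Y = 1.0 + 0.5 * X + rng.normal(size=20)  # invented outcome

# Design matrix with a constant, X, and Z
D = np.column_stack([np.ones_like(X), X, Z])

# The OLS formula needs (D'D)^-1; with perfect collinearity D'D is singular,
# so its determinant is zero up to rounding error.
print(np.linalg.det(D.T @ D))
```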

F-test

Heteroskedasticity (heh'-teh-ro - ske-das' - ti'-si-tee) violates assumption 3, that the errors all have the same variance. When it is present, ordinary least squares is not your best choice: either some observations deserve more weight than others, or you need a non-linear model.
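The sketch below (an assumed example, not the lecture's) shows one simple way of giving the quieter observations more weight: if the error standard deviation is taken to be proportional to X, dividing each observation by X and then running ordinary least squares on the transformed data is a basic weighted least squares.

```python
# Weighted least squares sketch for errors whose scale grows with X.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(1.0, 10.0, size=50)
Y = 2.0 + 3.0 * X + rng.normal(scale=X)      # larger X -> noisier Y (assumed)

# Ordinary least squares: every observation gets equal weight.
D = np.column_stack([np.ones_like(X), X])
b_ols, *_ = np.linalg.lstsq(D, Y, rcond=None)

# Weighted least squares: divide both sides by X (the assumed error scale),
# which gives the less noisy observations more influence.
b_wls, *_ = np.linalg.lstsq(D / X[:, None], Y / X, rcond=None)

print(b_ols, b_wls)   # both estimate (2, 3); WLS should be the more precise one
```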

Nonlinear models: adapt linear least squares by transforming variables. For example, Y = e^X is made linear by taking the new Y to be the log of the old Y.

e ≈ 2.71828, the limit of (1 + 1/n)^n as n gets larger and larger.
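A quick numerical check of this definition (my own addition, not part of the slides):

```python
# (1 + 1/n)**n approaches e = 2.71828... as n grows.
import math

for n in (1, 10, 100, 10_000, 1_000_000):
    print(n, (1 + 1 / n) ** n)
print("e =", math.e)
```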

Logarithms and e:
e^0 = 1 and ln(1) = 0
e^a e^b = e^(a+b) and ln(ab) = ln(a) + ln(b)
(e^b)^a = e^(ba) and ln(b^a) = a ln(b)
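These rules can be checked numerically; the snippet below (not from the slides) verifies each identity for the arbitrary values a = 2 and b = 3.

```python
# Numeric check of the exponential/logarithm rules listed above.
import math

a, b = 2.0, 3.0
print(math.exp(0), math.log(1))                     # e^0 = 1,  ln(1) = 0
print(math.exp(a) * math.exp(b), math.exp(a + b))   # e^a e^b = e^(a+b)
print(math.log(a * b), math.log(a) + math.log(b))   # ln(ab) = ln(a) + ln(b)
print(math.exp(b) ** a, math.exp(b * a))            # (e^b)^a = e^(ba)
print(math.log(b ** a), a * math.log(b))            # ln(b^a) = a ln(b)
```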

Transform variables to make the equation linear:
Y = A e^(bX) u
ln(Y) = ln(A e^(bX) u)
ln(Y) = ln(A) + ln(e^(bX)) + ln(u)
ln(Y) = ln(A) + b ln(e^X) + ln(u)
ln(Y) = ln(A) + bX + ln(u)
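A minimal sketch of using this transformation in practice, with invented data and numpy's polyfit standing in for whatever regression routine is used in class:

```python
# Fit Y = A e^(bX) u by regressing ln(Y) on X.
import numpy as np

rng = np.random.default_rng(2)
X = np.linspace(0, 5, 40)
A_true, b_true = 2.0, 0.4                      # assumed "true" values
Y = A_true * np.exp(b_true * X) * np.exp(rng.normal(scale=0.1, size=X.size))

# Transformed model: ln(Y) = ln(A) + b*X + ln(u), then ordinary least squares.
slope, intercept = np.polyfit(X, np.log(Y), 1)
print("b =", slope)                # estimates b_true = 0.4
print("A =", np.exp(intercept))    # estimates A_true = 2.0
```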

Transform variables to make the equation linear:
Y = A X^b u
ln(Y) = ln(A X^b u)
ln(Y) = ln(A) + ln(X^b) + ln(u)
ln(Y) = ln(A) + b ln(X) + ln(u)
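The same idea for the power form, again with invented data:

```python
# Fit Y = A X^b u by regressing ln(Y) on ln(X).
import numpy as np

rng = np.random.default_rng(3)
X = np.linspace(1, 20, 40)
A_true, b_true = 5.0, -0.7                     # assumed "true" values
Y = A_true * X ** b_true * np.exp(rng.normal(scale=0.1, size=X.size))

# Transformed model: ln(Y) = ln(A) + b*ln(X) + ln(u), then ordinary least squares.
slope, intercept = np.polyfit(np.log(X), np.log(Y), 1)
print("elasticity b =", slope)     # estimates b_true = -0.7
print("A =", np.exp(intercept))    # estimates A_true = 5.0
```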

Logarithmic models

Constant growth rate model: Y = A e^(bX) u. This is the continuous-compounding form: Y is growing (or shrinking) at a constant relative rate b. Linear form: ln(Y) = ln(A) + bX + ln(u), where u is random error such that ln(u) conforms to the assumptions.

Constant elasticity model: Y = A X^b u. The elasticity of Y with respect to X is the constant b: when X changes by 1%, Y changes by b%. Linear form: ln(Y) = ln(A) + b ln(X) + ln(u), where u is random error such that ln(u) conforms to the assumptions.
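A small worked check of the elasticity reading (my own illustration, not from the slides): with b = -0.7, raising X by 1% should lower Y by about 0.7%.

```python
# Elasticity check for Y = A X^b with assumed A = 5 and b = -0.7.
A, b = 5.0, -0.7
X0, X1 = 100.0, 101.0                 # X rises by 1%
Y0, Y1 = A * X0 ** b, A * X1 ** b
print(100 * (Y1 / Y0 - 1))            # roughly -0.7 (percent change in Y)
```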

In both models the error term multiplies rather than adds, so we must assume that the errors' mean is 1 (so that ln(u) is centered near zero and the usual assumptions apply to the transformed equation).

Demos: linear function demo; power function demo.

Assignment 5