Structural Variations

Presentation transcript:

Structural Variations

Interactions – When the effect of one explanatory variable on the dependent variable depends on the value of another explanatory variable. The "trick": introduce the product of the two as a new artificial explanatory variable. (Example: Day 3)

Nonlinearities – When the impact of an explanatory variable on the dependent variable "bends". The "trick": introduce the square of that variable as a new artificial explanatory variable. (Example: Day 4)

Interactions: Summary

When the effect (i.e., the coefficient) of one explanatory variable on the dependent variable depends on the value of another explanatory variable:
– Signaled only by judgment.
– The "trick": introduce the product of the two as a new artificial explanatory variable. After the regression, interpret in the original "conceptual" model.
– For example, Cost = a + (b₁ + b₂·Age)·Mileage + … (rest of model).
– The latter explanatory variable (in the example, Age) might or might not remain in the model on its own.
– Cost: we lose a meaningful interpretation of the beta-weights.
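The following minimal sketch (not from the slides) illustrates the trick on synthetic motorpool-style data: the product Age·Mileage is added as an artificial explanatory variable, and the fitted coefficients are then read back through the conceptual model Cost = a + (b₁ + b₂·Age)·Mileage. Every name and number in it is invented for illustration.

# Sketch only: interaction term on synthetic data (all numbers made up).
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(1, 10, n)        # hypothetical vehicle age, years
mileage = rng.uniform(5, 30, n)    # hypothetical annual mileage, thousands
# Assumed "true" model: Cost = 100 + (20 + 5*Age)*Mileage + noise
cost = 100 + (20 + 5 * age) * mileage + rng.normal(0, 25, n)

# The "trick": add the product Age*Mileage as a new artificial explanatory
# variable. (Age itself is not kept here, matching the note above that the
# latter variable might or might not remain in the model.)
X = np.column_stack([np.ones(n), mileage, age * mileage])
coef, *_ = np.linalg.lstsq(X, cost, rcond=None)
a, b1, b2 = coef

# Interpret in the original "conceptual" model: Cost = a + (b1 + b2*Age)*Mileage
print(f"Estimated Mileage effect at Age 3: {b1 + b2 * 3:.2f}")
print(f"Estimated Mileage effect at Age 8: {b1 + b2 * 8:.2f}")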

Nonlinearity: Summary

When the direct relationship between an explanatory variable and the dependent variable "bends":
– Signaled by a "U" in a plot of the residuals against an explanatory variable.
– The "trick": introduce the square of that variable as a new artificial explanatory variable: Y = a + bX + cX² + … (rest of model).
– This one trick can capture six different nonlinear "shapes".
– Always keep the original variable: the linear term, with coefficient b, allows the parabola to take any horizontal position.
– The sign of c gives the direction of the bend (positive = upward-bending parabola, negative = downward-bending).
– -b/(2c) indicates where the vertex (either the maximum or the minimum) of the parabola occurs.
– Cost: we lose a meaningful interpretation of the beta-weights.
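A companion sketch, again on invented data, shows the squared-term trick and the vertex formula -b/(2c) in action.

# Sketch only: capturing a "bend" with a squared term, then locating the
# vertex at -b/(2c). Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 10, n)
# Assumed "true" relationship: downward-bending parabola peaking at x = 6
y = 50 + 12 * x - 1.0 * x**2 + rng.normal(0, 4, n)

# Keep the original (linear) term AND add its square as an artificial variable.
X = np.column_stack([np.ones(n), x, x**2])
a, b, c = np.linalg.lstsq(X, y, rcond=None)[0]

print(f"c = {c:.3f}  (negative, so the fitted parabola bends downward)")
print(f"Vertex (a maximum here) at x = {-b / (2 * c):.2f}")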

Examples from the Sample Exams: Caligula's Castle

Regression of Revenue on constant, Age, Age², Sex, Direct, Indirect, and Sex·Indirect, giving a fitted prediction equation of the form

Revenue_pred = a + b₁·Age − b₂·Age² − b₃·Sex + b₄·Direct + (b₅ + b₆·Sex)·Indirect

Revenue per $ of incentive:
                 direct    indirect
Men (Sex = 0)    $1.99     $0.85
Women (Sex = 1)  $1.99     $2.29

The Age effect on Revenue is greatest at Age = −(62.37)/(2·( )) years, the vertex −b/(2c) of the Age parabola.
– Give direct incentives (house chips, etc.) to men.
– Give indirect incentives (flowers, meals) to women.
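As a hedged illustration of how the incentive table follows from the interaction term: the per-dollar figures below are taken from the table itself, while the Age² coefficient is left as an unknown placeholder because it does not appear in this transcript.

# Sketch only: reproducing the incentive table from the interaction logic.
b_direct   = 1.99          # $ of Revenue per $1 of direct incentive (both sexes)
b_indirect = 0.85          # $ per $1 of indirect incentive when Sex = 0 (men)
b_sex_ind  = 2.29 - 0.85   # Sex x Indirect coefficient implied by the table
b_age      = 62.37         # Age coefficient, as quoted on the slide
b_age2     = None          # Age^2 coefficient (negative); not shown here

for sex, label in [(0, "Men"), (1, "Women")]:
    indirect_effect = b_indirect + b_sex_ind * sex
    print(f"{label}: ${b_direct:.2f} per $1 direct, ${indirect_effect:.2f} per $1 indirect")

if b_age2 is not None:
    # Vertex of the Age parabola: Revenue-maximizing Age = -b_age / (2 * b_age2)
    print(f"Age effect greatest at Age = {-b_age / (2 * b_age2):.1f} years")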

Examples from the Sample Exams: Hans and Franz

Regression of CustSat on constant, Wait, Wait², Size, Franz?, and Size·Franz?, giving a fitted prediction equation of the form

CustSat_pred = a − b₁·Wait − b₂·Wait² − b₃·Size + (b₄ + b₅·Size)·Franz?

Hans and Franz: set Franz? = 0 (assign Hans) when the party size is ≤ ( ). Customers' anger grows more quickly the longer they wait (reflected in the negative Wait² term).
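The assignment rule can be sketched the same way. Since none of the fitted coefficients are legible in this transcript, the two values below are purely hypothetical placeholders chosen only to show the logic.

# Sketch only: the Hans-vs-Franz assignment rule implied by the Franz? and
# Size x Franz? terms. Both coefficients below are hypothetical placeholders.
b_franz      = -6.0   # hypothetical: Franz hurts CustSat for the smallest parties
b_size_franz = 1.5    # hypothetical: his advantage grows with party size

def franz_advantage(size):
    """Predicted CustSat gain from assigning Franz (Franz? = 1) instead of Hans."""
    return b_franz + b_size_franz * size

# Assign Hans (Franz? = 0) whenever Franz's predicted advantage is not positive.
for size in range(1, 9):
    choice = "Franz" if franz_advantage(size) > 0 else "Hans"
    print(f"party of {size}: assign {choice}")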