Lecture 27 Chapter 20.3: Nominal Variables
HW6 due by 5 p.m. Wednesday. Office hour today after class. Extra office hour Wednesday from 9-10.
Final Exam: May 1st, 4-6 p.m., SHDH 351. A practice exam will be posted tomorrow.

20.3 Nominal Independent Variables
In many real-life situations one or more independent variables are nominal. Including nominal variables in a regression model is done via indicator (or dummy) variables. An indicator variable (I) can assume one of two values, "zero" or "one". For example:
I = 1 if the data were collected before a given date, 0 if the data were collected after it.
I = 1 if the temperature was below 50°, 0 if the temperature was 50° or more.
I = 1 if a degree earned is in Finance, 0 if a degree earned is not in Finance.
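As a minimal illustration (not part of the lecture), the last indicator above could be coded in Python with pandas; the data frame and values here are made up:

```python
import pandas as pd

# Made-up data: degree field for five graduates (hypothetical values)
grads = pd.DataFrame({"degree": ["Finance", "Marketing", "Finance", "Accounting", "Finance"]})

# Indicator variable: I = 1 if the degree earned is in Finance, 0 otherwise
grads["I"] = (grads["degree"] == "Finance").astype(int)
print(grads)
```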

Nominal Independent Variables; Example: Auction Car Price (II)
Example revised (Xm18-02a):
Recall: A car dealer wants to predict the auction price of a car.
The dealer now believes that odometer reading and car color are variables that affect a car's price.
Three color categories are considered: White, Silver, Other colors.
Note: Color is a nominal variable.

Nominal Independent Variables; Example: Auction Car Price (II)
Example revised (Xm18-02b):
I1 = 1 if the color is white, 0 if the color is not white.
I2 = 1 if the color is silver, 0 if the color is not silver.
The category "Other colors" is defined by: I1 = 0 and I2 = 0.
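A hedged sketch of this two-indicator coding in Python (the Color values below are made up, not the Xm18-02b data):

```python
import pandas as pd

# Hypothetical color data for a few cars
cars = pd.DataFrame({"Color": ["White", "Silver", "Other", "White", "Other", "Silver"]})

cars["I1"] = (cars["Color"] == "White").astype(int)   # 1 if white, 0 otherwise
cars["I2"] = (cars["Color"] == "Silver").astype(int)  # 1 if silver, 0 otherwise
# "Other colors" is the baseline category: I1 = 0 and I2 = 0
print(cars)
```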

How Many Indicator Variables?
Note: To represent the situation of three possible colors we need only two indicator variables.
Conclusion: To represent a nominal variable with m possible categories, we must create m - 1 indicator variables.
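The same m - 1 rule is what pandas applies when asked to drop one dummy column; a small sketch using this example's color labels:

```python
import pandas as pd

colors = pd.Series(["White", "Silver", "Other", "White", "Silver"], name="Color")

# m = 3 categories -> m - 1 = 2 indicator columns; the dropped level
# ("Other", first alphabetically) becomes the baseline category
dummies = pd.get_dummies(colors, drop_first=True).astype(int)
print(dummies)
```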

Nominal Independent Variables; Example: Auction Car Price
Solution: the proposed model is
y = β0 + β1(Odometer) + β2 I1 + β3 I2 + ε
The data: [scatterplot of Price vs. Odometer, with white, silver, and other-color cars plotted separately]
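A minimal sketch of fitting this model in Python with statsmodels rather than JMP; the data below are simulated with arbitrary coefficients purely so the code runs, and are not the Xm18-02b values:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data (arbitrary coefficients, NOT the lecture's estimates)
rng = np.random.default_rng(1)
n = 100
odometer = rng.uniform(20000, 50000, n)
color = rng.choice(["White", "Silver", "Other"], n)
I1 = (color == "White").astype(int)
I2 = (color == "Silver").astype(int)
price = 15000 - 0.05 * odometer + 100 * I1 + 250 * I2 + rng.normal(0, 150, n)
cars = pd.DataFrame({"Price": price, "Odometer": odometer, "I1": I1, "I2": I2})

# Fit y = beta0 + beta1*Odometer + beta2*I1 + beta3*I2 + epsilon
fit = smf.ols("Price ~ Odometer + I1 + I2", data=cars).fit()
print(fit.summary())  # the output includes a t-test for each coefficient
```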

Example: Auction Car Price; The Regression Equation
From JMP (Xm18-02b) we get the regression equation
PRICE = b0 - 0.0555(Odometer) + 90.48 I1 + b3 I2,
where b0 (the intercept) and b3 (the silver-color coefficient) are taken from the JMP output. Substituting the indicator values gives one line in the Price vs. Odometer plot for each color category:
The equation for a white car (I1 = 1, I2 = 0): PRICE = (b0 + 90.48) - 0.0555(Odometer).
The equation for a silver car (I1 = 0, I2 = 1): PRICE = (b0 + b3) - 0.0555(Odometer).
The equation for an "other color" car (I1 = 0, I2 = 0): PRICE = b0 - 0.0555(Odometer).

Example: Auction Car Price; The Regression Equation
From JMP we get the regression equation
PRICE = b0 - 0.0555(Odometer) + 90.48 I1 + b3 I2.
A white car sells, on average, for $90.48 more than a car of the "Other color" category.
A silver color car sells, on average, for b3 dollars more than a car of the "Other color" category.
For one additional mile the auction price decreases by 5.55 cents.

Comprehension Question
From JMP we get the regression equation PRICE = b0 - 0.0555(Odometer) + 90.48 I1 + b3 I2.
Consider two cars, one white and one silver, with the same number of miles. How much more, on average, does the silver car sell for than the white car?
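A sketch of the reasoning, written with the symbolic coefficients used above (b3 is the silver-color coefficient, whose numerical value is not shown above):

```latex
\begin{aligned}
\widehat{\mathrm{PRICE}}_{\text{silver}} - \widehat{\mathrm{PRICE}}_{\text{white}}
  &= \bigl(b_0 - 0.0555\,\mathrm{Odometer} + b_3\bigr)
   - \bigl(b_0 - 0.0555\,\mathrm{Odometer} + 90.48\bigr) \\
  &= b_3 - 90.48
\end{aligned}
```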

Example: Auction Car Price; The Regression Equation (Xm18-02b)
There is insufficient evidence to infer that a white color car and a car of "other color" sell for different auction prices.
There is sufficient evidence to infer that a silver color car sells for a larger price than a car of the "other color" category.

Nominal Independent Variables; Example: MBA Program Admission (MBA II)
Recall: The dean wanted to evaluate applications for the MBA program by predicting the future performance of the applicants. The following three predictors were suggested:
Undergraduate GPA
GMAT score
Years of work experience
It is now believed that the type of undergraduate degree should be included in the model.
Note: The undergraduate degree is nominal data.

Nominal Independent Variables; Example: MBA Program Admission (II)
I1 = 1 if B.A., 0 otherwise.
I2 = 1 if B.B.A., 0 otherwise.
I3 = 1 if B.Sc. or B.Eng., 0 otherwise.
The category "Other group" is defined by: I1 = 0, I2 = 0, and I3 = 0.
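A minimal sketch of how these three indicators could be built and used in a regression in Python; the data frame below is a made-up stand-in (column names, values, and the "BScEng" label are assumptions, not the lecture's MBA II file):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Made-up stand-in data, not the MBA II file
mba = pd.DataFrame({
    "MBA_GPA":  [3.4, 3.9, 3.1, 3.7, 3.5, 3.2, 3.8, 3.0, 3.6, 3.3, 3.9, 3.1],
    "UnderGPA": [3.0, 3.8, 2.9, 3.6, 3.3, 3.1, 3.7, 2.8, 3.5, 3.2, 3.9, 3.0],
    "GMAT":     [620, 710, 580, 690, 650, 600, 700, 560, 680, 630, 720, 590],
    "Work":     [3, 5, 2, 6, 4, 3, 5, 2, 4, 3, 6, 2],
    "Degree":   ["BA", "BBA", "Other", "BScEng", "BA", "BBA",
                 "BScEng", "Other", "BA", "BBA", "BScEng", "Other"],
})

# Indicator variables as defined on the slide; "Other group" is the baseline
mba["I1"] = (mba["Degree"] == "BA").astype(int)
mba["I2"] = (mba["Degree"] == "BBA").astype(int)
mba["I3"] = (mba["Degree"] == "BScEng").astype(int)

fit = smf.ols("MBA_GPA ~ UnderGPA + GMAT + Work + I1 + I2 + I3", data=mba).fit()
print(fit.params)
```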

MBA Program Admission (II)

Practice Problems: 20.6, 20.8, 20.22, 20.24