MATH 3359 Introduction to Mathematical Modeling Linear System, Simple Linear Regression

Outline
Linear System
  Solve a linear system
  Compute the inverse matrix
  Compute eigenvalues and eigenvectors
Simple Linear Regression
  Make scatter plots of the data
  Fit a linear regression model
  Prediction

Linear System
  3x1 + x2 - 6x3 = -10
  2x1 + x2 - 5x3 = -8
  6x1 - 3x2 + x3 = 0
In matrix form, Ax = b:
  [ 3  1 -6 ] [x1]   [-10]
  [ 2  1 -5 ] [x2] = [ -8]
  [ 6 -3  1 ] [x3]   [  0]

Function 'solve' in R
1. Solve a linear system:
   x = solve(A, b)
2. Find the inverse matrix:
   A_inverse = solve(A)
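A minimal sketch of both uses, applied to the system on the previous slide (coefficients as reconstructed there):

# coefficient matrix and right-hand side of the system above
A = matrix(c(3, 1, -6,
             2, 1, -5,
             6, -3, 1), nrow = 3, byrow = TRUE)
b = c(-10, -8, 0)

x = solve(A, b)       # solves A x = b; here x = (-2, -4, 0)
A_inverse = solve(A)  # inverse of the coefficient matrix
A_inverse %*% b       # same solution, computed via the inverse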

Function 'eigen' in R
y = eigen(A, symmetric = TRUE or FALSE, only.values = TRUE or FALSE)
Eigenvalues:  y$values (y$val also works, by partial matching)
Eigenvectors: y$vectors (or y$vec)
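A small illustrative sketch (the matrix here is made up for demonstration):

# a 2 x 2 symmetric matrix with eigenvalues 3 and 1
S = matrix(c(2, 1,
             1, 2), nrow = 2, byrow = TRUE)
y = eigen(S, symmetric = TRUE)
y$values    # eigenvalues, largest first: 3 1
y$vectors   # corresponding eigenvectors, stored as columns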

Exercise
  x1 + x2 + x3 = -5
  -x1 + x3 = -3
  3x1 + x2 + x3 = -3
1. Solve the linear system
2. Find the inverse of the coefficient matrix
3. Compute the eigenvalues and eigenvectors of the coefficient matrix

Simple Linear Regression
Given a data set {(x_i, y_i), i = 1, ..., n} of n observations, where y_i is the dependent variable and x_i is the independent variable, the simple linear regression model is
  y_i = β0 + β1 x_i + ε_i,  i = 1, ..., n,
or, in matrix form,
  y = Xβ + ε,
where the ε_i are random error terms with mean zero.
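This connects back to the first half of the lecture: the least-squares estimates of β solve a small linear system, the normal equations (X'X)β = X'y, which 'solve' can handle directly. A minimal sketch, using the waste data from the example that follows:

x = c(0:4)
y = c(19358, 19484, 20293, 21499, 23561)
X = cbind(1, x)                            # design matrix: intercept column and x
beta_hat = solve(t(X) %*% X, t(X) %*% y)   # normal equations: (X'X) beta = X'y
beta_hat                                   # same estimates as coef(lm(y ~ x))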

Example
As Earth's population continues to grow, the solid waste generated by the population grows with it. Governments must plan for disposal and recycling of ever-growing amounts of solid waste. Planners can use data from the past to predict future waste generation and plan for enough facilities for disposing of and recycling the waste. Let 1990 be x = 0, so the years 1990-1994 correspond to x = 0, 1, 2, 3, 4.

1. Scatter Plots — Function 'plot'
x = c(0:4)
y = c(19358, 19484, 20293, 21499, 23561)
plot(x, y,
     main = 'Tons of Solid Waste Generated From 1990 to 1994',
     xlab = 'year',
     ylab = 'Tons of Solid Waste Generated (in thousands)',
     xlim = c(0, 4), ylim = c(19000, 25000))

(Figure: the resulting scatter plot of tons of solid waste generated vs. year.)

2. Fit Linear Regression Model — Function 'lm' in R
reg = lm(formula, data)
summary(reg)
In our example:
x = c(0:4)
y = c(19358, 19484, 20293, 21499, 23561)
reg = lm(y ~ x)
summary(reg)

> summary(reg)

Call:
lm(formula = y ~ x)

Residuals:
     1      2      3      4      5
 603.2 -312.9 -546.0 -382.1  637.8

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  18754.8      512.4  36.603 4.37e-05 ***
x             1042.1      209.2   4.982   0.0155 *
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 661.5 on 3 degrees of freedom
Multiple R-squared:  0.8922, Adjusted R-squared:  0.8562
F-statistic: 24.82 on 1 and 3 DF,  p-value: 0.0155

Hence, the function of best fit is y = 18754.8 + 1042.1x.

3. Graph the function of best fit with the scatterplot of the data — Function 'abline'
plot(x, y,
     main = 'Tons of Solid Waste Generated From 1990 to 1994',
     xlab = 'year',
     ylab = 'Tons of Solid Waste Generated (in thousands)',
     xlim = c(0, 4), ylim = c(19000, 25000))
abline(reg)

4. Prediction — Function 'predict' in R
Predict the average tons of waste in 2000 and 2005 (x = 10 and x = 15):
predict(reg, data.frame(x = c(10, 15)))
Result:
      1       2
29175.8 34386.3
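As a quick check, the same predictions follow by hand from the fitted line y = 18754.8 + 1042.1x:

18754.8 + 1042.1 * c(10, 15)   # 29175.8 and 34386.3 (thousand tons)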

Exercise
The Prestige data set (from the 'car' package) contains the following variables:
  Education: Average education of occupational incumbents, years, in 1971.
  Income: Average income of incumbents, dollars, in 1971.
  Women: Percentage of incumbents who are women.
  Prestige: Pineo-Porter prestige score for occupation, from a social survey conducted in the mid-1960s.
  Census: Canadian Census occupational code.
  Type: Type of occupation. A factor with levels (note: out of order): bc, Blue Collar; prof, Professional, Managerial, and Technical; wc, White Collar.

Exercise
Import the data:
library(car)
View(Prestige)
education = Prestige$education
prestige = Prestige$prestige
1. Make a scatterplot of the data, letting x represent education and y represent prestige.
2. Find the line that best fits the above measurements.
3. Graph the function of best fit with the scatterplot of the data.
4. With the function found in part 2, predict the average prestige when education = 16 and 17.