REGRESSION WITH FACTOR/QUALITATIVE VARIABLES
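The output on the following slides comes from a simulated data set, DataSimReg, with a numeric predictor x1, a two-level factor g (levels L and P, treatment-coded so that L is the baseline), and three responses y1, y2, y3, one per case. The residual degrees of freedom imply 60 observations. The slides do not show how the data were generated; the sketch below is only an illustration of a comparable simulation, and every coefficient, range, and error SD in it is an assumption, not the original values.

# Hypothetical reconstruction of DataSimReg (all numbers are illustrative assumptions)
set.seed(123)                                   # for reproducibility
n1 <- 30                                        # 30 rows per group, 60 in total
g  <- factor(rep(c("L", "P"), each = n1))       # two-level factor, L is the baseline level
x1 <- runif(2 * n1, min = 30, max = 70)         # numeric predictor
y1 <- 10 + 3 * x1 + rnorm(2 * n1, sd = 1.2)                                    # case I: g has no effect
y2 <- ifelse(g == "L", 52, -50) + 3 * x1 + rnorm(2 * n1, sd = 2)               # case II: intercepts differ, common slope
y3 <- ifelse(g == "L", 59 - 1.5 * x1, -66 + 2.4 * x1) + rnorm(2 * n1, sd = 2)  # case III: slopes differ as well
DataSimReg <- data.frame(x1, g, y1, y2, y3)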


CASE I: THE FACTOR HAS NO EFFECT

REGRESSION: CHECKING FOR INTERACTION

lm(formula = y1 ~ x1 * g, data = DataSimReg)

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept)  10.00000    1.23576   8.092 5.45e-11 ***
x1            3.00000    0.02475 121.230  < 2e-16 ***
g[T.P]       -0.65146    1.85766  -0.351    0.727
x1:g[T.P]     0.01779    0.03622   0.491    0.625
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.184 on 56 degrees of freedom
Multiple R-squared: 0.998,  Adjusted R-squared: 0.9979
F-statistic: 9476 on 3 and 56 DF,  p-value: < 2.2e-16
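In R, y1 ~ x1 * g is shorthand for y1 ~ x1 + g + x1:g. With treatment contrasts, the (Intercept) and x1 rows give the line for the baseline level L, while g[T.P] and x1:g[T.P] give the shifts in intercept and slope for level P. Here both shifts are small and non-significant (p = 0.727 and 0.625), so the two groups appear to share one line. A minimal sketch of this fit (the object name fit1_int is an assumption):

# Interaction model: allows each level of g its own intercept and slope
fit1_int <- lm(y1 ~ x1 * g, data = DataSimReg)   # identical to y1 ~ x1 + g + x1:g
summary(fit1_int)
# Implied line for level P: intercept 10.00000 - 0.65146, slope 3.00000 + 0.01779;
# these sums reappear directly in the g/x1 - 1 fit on the next slide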

SEPARATE REGRESSIONS

lm(formula = y1 ~ g/x1 - 1, data = DataSimReg)

Coefficients:
       Estimate Std. Error t value Pr(>|t|)
gL     10.00000    1.23576   8.092 5.45e-11 ***
gP      9.34854    1.38701   6.740 9.30e-09 ***
gL:x1   3.00000    0.02475 121.230  < 2e-16 ***
gP:x1   3.01779    0.02645 114.099  < 2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.184 on 56 degrees of freedom
Multiple R-squared: 1,  Adjusted R-squared: 0.9999
F-statistic: 2.864e+05 on 4 and 56 DF,  p-value: < 2.2e-16
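The nested formula y1 ~ g/x1 - 1 reparametrizes the same interaction model: instead of baseline-plus-shift coefficients it reports one intercept (gL, gP) and one slope (gL:x1, gP:x1) per group, which is why the residual standard error (1.184 on 56 df) is unchanged. The same point estimates could be obtained by fitting each group on its own, as in this sketch (object names are assumptions):

# Equivalent per-group fits; estimates match gL, gP, gL:x1, gP:x1 above
fit1_L <- lm(y1 ~ x1, data = subset(DataSimReg, g == "L"))
fit1_P <- lm(y1 ~ x1, data = subset(DataSimReg, g == "P"))
coef(fit1_L)   # intercept and slope for group L
coef(fit1_P)   # intercept and slope for group P
# Standard errors differ slightly because each sub-fit estimates its own error variance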

POOLED REGRESSION

lm(formula = y1 ~ x1, data = DataSimReg)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  9.60337    0.90543   10.61 3.35e-15 ***
x1           3.01053    0.01768  170.25  < 2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.173 on 58 degrees of freedom
Multiple R-squared: 0.998,  Adjusted R-squared: 0.998
F-statistic: 2.899e+04 on 1 and 58 DF,  p-value: < 2.2e-16
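For case I the pooled model y1 ~ x1 fits essentially as well as the interaction model (residual SE 1.173 vs 1.184, R-squared 0.998 in both), confirming that the factor is not needed. A formal check is the extra-sum-of-squares F-test comparing the nested models, sketched below (object names are assumptions):

# F-test: do the two extra terms (g and x1:g) improve on the single pooled line?
fit1_pool <- lm(y1 ~ x1,     data = DataSimReg)
fit1_int  <- lm(y1 ~ x1 * g, data = DataSimReg)
anova(fit1_pool, fit1_int)   # a large p-value supports keeping the pooled model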

CASE II: THE FACTOR HAS AN EFFECT, WITHOUT INTERACTION

CHECKING FOR INTERACTION

lm(formula = y2 ~ x1 * g, data = DataSimReg)

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)   52.38491    2.16798  24.163   <2e-16 ***
x1             2.95164    0.04341  67.988   <2e-16 ***
g[T.P]      -102.82667    3.25903 -31.551   <2e-16 ***
x1:g[T.P]      0.05375    0.06354   0.846    0.401
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 2.078 on 56 degrees of freedom
Multiple R-squared: 0.9985,  Adjusted R-squared: 0.9985
F-statistic: 1.28e+04 on 3 and 56 DF,  p-value: < 2.2e-16
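For y2 the picture changes: the intercept shift g[T.P] is large (about -103) and highly significant, but the slope shift x1:g[T.P] is not (p = 0.401). One way to test only the interaction term, shown as a sketch below with an assumed object name, is drop1(), which respects marginality and therefore considers dropping only x1:g from this model:

# Can the x1:g term be dropped from the interaction model?
fit2_int <- lm(y2 ~ x1 * g, data = DataSimReg)
drop1(fit2_int, test = "F")   # F-test for removing x1:g while keeping x1 and g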

FORCING SEPARATE REGRESSIONS

lm(formula = y2 ~ g/x1 - 1, data = DataSimReg)

Coefficients:
       Estimate Std. Error t value Pr(>|t|)
gL     52.38491    2.16798   24.16   <2e-16 ***
gP    -50.44177    2.43334  -20.73   <2e-16 ***
gL:x1   2.95164    0.04341   67.99   <2e-16 ***
gP:x1   3.00539    0.04640   64.77   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 2.078 on 56 degrees of freedom
Multiple R-squared: 0.9998,  Adjusted R-squared: 0.9998
F-statistic: 8.923e+04 on 4 and 56 DF,  p-value: < 2.2e-16

PARALLEL REGRESSIONS

lm(formula = y2 ~ g + x1 - 1, data = DataSimReg)

Coefficients:
    Estimate Std. Error t value Pr(>|t|)
gL  51.15128    1.60017   31.97   <2e-16 ***
gP -48.95706    1.68119  -29.12   <2e-16 ***
x1   2.97673    0.03162   94.13   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 2.072 on 57 degrees of freedom
Multiple R-squared: 0.9998,  Adjusted R-squared: 0.9998
F-statistic: 1.196e+05 on 3 and 57 DF,  p-value: < 2.2e-16
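Dropping the non-significant interaction leaves the parallel-lines (ANCOVA) model: one common slope for x1 and a separate intercept per group. The -1 in y2 ~ g + x1 - 1 removes the overall intercept so that gL and gP are the two group intercepts directly; with the default coding, lm(y2 ~ g + x1) would instead report the baseline intercept (51.15128) and the P-minus-L difference (-48.95706 - 51.15128 = -100.10834). A sketch of both parametrizations and of the test against the interaction model (object names are assumptions):

# Parallel (ANCOVA) model in two equivalent parametrizations
fit2_par  <- lm(y2 ~ g + x1 - 1, data = DataSimReg)  # coefficients: gL, gP, x1 (as on the slide)
fit2_par2 <- lm(y2 ~ g + x1,     data = DataSimReg)  # coefficients: baseline intercept, P-minus-L shift, x1
fit2_int  <- lm(y2 ~ x1 * g,     data = DataSimReg)  # interaction model from the earlier slide
coef(fit2_par)
coef(fit2_par2)
anova(fit2_par, fit2_int)   # small F, large p: the slope shift adds nothing beyond parallel lines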

POOLED REGRESSION

lm(formula = y2 ~ x1, data = DataSimReg)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  46.4770    38.8685   1.196  0.23666
x1            2.0778     0.7591   2.737  0.00821 **
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 50.35 on 58 degrees of freedom
Multiple R-squared: 0.1144,  Adjusted R-squared: 0.09914
F-statistic: 7.493 on 1 and 58 DF,  p-value: 0.008211
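Ignoring the factor entirely is costly here: R-squared collapses from 0.9998 to 0.1144 and the residual SE jumps from about 2.1 to 50.4, because the two groups lie on parallel lines roughly 100 units apart. The corresponding F-test is sketched below (object names are assumptions):

# Does the group term matter? Compare the pooled and parallel fits
fit2_pool <- lm(y2 ~ x1,         data = DataSimReg)
fit2_par  <- lm(y2 ~ g + x1 - 1, data = DataSimReg)
anova(fit2_pool, fit2_par)   # a very large F indicates the factor must stay in the model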

CASE III: THE FACTOR INTERACTS WITH THE PREDICTOR

CHECKING FOR INTERACTION

lm(formula = y3 ~ g * x1, data = DataSimReg)

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)   59.16291    2.08027   28.44   <2e-16 ***
g[T.P]      -125.53198    3.12719  -40.14   <2e-16 ***
x1            -1.48226    0.04166  -35.58   <2e-16 ***
g[T.P]:x1      3.90656    0.06097   64.07   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.994 on 56 degrees of freedom
Multiple R-squared: 0.9977,  Adjusted R-squared: 0.9976
F-statistic: 8101 on 3 and 56 DF,  p-value: < 2.2e-16
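For y3 the interaction term itself is large and highly significant, so the two groups need different slopes. The per-group lines can be read off the interaction fit: level L has slope -1.48226, and level P has slope -1.48226 + 3.90656 = 2.42430, exactly the gP:x1 value reported by the g/x1 - 1 fit on the next slide. A sketch of this bookkeeping (the object name is an assumption):

# Recover the two group-specific lines from the interaction parametrization
fit3_int <- lm(y3 ~ g * x1, data = DataSimReg)
b <- coef(fit3_int)          # order as on the slide: (Intercept), g[T.P], x1, g[T.P]:x1
c(intercept_L = unname(b[1]), slope_L = unname(b[3]),
  intercept_P = unname(b[1] + b[2]), slope_P = unname(b[3] + b[4]))
# slope_P = -1.48226 + 3.90656 = 2.42430; intercept_P = 59.16291 - 125.53198 = -66.36907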

SEPARATE REGRESSIONS

lm(formula = y3 ~ g/x1 - 1, data = DataSimReg)

Coefficients:
       Estimate Std. Error t value Pr(>|t|)
gL     59.16291    2.08027   28.44   <2e-16 ***
gP    -66.36907    2.33490  -28.43   <2e-16 ***
gL:x1  -1.48226    0.04166  -35.58   <2e-16 ***
gP:x1   2.42430    0.04452   54.45   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.994 on 56 degrees of freedom
Multiple R-squared: 0.9983,  Adjusted R-squared: 0.9981
F-statistic: 8028 on 4 and 56 DF,  p-value: < 2.2e-16

PARALLEL REGRESSIONS

lm(formula = y3 ~ g + x1 - 1, data = DataSimReg)

Coefficients:
    Estimate Std. Error t value Pr(>|t|)
gL  -30.4926    13.1515  -2.319  0.02403 *
gP   41.5335    13.8174   3.006  0.00393 **
x1    0.3412     0.2599   1.313  0.19446
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 17.03 on 57 degrees of freedom
Multiple R-squared: 0.8707,  Adjusted R-squared: 0.8639
F-statistic: 127.9 on 3 and 57 DF,  p-value: < 2.2e-16

POOLED REGRESSION

lm(formula = y3 ~ x1, data = DataSimReg)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) -27.1295    30.8332  -0.880    0.383
x1            0.9880     0.6022   1.641    0.106

Residual standard error: 39.94 on 58 degrees of freedom
Multiple R-squared: 0.04436,  Adjusted R-squared: 0.02788
F-statistic: 2.692 on 1 and 58 DF,  p-value: 0.1063
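Because the two groups have slopes of opposite sign, both the parallel and the pooled fits break down for y3: R-squared drops to 0.8707 and then to 0.04436, and the pooled slope of 0.9880 describes neither group. Plotting the data with the group-specific lines makes this visible; a minimal sketch (graphical choices and object names are assumptions):

# Scatter plot of y3 vs x1 with one fitted line per level of g
with(DataSimReg, plot(x1, y3, pch = ifelse(g == "L", 1, 17),
                      col = ifelse(g == "L", "blue", "red"),
                      xlab = "x1", ylab = "y3"))
fit3_sep <- lm(y3 ~ g/x1 - 1, data = DataSimReg)
b <- coef(fit3_sep)                                  # gL, gP, gL:x1, gP:x1
abline(a = b["gL"], b = b["gL:x1"], col = "blue")    # line for group L
abline(a = b["gP"], b = b["gP:x1"], col = "red")     # line for group P
legend("topleft", legend = c("L", "P"), col = c("blue", "red"), pch = c(1, 17))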