CPE 619 One Factor Experiments Aleksandar Milenković The LaCASA Laboratory Electrical and Computer Engineering Department The University of Alabama in Huntsville


2 Overview: Computation of Effects; Estimating Experimental Errors; Allocation of Variation; ANOVA Table and F-Test; Visual Diagnostic Tests; Confidence Intervals for Effects; Unequal Sample Sizes

3 One Factor Experiments Used to compare alternatives of a single categorical variable, e.g., several processors or several caching schemes

4 Computation of Effects Model: y_ij = μ + α_j + e_ij, where μ is the grand mean, α_j is the effect of the jth alternative (with Σ_j α_j = 0), and e_ij is the experimental error. Estimates: μ = mean of all observations; α_j = (mean of the jth column) − μ

5 Computation of Effects (Cont)
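As a concrete sketch of the effect computation, the following Python fragment computes the grand mean and the effects for a small one-factor design; the 3×3 data values are made up purely for illustration.

```python
# One-factor design: data[i][j] holds observation i (replication) under
# alternative j. The values are hypothetical, purely for illustration.
data = [
    [144, 120, 176],
    [120, 121, 185],
    [176, 112, 201],
]

r = len(data)        # replications per alternative
a = len(data[0])     # number of alternatives

# Grand mean mu and effects alpha_j = (column mean) - mu
mu = sum(sum(row) for row in data) / (a * r)
col_means = [sum(row[j] for row in data) / r for j in range(a)]
effects = [m - mu for m in col_means]

# By construction the effects sum to zero (up to floating-point error)
assert abs(sum(effects)) < 1e-9
```

With equal sample sizes the effects always sum to zero, which the final assertion checks.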

6 Example 20.1: Code Size Comparison Entries in a row are unrelated (otherwise, a two-factor analysis would be needed)

7 Example 20.1: Code Size (cont'd)

8 Example 20.1: Interpretation The grand mean gives the number of bytes of storage an average processor requires. The effects of the processors R, V, and Z are -13.3, -24.5, and 37.7, respectively. That is, R requires 13.3 bytes less than an average processor, V requires 24.5 bytes less than an average processor, and Z requires 37.7 bytes more than an average processor.

9 Estimating Experimental Errors Estimated response for the jth alternative: ŷ_ij = μ + α_j. Error: e_ij = y_ij − ŷ_ij. Sum of squared errors: SSE = Σ_{i,j} e_ij²
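These definitions translate directly into code; a minimal sketch, with invented data values:

```python
# Estimated response, residuals, and SSE for a one-factor design.
# data[i][j]: observation i under alternative j (hypothetical values).
data = [
    [144, 120, 176],
    [120, 121, 185],
    [176, 112, 201],
]
r, a = len(data), len(data[0])

mu = sum(sum(row) for row in data) / (a * r)
alpha = [sum(row[j] for row in data) / r - mu for j in range(a)]

# y_hat_ij = mu + alpha_j;  e_ij = y_ij - y_hat_ij
residuals = [[data[i][j] - (mu + alpha[j]) for j in range(a)]
             for i in range(r)]
sse = sum(e * e for row in residuals for e in row)
```

Note that within each column the residuals sum to zero, since mu + alpha[j] is exactly the column mean.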

10 Example 20.2

11 Allocation of Variation

12 Allocation of Variation (cont'd) Total variation of y (SST): SST = Σ_{i,j} (y_ij − μ)² = SSA + SSE, where SSA = r Σ_j α_j² is the variation explained by the factor and SSE is the unexplained (error) variation
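The allocation of variation can be verified numerically; a sketch with hypothetical data:

```python
# Allocation of variation: SST = SSA + SSE (hypothetical data).
data = [
    [144, 120, 176],
    [120, 121, 185],
    [176, 112, 201],
]
r, a = len(data), len(data[0])

mu = sum(sum(row) for row in data) / (a * r)
alpha = [sum(row[j] for row in data) / r - mu for j in range(a)]

sst = sum((y - mu) ** 2 for row in data for y in row)   # total variation
ssa = r * sum(aj * aj for aj in alpha)                  # explained by the factor
sse = sum((data[i][j] - mu - alpha[j]) ** 2
          for i in range(r) for j in range(a))          # unexplained (errors)

explained_pct = 100 * ssa / sst
# sst equals ssa + sse up to floating-point error
```

The ratio SSA/SST is the fraction of variation attributed to the factor, as in Example 20.3.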

13 Example 20.3

14 Example 20.3 (cont'd) 89.6% of the variation in code size is due to experimental errors (programmer differences). Is the remaining 10.4% statistically significant?

15 Analysis of Variance (ANOVA) Importance ≠ significance. Important ⇒ explains a high percentage of the variation. Significant ⇒ contribution to the variation is high compared to that of the errors. Degrees of freedom = number of independent values required to compute the sum of squares. Note that the degrees of freedom also add up.

16 F-Test Purpose: to check if SSA is significantly greater than SSE. If the errors are normally distributed, SSE and SSA have chi-square distributions. The ratio (SSA/ν_A)/(SSE/ν_e) has an F distribution, where ν_A = a−1 is the degrees of freedom for SSA and ν_e = a(r−1) is the degrees of freedom for SSE. If the computed ratio > F[1−α; ν_A, ν_e], SSA is significantly higher than SSE. SSA/ν_A is called the mean square of A (MSA); similarly, MSE = SSE/ν_e.
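The F-statistic itself is simple arithmetic; a sketch with hypothetical data (the critical value F[1−α; a−1, a(r−1)] must still come from a table or a stats library, it is not computed here):

```python
# F-statistic for the one-factor ANOVA (hypothetical data).
data = [
    [144, 120, 176],
    [120, 121, 185],
    [176, 112, 201],
]
r, a = len(data), len(data[0])

mu = sum(sum(row) for row in data) / (a * r)
alpha = [sum(row[j] for row in data) / r - mu for j in range(a)]
ssa = r * sum(aj * aj for aj in alpha)
sse = sum((data[i][j] - mu - alpha[j]) ** 2
          for i in range(r) for j in range(a))

msa = ssa / (a - 1)          # mean square of A
mse = sse / (a * (r - 1))    # mean square error
f_ratio = msa / mse          # compare against F[1-alpha; a-1, a*(r-1)]
```

If f_ratio exceeds the tabulated F quantile, the factor's contribution is significant at that confidence level.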

17 ANOVA Table for One Factor Experiments Component A (factor): sum of squares SSA, ν_A = a−1 degrees of freedom, mean square MSA = SSA/ν_A. Errors: sum of squares SSE, ν_e = a(r−1) degrees of freedom, MSE = SSE/ν_e. Computed F = MSA/MSE is compared against F[1−α; ν_A, ν_e].

18 Example 20.4: Code Size Comparison The computed F-value is less than the F value from the table ⇒ the variation in the code sizes is mostly due to experimental errors and not to any significant difference among the processors

19 Visual Diagnostic Tests Assumptions: 1. Factor effects are additive 2. Errors are additive 3. Errors are independent of the factor levels 4. Errors are normally distributed 5. Errors have the same variance for all factor levels. Tests: residuals versus predicted response show no trend ⇒ independence (if the scale of the errors is much smaller than the scale of the response, visible trends can be ignored); normal quantile-quantile plot is linear ⇒ normality
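The points behind both plots can be computed in a few lines; a sketch with hypothetical data (the plotting itself is left to any charting tool):

```python
# Residual-vs-predicted pairs (look for no trend) and normal Q-Q points
# (look for a straight line) for a one-factor design. Hypothetical data.
from statistics import NormalDist

data = [
    [144, 120, 176],
    [120, 121, 185],
    [176, 112, 201],
]
r, a = len(data), len(data[0])

mu = sum(sum(row) for row in data) / (a * r)
alpha = [sum(row[j] for row in data) / r - mu for j in range(a)]

pairs = [(mu + alpha[j], data[i][j] - (mu + alpha[j]))   # (predicted, residual)
         for i in range(r) for j in range(a)]

resid = sorted(e for _, e in pairs)
n = len(resid)
qq = [(NormalDist().inv_cdf((i + 0.5) / n), resid[i])    # (normal quantile, residual)
      for i in range(n)]
```

If the qq points lie near a straight line, the normality assumption is plausible; an S-shape indicates tails shorter or longer than normal.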

20 Example 20.5 Horizontal and vertical scales are similar ⇒ residuals are not small ⇒ variation due to factors is small compared to the unexplained variation. No visible trend in the spread. The Q-Q plot is S-shaped ⇒ the error distribution has shorter tails than normal.

21 Confidence Intervals For Effects Estimates are random variables. For the confidence intervals, use t values at a(r−1) degrees of freedom. Mean responses: μ + α_j. Contrasts Σ_j h_j α_j with Σ_j h_j = 0: use, for example, for α_1 − α_2
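A sketch of the confidence-interval arithmetic with hypothetical data; the standard deviation of an effect is s_e·sqrt((a−1)/(a·r)), and the t quantile below is read from a table rather than computed:

```python
# Confidence intervals for the effects of a one-factor design.
# Hypothetical data; the t quantile is an assumed table lookup.
data = [
    [144, 120, 176],
    [120, 121, 185],
    [176, 112, 201],
]
r, a = len(data), len(data[0])

mu = sum(sum(row) for row in data) / (a * r)
alpha = [sum(row[j] for row in data) / r - mu for j in range(a)]
sse = sum((data[i][j] - mu - alpha[j]) ** 2
          for i in range(r) for j in range(a))

s_e = (sse / (a * (r - 1))) ** 0.5            # std dev of errors
s_alpha = s_e * ((a - 1) / (a * r)) ** 0.5    # std dev of an effect
t = 1.943                                     # t[0.95; 6] from a table (90% CI)
cis = [(aj - t * s_alpha, aj + t * s_alpha) for aj in alpha]
# An effect is significant at this level only if its interval excludes zero.
```

The same pattern applies to contrasts, with the standard deviation scaled by sqrt(Σ h_j²/r).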

22 Example 20.6: Code Size Comparison

23 Example 20.6 (cont'd) For 90% confidence, t[0.95; 12] = 1.782. 90% confidence intervals: the code size on an average processor is significantly different from zero; the processor effects are not significant

24 Example 20.6 (cont'd) Using h_1 = 1, h_2 = −1, h_3 = 0 (Σ h_j = 0): the confidence interval includes zero ⇒ neither processor is superior to the other

25 Example 20.6 (cont'd) Similarly, no one processor is superior to another

26 Unequal Sample Sizes By definition: Σ_j r_j α_j = 0. Here, r_j is the number of observations at the jth level, and N = Σ_j r_j is the total number of observations
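With unequal sample sizes the grand mean is taken over all N observations, and the weighted constraint then holds by construction; a sketch with invented data:

```python
# Effects with unequal sample sizes (hypothetical data): r_j observations
# at level j; mu is the mean of all N observations, and the effects
# satisfy sum_j r_j * alpha_j = 0.
groups = [
    [144, 120, 176, 160],   # level 1: r_1 = 4
    [120, 121],             # level 2: r_2 = 2
    [176, 185, 201],        # level 3: r_3 = 3
]

r = [len(g) for g in groups]
N = sum(r)
mu = sum(sum(g) for g in groups) / N
alpha = [sum(g) / len(g) - mu for g in groups]

weighted = sum(rj * aj for rj, aj in zip(r, alpha))
assert abs(weighted) < 1e-9   # weighted effects sum to zero by construction
```

Note that with unequal r_j the unweighted sum of effects generally does not vanish; only the r_j-weighted sum does.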

27 Parameter Estimation

28 Analysis of Variance

29 Example 20.7: Code Size Comparison All means are obtained by dividing the sum by the number of observations added. The column effects are 2.15, 13.75, and a third effect that follows from the constraint Σ_j r_j α_j = 0

30 Example 20.7: Analysis of Variance

31 Example 20.7 ANOVA (cont'd) Sums of squares and degrees of freedom:

32 Example 20.7 ANOVA Table Conclusion: variation due to processors is insignificant compared with that due to modeling errors

33 Example 20.7: Standard Deviation of Effects Consider the effect of processor Z, α_3. Since α_3 is a linear combination of the observations y_ij, the error in α_3 is the same linear combination of the errors in the terms on the right-hand side. The e_ij's are normally distributed ⇒ α_3 is normal, with variance given by that linear combination of the error variances
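The resulting formula, std(α_j) = σ·sqrt((a−1)/(a·r)), can be checked numerically; the following small simulation uses assumed parameters (a = 3, r = 5, σ = 1) and zero true effects:

```python
# Monte Carlo check of the variance of an estimated effect. With errors
# e_ij ~ N(0, sigma^2) and no true effects, the estimated effect for one
# level should have standard deviation sigma * sqrt((a-1)/(a*r)).
import random

random.seed(0)
a, r, sigma = 3, 5, 1.0   # assumed parameters for the simulation
trials = 20000

estimates = []
for _ in range(trials):
    errs = [[random.gauss(0.0, sigma) for _ in range(a)] for _ in range(r)]
    grand = sum(sum(row) for row in errs) / (a * r)
    col = sum(row[a - 1] for row in errs) / r   # column mean for the last level
    estimates.append(col - grand)               # estimated effect (true value 0)

mean_est = sum(estimates) / trials
emp_sd = (sum((e - mean_est) ** 2 for e in estimates) / trials) ** 0.5
theo_sd = sigma * ((a - 1) / (a * r)) ** 0.5    # sqrt(2/15) for a=3, r=5
# emp_sd should agree with theo_sd to within sampling noise
```

This is only a numerical sanity check of the derivation, not part of the analysis procedure itself.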

34 Summary Model for one-factor experiments; computation of effects; allocation of variation and degrees of freedom; ANOVA table; standard deviation of errors; confidence intervals for effects and contrasts; model assumptions and visual tests

35 Exercise 20.1 For a single-factor design, suppose we want to write an expression for α_j in terms of the y_ij's. What are the values of the coefficients a_ij? From this expression, the error in α_j is seen to be the same combination of the e_ij's. Assuming the errors e_ij are normally distributed with zero mean and variance σ_e², write an expression for the variance of the error in α_j. Verify that your answer matches that in Table 20.5.

36 An Example Analyze the following one-factor experiment: 1. Compute the effects 2. Prepare the ANOVA table 3. Compute confidence intervals for the effects and interpret them 4. Compute the confidence interval for α_1 − α_3 5. Show graphs for the visual tests and interpret them