Design and Analysis of Experiments


Design and Analysis of Experiments Dr. Tai-Yue Wang Department of Industrial and Information Management National Cheng Kung University Tainan, TAIWAN, ROC This is a basic course blah, blah, blah…

Analysis of Variance Dr. Tai-Yue Wang Department of Industrial and Information Management National Cheng Kung University Tainan, TAIWAN, ROC This is a basic course blah, blah, blah…

Outline (1/2) Example The ANOVA Analysis of the Fixed Effects Model Model Adequacy Checking Practical Interpretation of Results Determining Sample Size

Outline (2/2) Discovering Dispersion Effects Nonparametric Methods in the ANOVA

What If There Are More Than Two Factor Levels? The t-test does not directly apply There are lots of practical situations where there are either more than two levels of interest, or there are several factors of simultaneous interest The ANalysis Of VAriance (ANOVA) is the appropriate analysis “engine” for these types of experiments The ANOVA was developed by Fisher in the early 1920s, and initially applied to agricultural experiments Used extensively today for industrial experiments

An Example(1/6)

An Example(2/6) An engineer is interested in investigating the relationship between the RF power setting and the etch rate for this tool. The objective of an experiment like this is to model the relationship between etch rate and RF power, and to specify the power setting that will give a desired target etch rate. The response variable is etch rate.

An Example(3/6) She is interested in a particular gas (C2F6) and gap (0.80 cm), and wants to test four levels of RF power: 160W, 180W, 200W, and 220W, with five wafers tested at each power level. Each of the 4 levels is thus replicated 5 times, with the 20 runs made in random order.

An Example --Data

An Example – Data Plot Data: Etch-Rate.mtw Graph -> Boxplot, Scatterplot

An Example--Questions Does changing the power change the mean etch rate? Is there an optimum level for power? We would like to have an objective way to answer these questions The t-test really doesn’t apply here – more than two factor levels

The Analysis of Variance In general, there will be a levels of the factor, or a treatments, and n replicates of the experiment, run in random order…a completely randomized design (CRD) N = a·n total runs We consider the fixed effects case…the random effects case will be discussed later Objective is to test hypotheses about the equality of the a treatment means

The Analysis of Variance

The Analysis of Variance The name “analysis of variance” stems from a partitioning of the total variability in the response variable into components that are consistent with a model for the experiment

The Analysis of Variance The basic single-factor ANOVA model is yij = μ + τi + εij, i = 1, 2, …, a, j = 1, 2, …, n, where μ is the overall mean, τi is the ith treatment effect, and εij is a random error component with mean zero and variance σ2

Models for the Data There are several ways to write a model for the data Means model: yij = μi + εij, where μi = μ + τi is the mean of the ith treatment Also known as one-way or single-factor ANOVA

Models for the Data Fixed or random factor? The a treatments could have been specifically chosen by the experimenter. In this case, the results may apply only to the levels considered in the analysis → fixed effects models

Models for the Data The a treatments could be a random sample from a larger population of treatments. In this case, we should be able to extend the conclusions to all treatments in the population → random effects models

Analysis of the Fixed Effects Model Recall the single-factor ANOVA for the fixed effects model, yij = μ + τi + εij. Define the treatment totals and averages yi. = Σj yij and ȳi. = yi./n, and the grand total and average y.. = Σi Σj yij and ȳ.. = y../N

Analysis of the Fixed Effects Model Hypotheses: H0: μ1 = μ2 = … = μa versus H1: μi ≠ μj for at least one pair (i, j)

Analysis of the Fixed Effects Model Thus, the equivalent hypotheses: H0: τ1 = τ2 = … = τa = 0 versus H1: τi ≠ 0 for at least one i

Analysis of the Fixed Effects Model-Decomposition Total variability is measured by the total sum of squares SST = Σi Σj (yij − ȳ..)2 The basic ANOVA partitioning is Σi Σj (yij − ȳ..)2 = n Σi (ȳi. − ȳ..)2 + Σi Σj (yij − ȳi.)2

Analysis of the Fixed Effects Model-Decomposition In detail, write (yij − ȳ..) = (ȳi. − ȳ..) + (yij − ȳi.) and square; the cross-product term vanishes (= 0) because Σj (yij − ȳi.) = 0 for each treatment i

Analysis of the Fixed Effects Model-Decomposition Thus SST = SSTreatments + SSE, with N − 1 = (a − 1) + (N − a) degrees of freedom

Analysis of the Fixed Effects Model-Decomposition A large value of SSTreatments reflects large differences in treatment means A small value of SSTreatments likely indicates no differences in treatment means Formal statistical hypotheses are:

Analysis of the Fixed Effects Model-Decomposition For SSE, recall that SSE = Σi Σj (yij − ȳi.)2 = Σi (n − 1)Si2, where Si2 is the sample variance in the ith treatment

Analysis of the Fixed Effects Model-Decomposition Combining the a sample variances, SSE/(N − a) = [(n − 1)S12 + (n − 1)S22 + … + (n − 1)Sa2] / [(n − 1) + (n − 1) + … + (n − 1)] is a pooled estimate of the common variance within each of the a treatments

Analysis of the Fixed Effects Model-Mean Squares Define the mean squares MSTreatments = SSTreatments/(a − 1), with a − 1 df, and MSE = SSE/(N − a), with N − a df

Analysis of the Fixed Effects Model-Mean Squares Taking expectations, E(MSE) = σ2 and E(MSTreatments) = σ2 + n Σi τi2/(a − 1). That is, MSE estimates σ2, and if there are no differences in treatment means (all τi = 0), MSTreatments also estimates σ2

Analysis of the Fixed Effects Model-Statistical Analysis Cochran's Theorem: Let Zi be NID(0,1) for i = 1, 2, …, ν and suppose Σi Zi2 = Q1 + Q2 + … + Qs, where s ≤ ν and Qi has νi degrees of freedom (i = 1, 2, …, s). Then Q1, Q2, …, Qs are independent chi-square random variables with ν1, ν2, …, νs degrees of freedom, respectively, if and only if ν = ν1 + ν2 + … + νs

Analysis of the Fixed Effects Model-Statistical Analysis Cochran's Theorem implies that SSTreatments/σ2 and SSE/σ2 are independently distributed chi-square random variables (with a − 1 and N − a df under H0) Thus, if the null hypothesis is true, the ratio F0 = MSTreatments/MSE is distributed as F with a-1 and N-a degrees of freedom


Analysis of the Fixed Effects Models-- Summary Table The reference distribution for F0 is the F(a − 1, a(n − 1)) distribution Reject the null hypothesis (equal treatment means) if F0 > Fα, a−1, a(n−1)
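The whole computation above fits in a few lines. A minimal sketch in Python, assuming the etch-rate data of the running example (a = 4 power levels, n = 5 wafers each; the values below are taken as the Etch-Rate.mtw data):

```python
# One-way fixed-effects ANOVA via the sum-of-squares decomposition.
# Assumed data: etch rate at four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
a = len(data)                        # number of treatments
n = len(next(iter(data.values())))   # replicates per treatment
N = a * n                            # total runs

grand_mean = sum(y for ys in data.values() for y in ys) / N
means = {p: sum(ys) / n for p, ys in data.items()}

# Partition: SS_T = SS_Treatments + SS_E
ss_total = sum((y - grand_mean) ** 2 for ys in data.values() for y in ys)
ss_treat = n * sum((m - grand_mean) ** 2 for m in means.values())
ss_error = sum((y - means[p]) ** 2 for p, ys in data.items() for y in ys)

# Mean squares and the test statistic
ms_treat = ss_treat / (a - 1)
ms_error = ss_error / (N - a)
f0 = ms_treat / ms_error
print(f"SS_Treat={ss_treat:.1f}  SS_E={ss_error:.1f}  F0={f0:.2f}")
```

The printed sums of squares and F0 agree with the Minitab ANOVA table shown later in the slides (SS for Power ≈ 66871, SS for Error ≈ 5339, F0 ≈ 66.80); since F0 far exceeds the critical value, power clearly affects the mean etch rate.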

Analysis of the Fixed Effects Models-- Example Recall the example of etch rate (Etch-Rate.mtw) Hypotheses: H0: μ1 = μ2 = μ3 = μ4 versus H1: at least one mean is different

Analysis of the Fixed Effects Models-- Example ANOVA table

Analysis of the Fixed Effects Models-- Example Rejection region: reject H0 if F0 > F0.05, 3, 16 = 3.24; here F0 = 66.80, so H0 is rejected

Analysis of the Fixed Effects Models-- Example P-value: the P-value for F0 = 66.80 on (3, 16) df is less than 0.001 (Minitab reports 0.000), so H0 is rejected at any reasonable significance level

Analysis of the Fixed Effects Models-- Example Minitab Data: Etch-Rate.mtw Stat -> ANOVA -> One-Way Graphs: Residual plots -> Check four-in-one

Analysis of the Fixed Effects Models-- Example Minitab

One-way ANOVA: Etch Rate versus Power

Method
Null hypothesis         All means are equal
Alternative hypothesis  At least one mean is different
Significance level      α = 0.05
Equal variances were assumed for the analysis.

Factor Information
Factor  Levels  Values
Power        4  160, 180, 200, 220

Analysis of Variance
Source  DF  Adj SS   Adj MS  F-Value  P-Value
Power    3   66871  22290.2    66.80    0.000
Error   16    5339    333.7
Total   19   72210

Model Summary
      S    R-sq  R-sq(adj)  R-sq(pred)
18.2675  92.61%     91.22%      88.45%

Means
Power  N    Mean  StDev            95% CI
160    5  551.20  20.02  (533.88, 568.52)
180    5  587.40  16.74  (570.08, 604.72)
200    5  625.40  20.53  (608.08, 642.72)
220    5  707.00  15.25  (689.68, 724.32)

Pooled StDev = 18.2675

Analysis of the Fixed Effects Model-Statistical Analysis Coding the observations will not change the results Without the normality assumption, the ANOVA F test can be viewed as an approximation to the randomization test

Analysis of the Fixed Effects Model- Estimation of the model parameters Reasonable estimates of the overall mean and the treatment effects for the single-factor model are given by μ̂ = ȳ.. and τ̂i = ȳi. − ȳ.., i = 1, 2, …, a

Analysis of the Fixed Effects Model- Estimation of the model parameters A 100(1 − α)% confidence interval for μi: ȳi. ± tα/2, N−a √(MSE/n) A 100(1 − α)% confidence interval for μi − μj: ȳi. − ȳj. ± tα/2, N−a √(2MSE/n)
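As a sketch, the 95% intervals for each μi follow directly from this formula (data assumed from the etch-rate example; the quantile t0.025,16 = 2.120 is hard-coded from a t table, since Python's standard library has no t inverse-CDF):

```python
import math

# Assumed etch-rate data: four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
a, n = len(data), 5
N = a * n
means = {p: sum(ys) / n for p, ys in data.items()}
mse = sum((y - means[p]) ** 2 for p, ys in data.items() for y in ys) / (N - a)

t = 2.120                               # t_{0.025, 16} from a t table
half = t * math.sqrt(mse / n)           # half-width of each interval
ci = {p: (m - half, m + half) for p, m in means.items()}
for p, (lo, hi) in sorted(ci.items()):
    print(f"{p}W: ({lo:.2f}, {hi:.2f})")
```

These reproduce the 95% CIs in the Minitab Means table, e.g. (533.88, 568.52) for 160W.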

Analysis of the Fixed Effects Model- Unbalanced data For unbalanced data, with ni observations in treatment i and N = Σi ni, the sums of squares become SST = Σi Σj yij2 − y..2/N and SSTreatments = Σi yi.2/ni − y..2/N

A little (very little) humor…

Model Adequacy Checking Assumptions on the model: errors are normally and independently distributed with mean zero and constant but unknown variance σ2 Define the residual eij = yij − ŷij, where ŷij = ȳi. is the estimate of yij
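A quick sketch of the residual computation (data assumed from the etch-rate example): within each treatment the residuals sum to zero, and √MSE reproduces Minitab's pooled StDev.

```python
import math

# Assumed etch-rate data: four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
means = {p: sum(ys) / len(ys) for p, ys in data.items()}

# e_ij = y_ij - ybar_i.  (the fitted value is the treatment average)
residuals = {p: [y - means[p] for y in ys] for p, ys in data.items()}

n_obs = sum(len(ys) for ys in data.values())
mse = sum(e * e for es in residuals.values() for e in es) / (n_obs - len(data))
s = math.sqrt(mse)                      # pooled standard deviation
print(f"S = {s:.4f}")
```

These residuals are what the normal probability plot, run-order plot, and residuals-versus-fitted plot on the following slides are built from; S ≈ 18.2675 matches the Minitab model summary.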

Model Adequacy Checking --Normality Normal probability plot

Model Adequacy Checking --Normality Four-in-one

Model Adequacy Checking --Plot of residuals in time sequence Residuals vs run order

Model Adequacy Checking --Residuals vs fitted

Model Adequacy Checking --Residuals vs fitted Defect patterns: horn (funnel) shape, moon shape Test for equal variances: Bartlett's test

Model Adequacy Checking --Residuals vs fitted Test for equal variances Bartlett’s test Stat ->ANOVA -> Test for Equal Variances

Model Adequacy Checking --Residuals vs fitted Test for equal variances

Test for Equal Variances: Etch Rate versus Power

Method
Null hypothesis         All variances are equal
Alternative hypothesis  At least one variance is different
Significance level      α = 0.05

95% Bonferroni Confidence Intervals for Standard Deviations
Power  N    StDev                  CI
160    5  20.0175  (8.56240,  93.509)
180    5  16.7422  (4.73551, 118.274)
200    5  20.5256  (7.31210, 115.128)
220    5  15.2480  (4.44931, 104.415)
Individual confidence level = 98.75%

Tests
Method                Test Statistic  P-Value
Multiple comparisons               —    0.898
Levene                          0.20    0.898

Model Adequacy Checking --Residuals vs fitted Test for equal variances Bartlett’s test

Model Adequacy Checking --Residuals vs fitted Variance-stabilizing transformations deal with non-constant variance If observations follow a Poisson distribution → square root transformation If observations follow a lognormal distribution → logarithmic transformation

Model Adequacy Checking --Residuals vs fitted Variance-stabilizing transformation If observations are binomial data → arcsine transformation Other transformations → examine the relationship between the observations' standard deviation and mean

Practical Interpretation of Results – Regression Model The one-way ANOVA model is a regression model, analogous to the straight-line fit y = β0 + β1x + ε Stat -> Regression -> Regression -> Fit Regression Model

Practical Interpretation of Results – Regression Model Computer Results

Regression Analysis: Etch Rate versus Power

Analysis of Variance
Source         DF  Adj SS   Adj MS  F-Value  P-Value
Regression      1   63857  63857.3   137.62    0.000
  Power         1   63857  63857.3   137.62    0.000
Error          18    8352    464.0
  Lack-of-Fit   2    3013   1506.6     4.51    0.028
  Pure Error   16    5339    333.7
Total          19   72210

Model Summary
      S    R-sq  R-sq(adj)  R-sq(pred)
21.5413  88.43%     87.79%      85.73%

Coefficients
Term       Coef  SE Coef  T-Value  P-Value   VIF
Constant  137.6     41.2     3.34    0.004
Power     2.527    0.215    11.73    0.000  1.00

Regression Equation
Etch Rate = 137.6 + 2.527 Power

Fits and Diagnostics for Unusual Observations
Obs  Etch Rate     Fit   Resid  Std Resid
 11     600.00  643.02  -43.02      -2.06  R
R  Large residual
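The coefficients can be recovered by ordinary least squares in a few lines. A sketch, assuming the etch-rate data of the example with Power treated as a continuous predictor:

```python
# Simple linear regression Etch Rate = b0 + b1 * Power, fit by least squares.
# Assumed data: five wafers at each of four RF power settings.
powers = [160] * 5 + [180] * 5 + [200] * 5 + [220] * 5
rates = ([575, 542, 530, 539, 570] + [565, 593, 590, 579, 610]
         + [600, 651, 610, 637, 629] + [725, 700, 715, 685, 710])

n = len(powers)
xbar = sum(powers) / n
ybar = sum(rates) / n
sxx = sum((x - xbar) ** 2 for x in powers)                         # S_xx
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(powers, rates))  # S_xy

b1 = sxy / sxx          # slope
b0 = ybar - b1 * xbar   # intercept
print(round(b0, 2), round(b1, 3))  # 137.62 2.527
```

The fitted value at 200W is 137.62 + 2.527 × 200 = 643.02, so an etch rate of 600.00 at that setting has residual −43.02, which is the flagged unusual observation.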

Practical Interpretation of Results – Regression Model Computer Results

Practical Interpretation of Results – Regression Model Redo regression by using power and power*power as continuous factors

The Regression Model

Practical Interpretation of Results – Regression Model

Practical Interpretation of Results – Comparison of Means The analysis of variance tests the hypothesis of equal treatment means Assume that residual analysis is satisfactory If that hypothesis is rejected, we don’t know which specific means are different Determining which specific means differ following an ANOVA is called the multiple comparisons problem

Practical Interpretation of Results – Comparison of Means There are lots of ways to do this We will use pairwise t-tests on means…sometimes called Fisher's Least Significant Difference (or Fisher's LSD) Method

Practical Interpretation of Results – Comparison of Means Fisher Pairwise Comparisons

Grouping Information Using the Fisher LSD Method and 95% Confidence
Power  N    Mean  Grouping
220    5  707.00  A
200    5  625.40  B
180    5  587.40  C
160    5  551.20  D
Means that do not share a letter are significantly different.
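A sketch of the computation behind this grouping (data assumed from the etch-rate example; t0.025,16 = 2.120 hard-coded from a t table): two means differ significantly when |ȳi. − ȳj.| exceeds LSD = tα/2, N−a √(2MSE/n).

```python
import itertools
import math

# Assumed etch-rate data: four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
a, n = len(data), 5
N = a * n
means = {p: sum(ys) / n for p, ys in data.items()}
mse = sum((y - means[p]) ** 2 for p, ys in data.items() for y in ys) / (N - a)

t = 2.120                                  # t_{0.025, 16}
lsd = t * math.sqrt(2 * mse / n)           # least significant difference
differs = {(p, q): abs(means[p] - means[q]) > lsd
           for p, q in itertools.combinations(sorted(data), 2)}
print(round(lsd, 2), all(differs.values()))
```

LSD ≈ 24.49, and every pairwise difference (the smallest is 36.2, between 160W and 180W) exceeds it; that is why each level gets its own letter.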

Practical Interpretation of Results – Comparison of Means

Practical Interpretation of Results – Graphical Comparison of Means

Practical Interpretation of Results – Contrasts A contrast is a linear combination of parameters, Γ = Σi ci μi, with coefficients summing to zero, Σi ci = 0 So the hypotheses become H0: Γ = 0 versus H1: Γ ≠ 0

Practical Interpretation of Results – Contrasts Examples

Practical Interpretation of Results – Contrasts Testing with a t-test Contrast average: C = Σi ci ȳi. Contrast variance: V(C) = (σ2/n) Σi ci2 Test statistic: t0 = C / √((MSE/n) Σi ci2), compared with tα/2, N−a
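A sketch of this t-test for one contrast, H0: μ160 = μ180, i.e. coefficients c = (1, −1, 0, 0) (data assumed from the etch-rate example):

```python
import math

# Assumed etch-rate data: four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
a, n = len(data), 5
N = a * n
means = {p: sum(ys) / n for p, ys in data.items()}
mse = sum((y - means[p]) ** 2 for p, ys in data.items() for y in ys) / (N - a)

c = {160: 1, 180: -1, 200: 0, 220: 0}   # contrast coefficients, sum to zero
C = sum(c[p] * means[p] for p in data)  # contrast of treatment averages
t0 = C / math.sqrt((mse / n) * sum(v * v for v in c.values()))
ss_c = n * C ** 2 / sum(v * v for v in c.values())  # single-df contrast SS
print(round(C, 1), round(t0, 2), round(ss_c, 1))
```

Here C = −36.2, t0 ≈ −3.13, and SSC ≈ 3276.1; note that t0² ≈ 9.82, the F-value this contrast receives in the Minitab regression output later in the slides.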

Practical Interpretation of Results – Contrasts Testing with an F-test Test statistic: F0 = t02 = MSC/MSE, where the single-degree-of-freedom contrast sum of squares is SSC = n(Σi ci ȳi.)2 / Σi ci2

Practical Interpretation of Results – Contrasts Testing with an F-test Reject the hypothesis H0: Γ = 0 if F0 > Fα, 1, N−a

Practical Interpretation of Results – Contrasts A 100(1 − α)% confidence interval for the contrast Γ = Σi ci μi: Σi ci ȳi. ± tα/2, N−a √((MSE/n) Σi ci2)

Practical Interpretation of Results – Contrasts Standardized contrast Standardize the contrasts when more than one contrast is of interest, dividing the coefficients by √((1/n) Σi ci2) so that all contrasts are on a common scale

Practical Interpretation of Results – Contrasts Unequal sample sizes The contrast must satisfy Σi ni ci = 0, and the t-statistic becomes t0 = Σi ci ȳi. / √(MSE Σi ci2/ni)

Practical Interpretation of Results – Contrasts Unequal sample sizes Contrast sum of squares: SSC = (Σi ci ȳi.)2 / Σi (ci2/ni)

Practical Interpretation of Results – Orthogonal Contrasts Two contrasts with coefficients {ci} and {di} are orthogonal contrasts if Σi ci di = 0 or, for an unbalanced design, if Σi ni ci di = 0

Practical Interpretation of Results – Orthogonal Contrasts Why use orthogonal contrasts? For a treatments, a set of a − 1 orthogonal contrasts partitions the sum of squares due to treatments into a − 1 independent single-degree-of-freedom components; thus, tests performed on orthogonal contrasts are independent
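A sketch of this partition for the etch-rate example (data assumed), using the three mutually orthogonal contrasts μ1 = μ2, μ1 + μ2 = μ3 + μ4, and μ3 = μ4; their single-df sums of squares add up exactly to SSTreatments:

```python
# Orthogonal contrasts partition SS_Treatments into a-1 single-df pieces.
# Assumed etch-rate data: four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
levels = sorted(data)            # [160, 180, 200, 220]
a, n = len(data), 5
means = [sum(data[p]) / n for p in levels]
grand = sum(means) / a           # balanced design, so the mean of means
ss_treat = n * sum((m - grand) ** 2 for m in means)

contrasts = [
    (1, -1, 0, 0),    # mu1 = mu2
    (1, 1, -1, -1),   # mu1 + mu2 = mu3 + mu4
    (0, 0, 1, -1),    # mu3 = mu4
]

def ss_contrast(c):
    C = sum(ci * m for ci, m in zip(c, means))
    return n * C ** 2 / sum(ci * ci for ci in c)

parts = [ss_contrast(c) for c in contrasts]
print([round(s, 1) for s in parts], round(sum(parts), 1))
```

The three components are about 3276, 46948, and 16646, matching the contrast sums of squares in the Minitab regression output below, and their total equals the Power SS, 66871.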

Practical Interpretation of Results – Orthogonal Contrasts example

Practical Interpretation of Results – Orthogonal Contrasts Example for contrast

Practical Interpretation of Results – Scheffe’s method for comparing all contrasts Comparing any and all possible contrasts between treatment means Suppose that a set of m contrasts of interest in the treatment means, Γu = c1u μ1 + c2u μ2 + … + cau μa, u = 1, 2, …, m, has been determined The corresponding contrast in the treatment averages is Cu = c1u ȳ1. + c2u ȳ2. + … + cau ȳa.

Practical Interpretation of Results – Scheffe’s method for comparing all contrasts The standard error of this contrast is SCu = √(MSE Σi ciu2/ni) The critical value against which Cu should be compared is Sα,u = SCu √((a − 1) Fα, a−1, N−a) If |Cu| > Sα,u, the hypothesis that contrast Γu equals zero is rejected.

Practical Interpretation of Results – Scheffe’s method for comparing all contrasts The simultaneous confidence intervals with type I error α are Cu − Sα,u ≤ Γu ≤ Cu + Sα,u

Practical Interpretation of Results – example for Scheffe’s method Contrast of interest The numerical values of these contrasts are

Practical Interpretation of Results – example for Scheffe’s method Standard errors and one percent critical values are computed as above Since |C1| > S0.01,1 and |C2| > S0.01,2, both contrast hypotheses should be rejected.

Practical Interpretation of Results – Minitab example Orthogonal contrasts: Contrast 1: μ1 = μ2 Contrast 2: μ1 + μ2 = μ3 + μ4 Contrast 3: μ3 = μ4

Practical Interpretation of Results – Minitab example Coding “power” into 3 orthogonal contrast columns in Minitab (Data -> Code -> Numeric to Numeric):

contrast1  contrast2  contrast3
        0         -1          1
        0         -1         -1
        1          1          0
       -1          1          0

Practical Interpretation of Results – example for Scheffe’s method

Practical Interpretation of Results – example for Scheffe’s method Stat Regressionregressionfit regression Choose contrast1, 2, 3 as factors Do not choose “power” as factor because “power” has been separated by contrast 1, 2 and 3.

Practical Interpretation of Results – example for Scheffe’s method

Regression Analysis: Etch Rate versus contrast1, contrast2, contrast3

Analysis of Variance
Source       DF  Adj SS   Adj MS  F-Value  P-Value
Regression    3   66871  22290.2    66.80    0.000
  contrast1   1    3276   3276.1     9.82    0.006
  contrast2   1   46948  46948.0   140.69    0.000
  contrast3   1   16646  16646.4    49.88    0.000
Error        16    5339    333.7
Total        19   72210

Model Summary
      S    R-sq  R-sq(adj)  R-sq(pred)
18.2675  92.61%     91.22%      88.45%

Practical Interpretation of Results –comparing pairs of treatment means Tukey’s test Fisher’s Least Significant Difference (LSD) method Hsu’s MCB method

Practical Interpretation of Results –comparing pairs of treatment means—Computer output

One-way ANOVA: Etch Rate versus Power

Source  DF     SS     MS      F      P
Power    3  66871  22290  66.80  0.000
Error   16   5339    334
Total   19  72210

S = 18.27  R-Sq = 92.61%  R-Sq(adj) = 91.22%

Individual 95% CIs For Mean Based on Pooled StDev
Level  N    Mean  StDev
160    5  551.20  20.02
180    5  587.40  16.74
200    5  625.40  20.53
220    5  707.00  15.25
Pooled StDev = 18.27
(character graph of the individual 95% CIs, axis 550 to 700)

Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Grouping Information Using Tukey Method
Power  N    Mean  Grouping
220    5  707.00  A
200    5  625.40  B
180    5  587.40  C
160    5  551.20  D
Means that do not share a letter are significantly different.

Tukey 95% Simultaneous Confidence Intervals
All Pairwise Comparisons among Levels of Power
Individual confidence level = 98.87%

Power = 160 subtracted from:
Power   Lower  Center   Upper
180      3.11   36.20   69.29
200     41.11   74.20  107.29
220    122.71  155.80  188.89

Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Power = 180 subtracted from:
Power  Lower  Center   Upper
200     4.91   38.00   71.09
220    86.51  119.60  152.69

Power = 200 subtracted from:
Power  Lower  Center   Upper
220    48.51   81.60  114.69

Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Hsu's MCB (Multiple Comparisons with the Best)
Family error rate = 0.05
Critical value = 2.23
Intervals for level mean minus largest of other level means
Level    Lower   Center   Upper
160    -181.53  -155.80    0.00
180    -145.33  -119.60    0.00
200    -107.33   -81.60    0.00
220       0.00    81.60  107.33

Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Grouping Information Using Fisher Method
Power  N    Mean  Grouping
220    5  707.00  A
200    5  625.40  B
180    5  587.40  C
160    5  551.20  D
Means that do not share a letter are significantly different.

Fisher 95% Individual Confidence Intervals
All Pairwise Comparisons among Levels of Power
Simultaneous confidence level = 81.11%

Power = 160 subtracted from:
Power   Lower  Center   Upper
180     11.71   36.20   60.69
200     49.71   74.20   98.69
220    131.31  155.80  180.29

Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Power = 180 subtracted from:
Power  Lower  Center   Upper
200    13.51   38.00   62.49
220    95.11  119.60  144.09

Power = 200 subtracted from:
Power  Lower  Center   Upper
220    57.11   81.60  106.09

Practical Interpretation of Results –comparing treatment means with a control Dunnett’s method Control: the treatment against which the others are compared In total, a − 1 comparisons are made

Dunnett's comparisons with a control
Family error rate = 0.05
Individual error rate = 0.0196
Critical value = 2.59
Control = level (220) of Power
Intervals for treatment mean minus control mean
Level    Lower   Center    Upper
160    -185.75  -155.80  -125.85
180    -149.55  -119.60   -89.65
200    -111.55   -81.60   -51.65

Determining Sample size -- Minitab Stat -> Power and Sample Size -> One-Way ANOVA Number of levels: 4 Sample size: (left blank, to be computed) Maximum difference: 75 Power value: 0.9 SD: 25

One-way ANOVA
Alpha = 0.01  Assumed standard deviation = 25  Factors: 1  Number of levels: 4

Maximum Difference  Sample Size  Target Power  Actual Power
                75            6           0.9      0.915384

The sample size is for each level.

Dispersion Effects ANOVA for location effects Different factor level affects variability dispersion effects Example Average and standard deviation are measured for a response variable.

Dispersion Effects ANOVA found no location effects Transform the standard deviation to y = ln(s) [table of y = ln(s) for observations 1 to 6 under the control and each algorithm]

Dispersion Effects ANOVA found dispersion effects

One-way ANOVA: y = ln(s) versus Algorithm

Source     DF      SS      MS      F      P
Algorithm   3  6.1661  2.0554  21.96  0.000
Error      20  1.8716  0.0936
Total      23  8.0377

S = 0.3059  R-Sq = 76.71%  R-Sq(adj) = 73.22%

Regression and ANOVA

Regression Analysis: Etch Rate versus Power

The regression equation is Etch Rate = 138 + 2.53 Power

Predictor    Coef  SE Coef      T      P
Constant   137.62    41.21   3.34  0.004
Power      2.5270   0.2154  11.73  0.000

S = 21.5413  R-Sq = 88.4%  R-Sq(adj) = 87.8%

Analysis of Variance
Source          DF     SS     MS       F      P
Regression       1  63857  63857  137.62  0.000
Residual Error  18   8352    464
Total           19  72210

Unusual Observations
Obs  Power  Etch Rate     Fit  SE Fit  Residual  St Resid
 11    200     600.00  643.02    5.28    -43.02    -2.06R
R denotes an observation with a large standardized residual.

Nonparametric methods in the ANOVA When the normality assumption is invalid, use the Kruskal-Wallis test Rank the observations yij in ascending order Replace each observation by its rank, Rij In case of ties, assign the average rank to each tied observation Test statistic: H = (1/S2)[Σi Ri.2/ni − N(N + 1)2/4]

Nonparametric methods in the ANOVA Kruskal-Wallis test where Ri. is the sum of the ranks in the ith treatment and S2 = [1/(N − 1)][Σi Σj Rij2 − N(N + 1)2/4] With no ties, S2 = N(N + 1)/12 and the statistic simplifies to H = [12/(N(N + 1))] Σi Ri.2/ni − 3(N + 1)

Nonparametric methods in the ANOVA Kruskal-Wallis test If H > χ2α, a−1, the null hypothesis is rejected.
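A sketch of the statistic with average ranks assigned to ties (data assumed from the etch-rate example, where two observations are tied at 610):

```python
from collections import defaultdict

# Assumed etch-rate data: four RF power levels, five wafers each.
data = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}
N = sum(len(ys) for ys in data.values())

# Average rank for each distinct value (ties share the mean of their positions).
positions = defaultdict(list)
for i, v in enumerate(sorted(v for ys in data.values() for v in ys), start=1):
    positions[v].append(i)
rank = {v: sum(ps) / len(ps) for v, ps in positions.items()}

rank_sum = {p: sum(rank[v] for v in ys) for p, ys in data.items()}

# Tie-corrected statistic: H = (1/S^2) * (sum_i R_i.^2 / n_i - N(N+1)^2/4)
s2 = (sum(rank[v] ** 2 for ys in data.values() for v in ys)
      - N * (N + 1) ** 2 / 4) / (N - 1)
h = (sum(r ** 2 / len(data[p]) for p, r in rank_sum.items())
     - N * (N + 1) ** 2 / 4) / s2
print(round(h, 2))
```

This reproduces the tie-adjusted H = 16.91 in the Minitab output below; since H exceeds χ²0.05,3 = 7.81, the null hypothesis is rejected, agreeing with the parametric F test.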

Nonparametric methods in the ANOVA

Kruskal-Wallis Test: Etch Rate versus Power

Kruskal-Wallis Test on Etch Rate
Power    N  Median  Ave Rank      Z
160      5   542.0       3.4  -3.10
180      5   590.0       7.9  -1.13
200      5   629.0      12.7   0.96
220      5   710.0      18.0   3.27
Overall  20              10.5

H = 16.89  DF = 3  P = 0.001
H = 16.91  DF = 3  P = 0.001 (adjusted for ties)