MAE 552 Heuristic Optimization Instructor: John Eddy Lecture #20 3/10/02 Taguchi’s Orthogonal Arrays

ANOVA Now that we have a predicted optimum observation value for our experiment, we should compare it to the value actually observed. (We talked about performing the optimum experiment in a previous lecture.) If the predicted value is close to the actual value, we can say that the additive model is sufficient to describe our system, and vice versa: a large discrepancy indicates that we probably have interactions among our variables.

ANOVA So how close is close enough? To determine this, we have to find the variance of the prediction error. The prediction error is the difference between the predicted and observed optimum values. It has 2 independent components: –That caused by the error in the estimates of m, mA1, and mB1. –That caused by repetition error in the verification experiment.

ANOVA These two components are independent of one another, and therefore the variance of the prediction error is the sum of the variances of these two error components. To compute the component of the variance due to the first term, we must find the equivalent sample size n0. It is given as follows: 1/n0 = 1/n + (1/nA1 − 1/n) + (1/nB1 − 1/n) Where: nA1 is the replication number of level A1, nB1 is the replication number of level B1, and n is the total number of experiments in the matrix experiment.
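Below is a minimal sketch of this equivalent-sample-size calculation in Python, assuming the formula above; the inputs (a 9-run experiment with 3 replications each of A1 and B1) are placeholder values chosen for illustration, not the ones from the lecture's matrix experiment.

# Minimal sketch of the equivalent-sample-size calculation above.
# The inputs (n=9, n_A1=3, n_B1=3) are placeholder values for illustration,
# not the ones from the lecture's matrix experiment.
def equivalent_sample_size(n, n_A1, n_B1):
    """Return n0 from 1/n0 = 1/n + (1/n_A1 - 1/n) + (1/n_B1 - 1/n)."""
    inv_n0 = 1.0 / n + (1.0 / n_A1 - 1.0 / n) + (1.0 / n_B1 - 1.0 / n)
    return 1.0 / inv_n0

print(equivalent_sample_size(n=9, n_A1=3, n_B1=3))  # 1.8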

ANOVA The first component's contribution will then be σe²/n0, where σe² is the error variance obtained from the ANOVA. The second component is a function of the number of times that we repeat the verification experiment, nr (we called it the replication error). It is given by σe²/nr.

ANOVA So our final calculation (for our experiment) is the sum of the two contributions: Var(prediction error) = σe²/n0 + σe²/nr. You should verify that in our example this comes out to a value of 80.6 (dB)².

ANOVA So our corresponding 2σ confidence limits are ±17.96 dB. A prediction error outside these limits is a strong indication that the additive model was not appropriate and that there likely exist some factor interactions. A final note on prediction error: it applies not only to our optimum experiment but to any experiment we wish to try that is not part of our matrix experiment.
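As a quick arithmetic check, the sketch below converts the 80.6 (dB)² prediction-error variance quoted above into the corresponding 2σ limits.

import math

# Check of the confidence limits above: take the 80.6 (dB)^2 prediction-error
# variance from the lecture, convert it to a standard deviation, and double it.
var_pred = 80.6                    # variance of prediction error, (dB)^2
sigma_pred = math.sqrt(var_pred)   # about 8.98 dB
two_sigma = 2.0 * sigma_pred       # about 17.96 dB
print(f"2-sigma limits: +/- {two_sigma:.2f} dB")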

Interactions Control Factor Interactions. –Up to this point, we have been operating under the assumption that an additive model will suit us; thus we used the variables-separable approach. –So what happens if our ANOVA tells us that the additive model is not a good approximation of our process? –Let's start by defining a factor interaction.

Interactions Interaction: When the effect of changing 2 or more factors simultaneously is not the simple sum of the effects of changing them one at a time, there exists an interaction between some or all of these variables. Consider the following set of figures.

Interactions Notice that the change in performance when varying factor A through each of its 3 levels is the same regardless of the setting of factor B. In this case, there is said to be no interaction between factors A and B.

Interactions Here, the lines are not parallel but the direction of improvement is consistent. In this case, our additive model will give useful results although our predicted outcomes will be inaccurate.

Interactions Here, the lines are not parallel and the direction of improvement is inconsistent. In this case, our additive model will give unpredictable results. This is the least desirable situation.

Interactions The presence of an interaction necessitates the use of a cross-product term to describe the variation in η (the S/N ratio). These cross products are treated like additional factors in a matrix experiment, which increases the needed dimensionality and likewise the required number of experiments. Therefore, it is desirable to eliminate interactions whenever possible.
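As an aside, here is a minimal sketch of what a cross-product column looks like, assuming two 2-level factors coded as -1/+1 (a common coding convention, not one taken from this lecture).

# Sketch of a cross-product (interaction) column for two 2-level factors
# coded as -1/+1. The coding convention is an assumption for illustration.
import numpy as np

A = np.array([-1, -1, +1, +1])   # factor A column of a small 2^2 design
B = np.array([-1, +1, -1, +1])   # factor B column
AB = A * B                       # interaction column, treated like another factor
print(AB)                        # [ 1 -1 -1  1]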

Interactions Keep in mind that the mere presence of an interaction does not mean that we must study it. Usually, an engineer is able to judge which factor interactions will be important and which will not, based on his/her knowledge of the problem. An analysis note: the number of degrees of freedom for an interaction is equal to the product of the degrees of freedom of the factors involved. For example, the interaction of a 2-level factor (1 degree of freedom) with a 3-level factor (2 degrees of freedom) has 1 × 2 = 2 degrees of freedom.

Experiment Planning The question now is: “Do we have all the tools we need to properly fit an OA to our design task?” Let’s consider this question with our new knowledge about handling factor interactions.

Experiment Planning What information do we need? –The number of factors. –The number of levels for each factor. –The number of interactions to be studied. Let’s consider an example.

Experiment Planning We have a problem that consists of the following: –A single 2-level factor (A). –5 three-level factors (B-F). –The two-factor interaction between A and B. What matrix should we use?

Experiment Planning Let’s look at the L18 (2^1 × 3^7) array. This array is set up to handle 1 factor with 2 levels and 7 factors with 3 levels each. Clearly, our experiment will fit in here, but what do we do with the extra column? Can we ignore it and still maintain orthogonality? Is it wise to ignore it? Could it be put to better use?
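One way to sanity-check the fit is to count degrees of freedom, as in the sketch below. The counting rule used (levels minus one per factor, the product of factor degrees of freedom for an interaction, plus one for the overall mean, compared against the number of rows) is standard design-of-experiments bookkeeping and an assumption here, not something stated on this slide.

# Degrees-of-freedom bookkeeping for the example above: one 2-level factor A,
# five 3-level factors B-F, and the A x B interaction.
factor_levels = {"A": 2, "B": 3, "C": 3, "D": 3, "E": 3, "F": 3}

dof_factors = sum(levels - 1 for levels in factor_levels.values())     # 1 + 5*2 = 11
dof_interaction = (factor_levels["A"] - 1) * (factor_levels["B"] - 1)  # 1 * 2 = 2
dof_needed = 1 + dof_factors + dof_interaction                         # mean + factors + A x B = 14

runs_L18 = 18
print(dof_needed, runs_L18, dof_needed <= runs_L18)   # 14 18 True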

Experiment Planning Let’s consider another example such as the following: –3 factors with 3 levels each (A-C) –1 factor with 2 levels (D) –0 interactions.

Experiment Planning This is very similar to our CVD example, right? So could we use the L9 array? If we do, what do we do about the fact that factor D has only 2 levels? The answer is that we can still use the array; there are techniques that allow us to fill in the missing levels sensibly.

Experiment Planning Dummy Level Technique –The user can fit a factor that has m levels into a column that accommodates n levels (where n > m). For example, a 2-level factor A can be assigned to a 3-level column by setting A3 = A1 or A3 = A2; deciding which is up to the designer. The effect of doing this is that the experiment becomes unbalanced: the level chosen as the dummy (A1 or A2) will in fact be estimated to a greater precision than the other level.
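To make this concrete, here is a minimal sketch of the dummy level technique applied to the earlier example, assuming a standard L9 (3^4) listing and assigning the 2-level factor D to the last column with level 3 reused as level 1; both of those choices are illustrative assumptions, not part of the lecture.

# Sketch of the dummy level technique: fit a 2-level factor (D) into a
# 3-level column of an L9 array by reusing level 1 wherever level 3 appears.
# The L9 listing and the column assignment are assumptions for illustration.
import numpy as np

L9 = np.array([
    [1, 1, 1, 1],
    [1, 2, 2, 2],
    [1, 3, 3, 3],
    [2, 1, 2, 3],
    [2, 2, 3, 1],
    [2, 3, 1, 2],
    [3, 1, 3, 2],
    [3, 2, 1, 3],
    [3, 3, 2, 1],
])

D = L9[:, 3].copy()
D[D == 3] = 1                 # dummy level: D3 = D1, so D now uses only levels 1 and 2
print(D)                      # [1 2 1 1 1 2 2 1 1]
print(np.bincount(D)[1:])     # [6 3]: level D1 appears 6 times, D2 appears 3 times

The unequal counts show the unbalance mentioned above: the duplicated level is observed more often and is therefore estimated with greater precision.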