Another way to select sample size: the sample R-squared method

The F test is based on the increase in explained variation, expressed as a fraction of the variation that the reduced model does not explain. Call this proportion a.

For any given sample size, the bigger a is, the bigger F becomes. For any a ≠ 0, F increases as a function of n. So you can get a large F either from strong results and a small sample, or from weak results and a large sample.
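In R-squared terms, the statistic being described is the usual full-versus-reduced-model F. A sketch of the standard formula, with p coefficients in the full model, r of them tested, and the symbols R²_F and R²_R (full and reduced model R-squared) supplied here rather than taken from the slides:

```latex
a = \frac{R^2_F - R^2_R}{1 - R^2_R},
\qquad
F = \frac{(R^2_F - R^2_R)/r}{(1 - R^2_F)/(n-p)}
  = \frac{n-p}{r} \cdot \frac{a}{1-a}
```

Both monotonicity claims follow directly from the last form: for fixed n, F is increasing in a; for fixed a > 0, F is increasing in n.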

The sample variation method is to choose a value of a that is just large enough to be interesting, then increase n, calculating F and its p-value each time, until p < 0.05; then stop. The final value of n is the smallest sample size for which an effect explaining that much of the remaining variation will be significant. With that sample size, the effect will be significant if and only if it explains proportion a or more of the remaining variation. That's all there is to it. You tell me a proportion of remaining variation that you want to be statistically significant, and I'll tell you a sample size.
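The stopping rule above is easy to automate. A minimal sketch, assuming scipy is available; the function name, argument names, and the default alpha = 0.05 are choices made here, not taken from the slides:

```python
from scipy import stats

def sample_size_for_effect(a, p, r, alpha=0.05):
    """Smallest n for which an effect explaining proportion `a` of the
    remaining variation is significant at level `alpha`.

    a     -- proportion of remaining variation explained by the effect
    p     -- number of regression coefficients in the full model
    r     -- number of coefficients being tested (numerator df)
    """
    assert 0 < a < 1
    n = p + 1                                 # need n - p >= 1 for the error df
    while True:
        F = (n - p) / r * a / (1 - a)         # F in terms of a (full vs. reduced)
        if stats.f.sf(F, r, n - p) < alpha:   # sf gives the upper-tail p-value
            return n
        n += 1
```

For instance, `sample_size_for_effect(0.10, 26, 6)` gives the required n for the 2x3x4 design in the example below, if a BxC effect explaining a tenth of the remaining variation is the smallest one worth detecting.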

Example: Suppose we are planning a 2x3x4 analysis of covariance, with two covariates and factors named A, B and C. We set it up as a regression model, with one dummy variable for A, two for B, and three for C. Interactions are represented by product terms: 1x2 = 2 products for the AxB interaction, 1x3 = 3 for AxC, 2x3 = 6 for BxC, and 1x2x3 = 6 for AxBxC. These regression coefficients, plus two for the covariates and one for the intercept, give us p = 26. The null hypothesis is that of no BxC interaction, so r = 6. The "other effects in the model" for which we are "controlling" are represented by the 2 covariates and the 17 remaining dummy variables and products of dummy variables.
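The parameter count in this example can be checked mechanically: each factor with k levels contributes k − 1 dummy variables, and each interaction contributes the product of the counts for the factors involved. A short sketch of that bookkeeping (the variable names are illustrative):

```python
# Count the regression coefficients for the 2x3x4 ANCOVA described above.
levels = {"A": 2, "B": 3, "C": 4}
d = {f: k - 1 for f, k in levels.items()}     # dummy variables per factor

main = sum(d.values())                        # 1 + 2 + 3 = 6
two_way = (d["A"] * d["B"] +                  # AxB: 2
           d["A"] * d["C"] +                  # AxC: 3
           d["B"] * d["C"])                   # BxC: 6
three_way = d["A"] * d["B"] * d["C"]          # AxBxC: 6
covariates, intercept = 2, 1

p = intercept + covariates + main + two_way + three_way
r = d["B"] * d["C"]                           # coefficients tested for "no BxC"
other = main + two_way + three_way - r        # dummies/products controlled for

print(p, r, other)  # 26 6 17
```

Running it reproduces the numbers in the text: p = 26 coefficients in the full model, r = 6 for the BxC null hypothesis, and 17 other dummy variables and products (plus the 2 covariates) being controlled for.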