United Stats of America. Unit 7, Chapters 26-27. Jordo, Rob III, Kins and Toph.



Chapter 26. Three types of chi-square tests: 1) Goodness of Fit, 2) Homogeneity, 3) Independence.

Goodness (gracious) of Fit: This test is used when you have one categorical variable from a single population. It determines whether the sample data are consistent with a hypothesized distribution (how "good" the data fit the hypothesis).

Goodness of Fit conditions: ● The sampling method is random. ● The variable under study is categorical (counted data). ● The expected count in each level of the variable is at least 5 (expected cells ≥ 5). Degrees of freedom: df = n - 1, where n = total number of categories.

Goodness of Fit hypotheses: We need a null hypothesis (H0) and an alternative hypothesis (Ha). The hypotheses are mutually exclusive: if one is true, the other must be false, and vice versa. For a chi-square goodness-of-fit test, the hypotheses take the following form. H0: The data are consistent with a specified distribution. Ha: The data are not consistent with a specified distribution.

Goodness of Fit example: Acme Toy Company prints baseball cards. The company claims that 30% of the cards are rookies, 60% are veterans, and 10% are All-Stars. The cards are sold in packages of 100. Suppose a randomly selected package has 50 rookies, 45 veterans, and 5 All-Stars. Is this consistent with Acme's claim? Use a 0.05 level of significance. There is only one categorical variable here, so entering the counts in the calculator and running a χ² GOF test is straightforward.

           Rookies   Veterans   All-Stars
Observed      50        45          5
Expected      30        60         10
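The slides use a TI calculator for this test; as a cross-check, the same goodness-of-fit computation can be sketched in Python with scipy (not part of the original slides, just an illustration):

```python
from scipy import stats

# Counts from the Acme baseball-card example above.
observed = [50, 45, 5]    # rookies, veterans, All-Stars in the sampled pack
expected = [30, 60, 10]   # Acme's claimed 30% / 60% / 10% of 100 cards

chi2, p = stats.chisquare(f_obs=observed, f_exp=expected)
print(chi2, p)  # chi-square ≈ 19.58 with df = 2; p is well below 0.05
```

Since p < 0.05, we reject H0: the package is not consistent with Acme's claimed distribution.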

Homogeneity: This test is used for one categorical variable from two (or more) populations. It determines whether the distribution of that variable is the same across the different populations.

Homogeneity conditions: ● Random ● Categorical ● Expected count in each cell ≥ 5

Homogeneity hypotheses: H0: The distribution of the categorical variable is the same in each population. Ha: The distributions differ in at least one population.

Homogeneity example: In a study of the television viewing habits of children, a developmental psychologist selects a random sample of 300 first graders: 100 boys and 200 girls. Each child is asked which of the following TV programs they like best: The Lone Ranger, Sesame Street, or The Simpsons. The results form a contingency table with rows for boys and girls, columns for the three programs, and row and column totals. Do the boys' preferences for these TV programs differ significantly from the girls' preferences? Use a 0.05 level of significance.
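The individual cell counts from this slide did not survive transcription, so the sketch below uses made-up counts purely to show the mechanics of running a chi-square test on a two-way table with scipy (the function and the numbers are illustrative, not the slide's actual data):

```python
from scipy import stats

# Hypothetical counts: rows = boys, girls;
# columns = Lone Ranger, Sesame Street, The Simpsons.
table = [[50, 30, 20],
         [50, 80, 70]]

chi2, p, df, expected = stats.chi2_contingency(table)
print(df)        # (rows - 1)(cols - 1) = (2 - 1)(3 - 1) = 2
print(p < 0.05)  # whether to reject "same distribution for boys and girls"
```

The same function serves both homogeneity and independence tests; the difference is in how the data were collected, not in the arithmetic.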

(Declaration of) Independence ● We use the independence test to determine whether two categorical variables from a single population are related; note that it tests association, not causation. ● H0 will always be that X is INDEPENDENT of Y. ● Ha will always be that X is DEPENDENT on Y.

Independence example:

         Yes   No   Total
Male      2     6     8
Female    4     8    12
Total     6    14    20

● Degrees of freedom = (number of rows - 1)(number of columns - 1). For this table: (2 - 1)(2 - 1) = 1. ● We must check that every expected cell count is at least 5. For the Male/Yes cell, expected = (8 × 6) / 20 = 2.4, which is less than 5, so this example fails the condition and the test should not be run.
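The expected-count formula (row total × column total ÷ grand total) can be checked cell by cell; here is a plain-Python sketch applied to the 2×2 table above:

```python
# Observed counts from the table above.
observed = [[2, 6],    # Male: yes, no
            [4, 8]]    # Female: yes, no

row_totals = [sum(row) for row in observed]        # [8, 12]
col_totals = [sum(col) for col in zip(*observed)]  # [6, 14]
grand = sum(row_totals)                            # 20

# Expected count for each cell = (row total * column total) / grand total.
expected = [[r * c / grand for c in col_totals] for r in row_totals]
print(expected[0][0])  # 8 * 6 / 20 = 2.4 -> below 5, so the condition fails
```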

Independence conditions: - Categorical - Counted - Expected counts ≥ 5 - Random - Independent observations

Chapter 27 Regression Analysis

●We use regression analysis to determine if a linear relationship exists between two quantitative variables. ●Chapter 27 is a throwback to earlier chapters ○Chapter 8 - Scatterplots H0: β1 = 0 (the slope is equal to 0, meaning there is no linear relationship) HA: β1 ≠ 0 (the slope is not equal to 0, so there is a linear relationship)

Conditions ●Straight enough (linear) ●Quantitative data ●Residual plot shows no pattern ●Random ●Nearly normal residuals ●No outliers

Example: how to build the regression equation from computer output. ●The row labeled "Constant" (or the name of the y-variable) gives the y-intercept (β0). ●The other row, usually labeled with the name of the x-variable, gives the slope (β1). The fitted equation has the form ŷ = b0 + b1(x). *****Make sure you talk about the slope and the r-squared in context*****

Making an inference: Since we are testing β1, which is the slope, we look at the P-value for the x-variable row. ●P = 0.000 ●Conclusion: We have enough evidence to reject the null hypothesis, and can conclude that there is a linear relationship between the two variables.

Confidence intervals: The equation for a confidence interval for the slope is b1 ± t*(SE), where b1 is the sample slope and SE is its standard error. We'll do a 95% confidence interval, so we need to find the t* critical value using an inverse-t function on the calculator. With t* = 2.1 and SE = 0.3842, the interval is b1 ± 2.1(0.3842). Conclusion: We can be 95% confident that the true slope of the relationship between the two variables lies within this interval. *****Degrees of freedom are always n - 2*****
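The slide's actual data and output values were not preserved, so here is a sketch of the whole slope-inference workflow (test statistic, P-value, and confidence interval) in Python with scipy, using made-up (x, y) data purely for illustration:

```python
from scipy import stats

# Hypothetical data with a roughly linear trend (not the slides' data).
x = [2, 4, 5, 7, 8, 10, 12, 13]
y = [3.1, 4.0, 4.9, 6.2, 6.8, 8.1, 9.5, 9.9]

res = stats.linregress(x, y)          # slope b1, intercept b0, p-value, SE

# 95% CI for the slope: b1 +/- t* x SE, with df = n - 2.
n = len(x)
t_star = stats.t.ppf(0.975, df=n - 2)
lower = res.slope - t_star * res.stderr
upper = res.slope + t_star * res.stderr
print(res.slope, res.pvalue, (lower, upper))
```

A small P-value here plays the same role as P = 0.000 in the slide's computer output: reject H0: β1 = 0 and conclude there is a linear relationship.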