STA 291 Summer 2010, Lecture 13 (Dustin Lueker)

Inferential statistical methods provide predictions about characteristics of a population, based on information in a sample from that population.
- Quantitative variables
  - Usually estimate the population mean (e.g., mean household income)
- Qualitative variables
  - Usually estimate population proportions (e.g., the proportion of people voting for candidate A)

Point Estimate
- A single number that is the best guess for the parameter
  - The sample mean is usually a good guess for the population mean
Interval Estimate
- A point estimator with an error bound: a range of numbers around the point estimate
  - Gives an idea about the precision of the estimator
  - Example: the proportion of people voting for A is between 67% and 73%

A point estimator of a parameter is a sample statistic that predicts the value of that parameter.
A good estimator is:
- Unbiased: centered around the true parameter
- Consistent: gets closer to the true parameter as the sample size gets larger
- Efficient: has a standard error that is as small as possible (it makes use of all available information)

A biased estimator systematically underestimates or overestimates the population parameter.
- The definitions of the sample variance and the sample standard deviation use n-1 instead of n because this makes the estimator unbiased
- With n in the denominator, the sample variance would systematically underestimate the population variance
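
A quick simulation makes the n versus n-1 distinction concrete. This is an illustrative sketch added to these notes (not from the original slides); the population, sample size, and number of replications are arbitrary choices.

import random
import statistics

random.seed(1)
population_sd = 10              # true population standard deviation
true_variance = population_sd ** 2
n = 5                           # small samples make the bias easy to see
replications = 100_000

biased, unbiased = [], []
for _ in range(replications):
    sample = [random.gauss(0, population_sd) for _ in range(n)]
    m = sum(sample) / n
    ss = sum((x - m) ** 2 for x in sample)
    biased.append(ss / n)            # divides by n: biased estimator
    unbiased.append(ss / (n - 1))    # divides by n-1: unbiased estimator

print("true variance:       ", true_variance)                          # 100
print("average of ss/n:     ", round(statistics.mean(biased), 1))      # about 80, systematically low
print("average of ss/(n-1): ", round(statistics.mean(unbiased), 1))    # about 100

With n = 5, dividing by n underestimates the variance by a factor of (n-1)/n = 4/5 on average, which is exactly what the simulation shows.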

An estimator is unbiased if its sampling distribution is centered around the true parameter.
- For example, we know that the mean of the sampling distribution of the sample mean x̄ equals μ, the true population mean
  - So x̄ is an unbiased estimator of μ
- Note: for any particular sample, the sample mean may be smaller or greater than the population mean
- Unbiased means that there is no systematic underestimation or overestimation
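
A short sanity check on this claim (again an illustrative sketch with arbitrary numbers): individual sample means scatter around μ, but their long-run average is very close to μ.

import random

random.seed(2)
mu, sigma, n = 50, 10, 25
sample_means = []
for _ in range(50_000):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    sample_means.append(sum(sample) / n)

print(sum(sample_means) / len(sample_means))   # very close to mu = 50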

An estimator is efficient if its standard error is small compared to other estimators.
- Such an estimator has high precision
A good estimator has a small standard error and small bias (or no bias at all).
- The following pictures represent different estimators with different bias and efficiency
- Assume that the true population parameter is the point (0,0) in the middle of the picture
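
The pictures themselves did not survive the transcription, but efficiency can also be illustrated numerically. A classical example (my illustration, not taken from the slides) compares the sample mean and the sample median as estimators of the center of a normal population: both are unbiased, but the mean has the smaller standard error and is therefore the more efficient of the two.

import random
import statistics

random.seed(3)
mu, sigma, n = 0, 1, 25
means, medians = [], []
for _ in range(20_000):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

# Both estimators are centered near mu = 0, but the median varies more from sample to sample.
print("SD of sample means:  ", round(statistics.stdev(means), 3))    # about 0.20 (= sigma / sqrt(n))
print("SD of sample medians:", round(statistics.stdev(medians), 3))  # about 0.25 (roughly 25% larger)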

Note that even an unbiased and efficient estimator does not always hit the population parameter exactly. But in the long run, it is the best estimator.

Inferential statements about a parameter should always provide the accuracy of the estimate.
- How close is the estimate likely to fall to the true parameter value? Within 1 unit? 2 units? 10 units?
- This can be determined using the sampling distribution of the estimator/sample statistic
- In particular, we need the standard error to make a statement about the accuracy of the estimator

Confidence interval: a range of numbers that is likely to cover (or capture) the true parameter.
The probability that the confidence interval captures the true parameter is called the confidence coefficient or, more commonly, the confidence level.
- The confidence level is a chosen number close to 1, usually 0.90, 0.95, or 0.99
- Level of significance = α = 1 - confidence level

To calculate the confidence interval, we use the Central Limit Theorem.
- We substitute the sample standard deviation s for the population standard deviation σ
- We also need a value z_{α/2} that is determined by the confidence level
This gives the formula for a 100(1-α)% confidence interval for μ, shown below.
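
The formula itself appears to have been an image in the original slides and did not survive the transcription. The standard large-sample form being described is:

\[
\bar{x} \;\pm\; z_{\alpha/2}\,\frac{s}{\sqrt{n}}
\]

where x̄ is the sample mean, s is the sample standard deviation, n is the sample size, and z_{α/2} is the standard normal value with upper-tail area α/2.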

90% confidence interval
- Confidence level of 0.90, α = .10, z_{α/2} = 1.645
95% confidence interval
- Confidence level of 0.95, α = .05, z_{α/2} = 1.96
99% confidence interval
- Confidence level of 0.99, α = .01, z_{α/2} = 2.576
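
These cutoffs come from the standard normal distribution. A small sketch reproduces them (this assumes SciPy is available, which the course itself does not require):

from scipy.stats import norm

for conf in (0.90, 0.95, 0.99):
    alpha = 1 - conf
    z = norm.ppf(1 - alpha / 2)   # upper alpha/2 critical value of the standard normal
    print(f"{conf:.0%} confidence: z = {z:.3f}")
# prints 1.645, 1.960, and 2.576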

This interval will contain μ with 100(1-α)% confidence.
- If we are estimating μ, it is unreasonable to assume that we know σ
  - Thus we replace σ by s (the sample standard deviation)
- This formula is used for a large sample size (n ≥ 30)
  - If the sample size is less than 30, a different distribution, the t-distribution, is used; we will get to this later
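
Putting the pieces together, a minimal large-sample interval calculator might look like the sketch below. The function name and signature are my own for illustration; they are not part of the course materials.

import math

# z values for the common confidence levels listed above
Z_VALUES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def z_confidence_interval(xbar, s, n, confidence=0.95):
    """Large-sample (n >= 30) confidence interval for the population mean."""
    z = Z_VALUES[confidence]
    margin = z * s / math.sqrt(n)
    return xbar - margin, xbar + margin

For instance, z_confidence_interval(7, 12, 36) returns approximately (3.08, 10.92), which is the answer to the exercise on the next slide.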

Example: Compute a 95% confidence interval for μ if we know that s = 12 and a sample of size 36 yielded a mean of 7.
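
The slide leaves the computation to the reader; for reference, the arithmetic works out as follows:

\[
\bar{x} \pm z_{\alpha/2}\,\frac{s}{\sqrt{n}}
\;=\; 7 \pm 1.96 \cdot \frac{12}{\sqrt{36}}
\;=\; 7 \pm 3.92
\;=\; (3.08,\ 10.92)
\]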

"Probability" means that, in the long run, 100(1-α)% of the intervals will contain the parameter.
- If repeated samples were taken and a confidence interval calculated from each, then 100(1-α)% of those intervals would contain the parameter
- For any one sample, we do not know whether its confidence interval contains the parameter
- The 100(1-α)% probability refers only to the method being used
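
This long-run interpretation can be checked by simulation. The sketch below (illustrative only; the population and sample size are arbitrary) draws many samples from a known population, builds a 95% interval from each, and counts how often the interval captures the true mean:

import math
import random

random.seed(4)
mu, sigma, n = 100, 15, 36
trials = 10_000
covered = 0
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(sample) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in sample) / (n - 1))
    margin = 1.96 * s / math.sqrt(n)
    covered += (xbar - margin <= mu <= xbar + margin)

print(covered / trials)   # close to 0.95 (slightly below, since s stands in for sigma)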


Incorrect statement:
- "With 95% probability, the population mean will fall in the interval from 3.5 to 5.2."
To avoid the misleading word "probability," we say that we are "confident":
- "We are 95% confident that the true population mean will fall between 3.5 and 5.2."

Changing our confidence level will change our confidence interval.
- Increasing the confidence level increases the length of the confidence interval
  - A confidence level of 100% would require a confidence interval of infinite length, which is not informative
- There is a tradeoff between length and accuracy
  - Ideally we would like a short interval with high accuracy (a high confidence level); see the numerical illustration below
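
To see the tradeoff numerically, the following sketch (assuming SciPy, and using the arbitrary values s = 12 and n = 36) prints the half-width of the interval at several confidence levels:

import math
from scipy.stats import norm

s, n = 12, 36
for conf in (0.80, 0.90, 0.95, 0.99, 0.999):
    z = norm.ppf(1 - (1 - conf) / 2)
    print(f"{conf:.1%} confidence -> half-width {z * s / math.sqrt(n):.2f}")
# The half-width keeps growing as the confidence level approaches 100%.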

Start with the confidence interval formula, assuming that the population standard deviation σ is known.
Mathematically, we need to solve that formula for n; the algebra is sketched below.
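
The equation referred to here also did not survive the transcription. Under the usual setup, the margin of error is E = z_{α/2} σ / √n, and solving for n gives:

\[
E = z_{\alpha/2}\,\frac{\sigma}{\sqrt{n}}
\quad\Longrightarrow\quad
\sqrt{n} = \frac{z_{\alpha/2}\,\sigma}{E}
\quad\Longrightarrow\quad
n = \left(\frac{z_{\alpha/2}\,\sigma}{E}\right)^{2}
\]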

Example: About how large a sample would have been adequate if we merely needed to estimate the mean to within 0.75, with 95% confidence? Assume s = 5.
Note: We always round the sample size up to ensure that we stay within the desired error bound.
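
The computation itself is not shown on the slide. Using the formula above, with z_{α/2} = 1.96, E = 0.75, and s = 5 standing in for σ:

\[
n = \left(\frac{1.96 \times 5}{0.75}\right)^{2} \approx (13.07)^{2} \approx 170.7
\quad\Longrightarrow\quad n = 171 \text{ after rounding up}
\]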