
1 Last Time Central Limit Theorem –Illustrations –How large n? –Normal Approximation to Binomial Statistical Inference –Estimate unknown parameters –Unbiasedness (centered correctly) –Standard error (measures spread)

2 Administrative Matters Midterm II, coming Tuesday, April 6

3 Administrative Matters Midterm II, coming Tuesday, April 6 Numerical answers: –No computers, no calculators

4 Administrative Matters Midterm II, coming Tuesday, April 6 Numerical answers: –No computers, no calculators –Handwrite Excel formulas (e.g. =9+4^2) –Don’t do arithmetic (e.g. use such formulas)

5 Administrative Matters Midterm II, coming Tuesday, April 6 Numerical answers: –No computers, no calculators –Handwrite Excel formulas (e.g. =9+4^2) –Don’t do arithmetic (e.g. use such formulas) Bring with you: –One 8.5 x 11 inch sheet of paper

6 Administrative Matters Midterm II, coming Tuesday, April 6 Numerical answers: –No computers, no calculators –Handwrite Excel formulas (e.g. =9+4^2) –Don’t do arithmetic (e.g. use such formulas) Bring with you: –One 8.5 x 11 inch sheet of paper –With your favorite info (formulas, Excel, etc.)

7 Administrative Matters Midterm II, coming Tuesday, April 6 Numerical answers: –No computers, no calculators –Handwrite Excel formulas (e.g. =9+4^2) –Don’t do arithmetic (e.g. use such formulas) Bring with you: –One 8.5 x 11 inch sheet of paper –With your favorite info (formulas, Excel, etc.) Course in Concepts, not Memorization

8 Administrative Matters Midterm II, coming Tuesday, April 6 Material Covered: HW 6 – HW 10

9 Administrative Matters Midterm II, coming Tuesday, April 6 Material Covered: HW 6 – HW 10 –Note: due Thursday, April 2

10 Administrative Matters Midterm II, coming Tuesday, April 6 Material Covered: HW 6 – HW 10 –Note: due Thursday, April 2 –Will ask grader to return Mon. April 5 –Can pickup in my office (Hanes 352)

11 Administrative Matters Midterm II, coming Tuesday, April 6 Material Covered: HW 6 – HW 10 –Note: due Thursday, April 2 –Will ask grader to return Mon. April 5 –Can pickup in my office (Hanes 352) –So today’s HW not included

12 Administrative Matters Extra Office Hours before Midterm II Monday, Apr. 23 8:00 – 10:00 Monday, Apr. 23 11:00 – 2:00 Tuesday, Apr. 24 8:00 – 10:00 Tuesday, Apr. 24 1:00 – 2:00 (usual office hours)

13 Study Suggestions 1.Work an Old Exam a)On Blackboard b)Course Information Section

14 Study Suggestions 1.Work an Old Exam a)On Blackboard b)Course Information Section c)Afterwards, check against given solutions

15 Study Suggestions 1.Work an Old Exam a)On Blackboard b)Course Information Section c)Afterwards, check against given solutions 2.Rework HW problems

16 Study Suggestions 1.Work an Old Exam a)On Blackboard b)Course Information Section c)Afterwards, check against given solutions 2.Rework HW problems a)Print Assignment sheets b)Choose problems in “random” order

17 Study Suggestions 1.Work an Old Exam a)On Blackboard b)Course Information Section c)Afterwards, check against given solutions 2.Rework HW problems a)Print Assignment sheets b)Choose problems in “random” order c)Rework (don’t just “look over”)

18 Reading In Textbook Approximate Reading for Today’s Material: Pages 356-369, 487-497 Approximate Reading for Next Class: Pages 498-501, 418-422, 372-390

19 Law of Averages Case 2: any random sample. CAN SHOW, for n "large": x̄ is "roughly" N(μ, σ/√n) Terminology: –"Law of Averages, Part 2" –"Central Limit Theorem" (widely used name)

20 Central Limit Theorem Illustration: Rice Univ. Applet http://www.ruf.rice.edu/~lane/stat_sim/sampling_dist/index.html Starting Distribut’n user input (very non-Normal) Dist’n of average of n = 25 (seems very mound shaped?)

21 Extreme Case of CLT Consequences: X ~ Bi(n, p) is roughly N(np, √(np(1−p))) Terminology: Called The Normal Approximation to the Binomial

22 Normal Approx. to Binomial How large n? Bigger is better Could use “n ≥ 30” rule from above Law of Averages But clearly depends on p Textbook Rule: OK when {np ≥ 10 & n(1-p) ≥ 10}
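As a quick illustration, the textbook rule can be checked in Excel along these lines (the cell layout is hypothetical, with n in B1 and p in B2):

=IF(AND(B1*B2>=10, B1*(1-B2)>=10), "Normal approximation OK", "n too small for this p")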

23 Statistical Inference Idea: Develop formal framework for handling unknowns p & μ e.g. 1: Political Polls e.g. 2a: Population Modeling e.g. 2b: Measurement Error

24 Statistical Inference A parameter is a numerical feature of population, not sample An estimate of a parameter is some function of data (hopefully close to parameter)

25 Statistical Inference Standard Error: for an unbiased estimator, standard error is its standard deviation Notes: –For SE of p̂, since don't know p, use sensible estimate p̂: SE ≈ √(p̂(1−p̂)/n) –For SE of x̄, use sensible estimate s for σ: SE ≈ s/√n
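A hedged Excel sketch of these two standard errors (hypothetical layout: count X in B1, sample size n in B2, sample s.d. s in B3):

=SQRT((B1/B2)*(1-B1/B2)/B2)      SE of the sample proportion p̂
=B3/SQRT(B2)                     SE of the sample mean x̄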

26 Statistical Inference Another view: Form conclusions by

27 Statistical Inference Another view: Form conclusions by quantifying uncertainty

28 Statistical Inference Another view: Form conclusions by quantifying uncertainty (will study several approaches, first is…)

29 Confidence Intervals Background:

30 Confidence Intervals Background: The sample mean, x̄, is an "estimate" of the population mean, μ

31 Confidence Intervals Background: The sample mean, x̄, is an "estimate" of the population mean, μ How accurate?

32 Confidence Intervals Background: The sample mean, x̄, is an "estimate" of the population mean, μ How accurate? (there is "variability", how much?)

33 Confidence Intervals Idea: Since a point estimate (e.g. x̄ or p̂)

34 Confidence Intervals Idea: Since a point estimate is never exactly right (in particular, essentially never exactly equal to the parameter)

35 Confidence Intervals Idea: Since a point estimate is never exactly right give a reasonable range of likely values (range also gives feeling for accuracy of estimation)

37 Confidence Intervals E.g.

38 Confidence Intervals E.g. with σ known

39 Confidence Intervals E.g. with σ known Think: measurement error

40 Confidence Intervals E.g. with σ known Think: measurement error Each measurement is Normal

41 Confidence Intervals E.g. with σ known Think: measurement error Each measurement is Normal Known accuracy (maybe)

42 Confidence Intervals E.g. with σ known Think: population modeling

43 Confidence Intervals E.g. with σ known Think: population modeling Normal population

44 Confidence Intervals E.g. with σ known Think: population modeling Normal population Known s.d. (a stretch, really need to improve)

45 Confidence Intervals E.g. with σ known Recall the Sampling Distribution: x̄ ~ N(μ, σ/√n)

46 Confidence Intervals E.g. with σ known Recall the Sampling Distribution: x̄ ~ N(μ, σ/√n) (recall have this even when data not normal, by Central Limit Theorem)

47 Confidence Intervals E.g. with σ known Recall the Sampling Distribution: x̄ ~ N(μ, σ/√n) Use it to analyze variation

48 Confidence Intervals Understand error as: (normal density quantifies randomness in x̄)

49 Confidence Intervals Understand error as: (distribution centered at μ)

50 Confidence Intervals Understand error as: (spread: s.d. = σ/√n)

51 Confidence Intervals Understand error as: How to explain to untrained consumers?

52 Confidence Intervals Understand error as: How to explain to untrained consumers? (who don’t know randomness, distributions, normal curves)

53 Confidence Intervals Approach: present an interval

54 Confidence Intervals Approach: present an interval With endpoints: Estimate +- margin of error

55 Confidence Intervals Approach: present an interval With endpoints: Estimate +- margin of error I.e. x̄ ± m

56 Confidence Intervals Approach: present an interval With endpoints: Estimate +- margin of error I.e. x̄ ± m, with m reflecting variability

58 Confidence Intervals Approach: present an interval With endpoints: Estimate +- margin of error I.e. x̄ ± m, with m reflecting variability How to choose m?

59 Confidence Intervals Choice of Confidence Interval Radius

60 Confidence Intervals Choice of Confidence Interval Radius, i.e. margin of error, m

61 Confidence Intervals Choice of Confidence Interval Radius, i.e. margin of error, m: Notes: No Absolute Range (i.e. including "everything") is available

62 Confidence Intervals Choice of Confidence Interval Radius, i.e. margin of error, m: Notes: No Absolute Range (i.e. including "everything") is available From infinite tail of normal dist'n

63 Confidence Intervals Choice of Confidence Interval Radius, i.e. margin of error, m: Notes: No Absolute Range (i.e. including "everything") is available From infinite tail of normal dist'n So need to specify desired accuracy

64 Confidence Intervals Choice of margin of error, m

65 Confidence Intervals Choice of margin of error, m: Approach: Choose a Confidence Level

66 Confidence Intervals Choice of margin of error, m: Approach: Choose a Confidence Level Often 0.95

67 Confidence Intervals Choice of margin of error, m: Approach: Choose a Confidence Level Often 0.95 (e.g. FDA likes this number for approving new drugs, and it is a common standard for publication in many fields)

68 Confidence Intervals Choice of margin of error, m: Approach: Choose a Confidence Level Often 0.95 (e.g. FDA likes this number for approving new drugs, and it is a common standard for publication in many fields) And take margin of error to include that part of the sampling distribution

69 Confidence Intervals E.g. For confidence level 0.95, want 0.95 = Area

70 Confidence Intervals E.g. For confidence level 0.95, want, for the sampling distribution of x̄: 0.95 = Area within m of the center

71 Confidence Intervals E.g. For confidence level 0.95, want, for the sampling distribution of x̄: 0.95 = Area within m of the center, where m = margin of error

72 Confidence Intervals Computation: Recall NORMINV

73 Confidence Intervals Computation: Recall NORMINV takes areas (probs)

74 Confidence Intervals Computation: Recall NORMINV takes areas (probs), and returns cutoffs

75 Confidence Intervals Computation: Recall NORMINV takes areas (probs), and returns cutoffs Issue: NORMINV works with lower areas

76 Confidence Intervals Computation: Recall NORMINV takes areas (probs), and returns cutoffs Issue: NORMINV works with lower areas Note: lower tail included

77 Confidence Intervals So adapt needed probs to lower areas….

78 Confidence Intervals So adapt needed probs to lower areas…. When inner area = 0.95,

79 Confidence Intervals So adapt needed probs to lower areas…. When inner area = 0.95, Right tail = 0.025

80 Confidence Intervals So adapt needed probs to lower areas…. When inner area = 0.95, Right tail = 0.025 Shaded Area = 0.975

81 Confidence Intervals So adapt needed probs to lower areas…. When inner area = 0.95, Right tail = 0.025 Shaded Area = 0.975 So need to compute the cutoff using lower area 0.975
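For instance, the standard Normal version of this cutoff in Excel (this call is only an illustration of the idea):

=NORMINV(0.975, 0, 1)        returns approximately 1.96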

82 Confidence Intervals Need to compute: m = NORMINV(0.975, μ, σ/√n) − μ

83 Confidence Intervals Need to compute: m = NORMINV(0.975, μ, σ/√n) − μ Major problem: μ is unknown

84 Confidence Intervals Need to compute: m = NORMINV(0.975, μ, σ/√n) − μ Major problem: μ is unknown But should answer depend on μ?

85 Confidence Intervals Need to compute: m = NORMINV(0.975, μ, σ/√n) − μ Major problem: μ is unknown But should answer depend on μ? "Accuracy" is only about spread

86 Confidence Intervals Need to compute: m = NORMINV(0.975, μ, σ/√n) − μ Major problem: μ is unknown But should answer depend on μ? "Accuracy" is only about spread Not centerpoint

87 Confidence Intervals Need to compute: m = NORMINV(0.975, μ, σ/√n) − μ Major problem: μ is unknown But should answer depend on μ? "Accuracy" is only about spread Not centerpoint Need another view of the problem

88 Confidence Intervals Approach to unknown μ

89 Confidence Intervals Approach to unknown μ: Recenter, i.e. look at dist'n of x̄ − μ

90 Confidence Intervals Approach to unknown μ: Recenter, i.e. look at dist'n of x̄ − μ ~ N(0, σ/√n)

91 Confidence Intervals Approach to unknown μ: Recenter, i.e. look at dist'n of x̄ − μ ~ N(0, σ/√n) Key concept: Centered at 0

92 Confidence Intervals Approach to unknown μ: Recenter, i.e. look at dist'n of x̄ − μ ~ N(0, σ/√n) Key concept: Centered at 0 Now can calculate m as: m = NORMINV(0.975, 0, σ/√n)
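In Excel terms, this margin can be computed along these lines (a minimal sketch, assuming σ is in cell B2 and n in cell B1):

=NORMINV(0.975, 0, B2/SQRT(B1))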

93 Confidence Intervals Computation of: m = NORMINV(0.975, 0, σ/√n)

94 Confidence Intervals Computation of: m = NORMINV(0.975, 0, σ/√n) Smaller Problem: Don't know σ

95 Confidence Intervals Computation of: m = NORMINV(0.975, 0, σ/√n) Smaller Problem: Don't know σ Approach 1: Estimate σ with s (natural approach: use the sample estimate)

96 Confidence Intervals Computation of: m = NORMINV(0.975, 0, σ/√n) Smaller Problem: Don't know σ Approach 1: Estimate σ with s Leads to complications

97 Confidence Intervals Computation of: m = NORMINV(0.975, 0, σ/√n) Smaller Problem: Don't know σ Approach 1: Estimate σ with s Leads to complications Will study later

98 Confidence Intervals Computation of: m = NORMINV(0.975, 0, σ/√n) Smaller Problem: Don't know σ Approach 1: Estimate σ with s Leads to complications Will study later Approach 2: Sometimes know σ

99 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window ~1?

100 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window ~2?

101 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window ~7?

102 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window ~10?

103 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window Early Approach: Use data to choose window width

104 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window Challenge: Not enough info in data for good choice

105 Research Corner How many bumps in stamps data? Kernel Density Estimates Depends on Window Alternate Approach: Scale Space

106 Research Corner Scale Space: Main Idea: Don’t try to choose window width

107 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them

108 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Terminology from Computer Vision

109 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Terminology from Computer Vision (goal: teach computers to “see”)

110 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Terminology from Computer Vision: –Oversmoothing: coarse scale view (zoomed out – macroscopic perception)

111 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Terminology from Computer Vision: –Oversmoothing: coarse scale view –Undersmoothing: fine scale view (zoomed in – microscopic perception)

112 Research Corner Scale Space: View 1: Rainbow colored movie

113 Research Corner Scale Space: View 2: Rainbow colored overlay

114 Research Corner Scale Space: View 3: Rainbow colored surface

115 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Challenge: how to do statistical inference?

116 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Challenge: how to do statistical inference? Which bumps are really there?

117 Research Corner Scale Space: Main Idea: Don’t try to choose window width Instead use all of them Challenge: how to do statistical inference? Which bumps are really there? (i.e. statistically significant)

118 Research Corner Scale Space: Challenge: how to do statistical inference? Which bumps are really there? (i.e. statistically significant)

119 Research Corner Scale Space: Challenge: how to do statistical inference? Which bumps are really there? (i.e. statistically significant) Address this next time

120 Confidence Intervals E.g. Crop researchers plant 15 plots with a new variety of corn.

121 Confidence Intervals E.g. Crop researchers plant 15 plots with a new variety of corn. The yields, in bushels per acre are: 138 139.1 113 132.5 140.7 109.7 118.9 134.8 109.6 127.3 115.6 130.4 130.2 111.7 105.5

122 Confidence Intervals E.g. Crop researchers plant 15 plots with a new variety of corn. The yields, in bushels per acre are: Assume that σ = 10 bushels / acre 138 139.1 113 132.5 140.7 109.7 118.9 134.8 109.6 127.3 115.6 130.4 130.2 111.7 105.5

123 Confidence Intervals E.g. Find: a)The 90% Confidence Interval for the mean value, for this type of corn. b)The 95% Confidence Interval. c)The 99% Confidence Interval. d)How do the CIs change as the confidence level increases? Solution, part 1 of Class Example 11: http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg11.xls

124 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Next study relevant parts of E.g. 11: http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg11.xls

125 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Use Excel

126 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Use Excel Data in C8:C22

127 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Steps: - Sample Size, n

128 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Steps: - Sample Size, n - Average, x̄

129 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Steps: - Sample Size, n - Average, x̄ - S. D., σ

130 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Steps: - Sample Size, n - Average, x̄ - S. D., σ - Margin, m

131 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Steps: - Sample Size, n - Average, x̄ - S. D., σ - Margin, m - CI endpoint, left

132 Confidence Intervals E.g. Find: a)90% Confidence Interval for μ Steps: - Sample Size, n - Average, x̄ - S. D., σ - Margin, m - CI endpoint, left - CI endpoint, right

133 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0]
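Spelled out as Excel formulas, one plausible version of these steps for the corn data (using the C8:C22 layout mentioned above; the 90% level means a lower area of 0.95 in NORMINV):

=COUNT(C8:C22)                                        sample size, n = 15
=AVERAGE(C8:C22)                                      average, about 123.8
=NORMINV(0.95, 0, 10/SQRT(15))                        margin m, about 4.2
=AVERAGE(C8:C22) - NORMINV(0.95, 0, 10/SQRT(15))      left endpoint, about 119.6
=AVERAGE(C8:C22) + NORMINV(0.95, 0, 10/SQRT(15))      right endpoint, about 128.0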

134 Confidence Intervals An EXCEL shortcut: CONFIDENCE

136 Confidence Intervals An EXCEL shortcut: CONFIDENCE Note: same margin of error as before

138 Confidence Intervals An EXCEL shortcut: CONFIDENCE Inputs: Sample Size

139 Confidence Intervals An EXCEL shortcut: CONFIDENCE Inputs: Sample Size S. D.

140 Confidence Intervals An EXCEL shortcut: CONFIDENCE Inputs: Sample Size S. D. Alpha

141 Confidence Intervals An EXCEL shortcut: CONFIDENCE Careful: parameter α

142 Confidence Intervals An EXCEL shortcut: CONFIDENCE Careful: parameter α is: 2 tailed outer area

143 Confidence Intervals An EXCEL shortcut: CONFIDENCE Careful: parameter α is: 2 tailed outer area So for level = 0.90, α = 0.10
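For the corn example this shortcut would look something like the following (again assuming σ = 10 and n = 15, with α = 0.10 for the 90% level):

=CONFIDENCE(0.10, 10, 15)        returns the 90% margin of error, about 4.2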

144 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0]

145 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0] b)95% CI for μ: [118.7, 128.9]

146 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0] b)95% CI for μ: [118.7, 128.9] c)99% CI for μ: [117.1, 130.5]

147 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0] b)95% CI for μ: [118.7, 128.9] c)99% CI for μ: [117.1, 130.5] d)How do the CIs change as the confidence level increases?

148 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0] b)95% CI for μ: [118.7, 128.9] c)99% CI for μ: [117.1, 130.5] d)How do the CIs change as the confidence level increases? –Intervals get longer

149 Confidence Intervals E.g. Find: a)90% CI for μ: [119.6, 128.0] b)95% CI for μ: [118.7, 128.9] c)99% CI for μ: [117.1, 130.5] d)How do the CIs change as the confidence level increases? –Intervals get longer –Reflects higher demand for accuracy

150 Confidence Intervals HW: 6.11 (use Excel to draw curve & shade by hand) 6.13, 6.14 (7.30,7.70, wider) 6.16 (n = 2673, so CLT gives Normal)

151 Choice of Sample Size Additional use of margin of error idea

152 Choice of Sample Size Additional use of margin of error idea Background: sampling distributions of x̄ are wide for small n, narrow for large n

153 Choice of Sample Size Could choose n to make σ/√n = desired value

154 Choice of Sample Size Could choose n to make σ/√n = desired value But S. D. is not very interpretable

155 Choice of Sample Size Could choose n to make σ/√n = desired value But S. D. is not very interpretable, so make "margin of error", m = desired value

156 Choice of Sample Size Could choose n to make σ/√n = desired value But S. D. is not very interpretable, so make "margin of error", m = desired value Then get: "x̄ is within m units of μ, 95% of the time"

157 Choice of Sample Size Given m, how do we find n?

158 Choice of Sample Size Given m, how do we find n? Solve for n (the equation):

159 Choice of Sample Size Given m, how do we find n? Solve for n (the equation): (where is n in this?)

160 Choice of Sample Size Given m, how do we find n? Solve for n (the equation): (use of “standardization”)

161 Choice of Sample Size Given m, how do we find n? Solve for n (the equation):

162 Choice of Sample Size Given m, how do we find n? Solve for n (the equation): m = NORMINV(0.975, 0, 1) · σ/√n [so use NORMINV & Stand. Normal, N(0,1)]

163 Choice of Sample Size Graphically, find m so that: Area = 0.95

164 Choice of Sample Size Graphically, find m so that: Area within m of center = 0.95, i.e. lower Area at the cutoff = 0.975

165 Choice of Sample Size Thus solve: m = NORMINV(0.975, 0, 1) · σ/√n

166 Choice of Sample Size Thus solve: √n = NORMINV(0.975, 0, 1) · σ/m

167 Choice of Sample Size Thus solve: n = (NORMINV(0.975, 0, 1) · σ/m)²

168 Choice of Sample Size n = (NORMINV(0.975, 0, 1) · σ/m)² (put this on list of formulas)

169 Choice of Sample Size Numerical fine points:

170 Choice of Sample Size Numerical fine points: Change the 0.975 cutoff for coverage prob. ≠ 0.95

171 Choice of Sample Size Numerical fine points: Change the 0.975 cutoff for coverage prob. ≠ 0.95 Round decimals upwards, to be "sure of desired coverage"

172 Choice of Sample Size EXCEL Implementation: Class Example 11, Part 2: http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg11.xls

173 Choice of Sample Size Class Example 11, Part 2: Recall: Corn Yield Data

174 Choice of Sample Size Class Example 11, Part 2: Recall: Corn Yield Data Gave x̄ ≈ 123.8

175 Choice of Sample Size Class Example 11, Part 2: Recall: Corn Yield Data Gave x̄ ≈ 123.8 Assumed σ = 10

176 Choice of Sample Size Class Example 11, Part 2: Recall: Corn Yield Data Resulted in margin of error, m ≈ 4.2 (at the 90% level)

177 Choice of Sample Size Class Example 11, Part 2: How large should n be to give smaller (90%) margin of error, say m = 2?

178 Choice of Sample Size Class Example 11, Part 2: How large should n be to give smaller (90%) margin of error, say m = 2? Compute from: n = (NORMINV(0.95, 0, 1) · σ/m)²

179 Choice of Sample Size Class Example 11, Part 2: How large should n be to give smaller (90%) margin of error, say m = 2? Compute from: n = (NORMINV(0.95, 0, 1) · σ/m)² (recall 90% central area, so use 95% cutoff)

184 Choice of Sample Size Class Example 11, Part 2: How large should n be to give smaller (90%) margin of error, say m = 2? Compute from: n = (NORMINV(0.95, 0, 1) · 10/2)² ≈ 67.6 Round up, to be safe in statement

185 Choice of Sample Size Class Example 11, Part 2: Excel Function to round up: CEILING

186 Choice of Sample Size Class Example 11, Part 2: How large should n be to give smaller (90%) margin of error, say m = 2? n = 68
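One way to reproduce this in a single Excel cell (σ = 10, m = 2, 90% level, rounding up to a whole plot):

=CEILING((NORMINV(0.95, 0, 1)*10/2)^2, 1)        returns 68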

187 Choice of Sample Size Now ask for higher confidence level: How large should n be to give smaller (99%) margin of error, say m = 2?

188 Choice of Sample Size Now ask for higher confidence level: How large should n be to give smaller (99%) margin of error, say m = 2? Similar computations: n = 166

189 Choice of Sample Size Now ask for smaller margin: How large should n be to give smaller (99%) margin of error, say m = 0.2?

190 Choice of Sample Size Now ask for smaller margin: How large should n be to give smaller (99%) margin of error, say m = 0.2? Similar computations: n = 16588

191 Choice of Sample Size Now ask for smaller margin: How large should n be to give smaller (99%) margin of error, say m = 0.2? Similar computations: n = 16588 Note: serious round up

192 Choice of Sample Size Now ask for smaller margin: How large should n be to give smaller (99%) margin of error, say m = 0.2? Similar computations: n = 16588 (10 times the accuracy requires 100 times as much data)

193 Choice of Sample Size Now ask for smaller margin: How large should n be to give smaller (99%) margin of error, say m = 0.2? Similar computations: n = 16588 (10 times the accuracy requires 100 times as much data) (Law of Averages: Square Root)

194 Choice of Sample Size HW: 6.29, 6.30 (52), 6.31

195 And now for something completely different…. An interesting advertisement: http://www.albinoblacksheep.com/flash/honda.php

196 C.I.s for proportions Recall: Counts: X ~ Bi(n, p)

197 C.I.s for proportions Recall: Counts: X ~ Bi(n, p) Sample Proportions: p̂ = X/n

198 C.I.s for proportions Calculate prob’s with BINOMDIST

199 C.I.s for proportions Calculate prob’s with BINOMDIST (but C.I.s need inverse of probs)

200 C.I.s for proportions Calculate prob’s with BINOMDIST, but note no BINOMINV
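As a reminder of the direction BINOMDIST works in (the numbers here are purely illustrative):

=BINOMDIST(41, 216, 0.19, TRUE)      gives P(X ≤ 41), a probability from a cutoff — not a cutoff from a probability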

201 C.I.s for proportions Calculate prob’s with BINOMDIST, but note no BINOMINV, so instead use Normal Approximation Recall:

202 Normal Approx. to Binomial Example: from StatsPortal http://courses.bfwpub.com/ips6e.php For Bi(n,p): Control n Control p See Prob. Histo. Compare to fit (by mean & sd) Normal dist’n

203 C.I.s for proportions Recall Normal Approximation to Binomial

204 C.I.s for proportions Recall Normal Approximation to Binomial: For large n

205 C.I.s for proportions Recall Normal Approximation to Binomial: For large n, p̂ is approximately N(p, √(p(1−p)/n))

207 C.I.s for proportions Recall Normal Approximation to Binomial: For large n, p̂ is approximately N(p, √(p(1−p)/n)) So use NORMINV

208 C.I.s for proportions Recall Normal Approximation to Binomial: For large n, p̂ is approximately N(p, √(p(1−p)/n)) So use NORMINV (and often NORMDIST)

209 C.I.s for proportions Main problem: don’t know p

210 C.I.s for proportions Main problem: don’t know p Solution: Depends on context: CIs or hypothesis tests

211 C.I.s for proportions Main problem: don’t know p Solution: Depends on context: CIs or hypothesis tests Different from Normal

212 C.I.s for proportions Main problem: don’t know p Solution: Depends on context: CIs or hypothesis tests Different from Normal, since now mean and sd are linked

213 C.I.s for proportions Main problem: don’t know p Solution: Depends on context: CIs or hypothesis tests Different from Normal, since now mean and sd are linked, with both depending on p

214 C.I.s for proportions Main problem: don’t know p Solution: Depends on context: CIs or hypothesis tests Different from Normal, since now mean and sd are linked, with both depending on p, instead of separate μ & σ.

215 C.I.s for proportions Case 1: Margin of Error and CIs: 95%

216 C.I.s for proportions Case 1: Margin of Error and CIs: 95% central area, so use the 0.975 cutoff

217 C.I.s for proportions Case 1: Margin of Error and CIs: 95% central area, so use the 0.975 cutoff So: m = NORMINV(0.975, 0, 1) · √(p(1−p)/n)

218 C.I.s for proportions Case 1: Margin of Error and CIs: m = NORMINV(0.975, 0, 1) · √(p(1−p)/n)

219 C.I.s for proportions Case 1: Margin of Error and CIs: m = NORMINV(0.975, 0, 1) · √(p(1−p)/n) Continuing problem: Unknown p

220 C.I.s for proportions Case 1: Margin of Error and CIs: m = NORMINV(0.975, 0, 1) · √(p(1−p)/n) Continuing problem: Unknown p Solution 1: "Best Guess"

221 C.I.s for proportions Case 1: Margin of Error and CIs: m = NORMINV(0.975, 0, 1) · √(p(1−p)/n) Continuing problem: Unknown p Solution 1: "Best Guess" Replace p by p̂
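A hedged Excel version of this best-guess margin at the 95% level (hypothetical layout: count X in B1, sample size n in B2):

=NORMINV(0.975, 0, 1)*SQRT((B1/B2)*(1-B1/B2)/B2)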

222 C.I.s for proportions Solution 2: “Conservative”

223 C.I.s for proportions Solution 2: “Conservative” Idea: make sd (and thus m) as large as possible

224 C.I.s for proportions Solution 2: “Conservative” Idea: make sd (and thus m) as large as possible (makes no sense for Normal)

227 C.I.s for proportions Solution 2: "Conservative" Idea: make sd (and thus m) as large as possible (makes no sense for Normal) p(1−p) has zeros at 0 & 1

228 C.I.s for proportions Solution 2: "Conservative" Idea: make sd (and thus m) as large as possible (makes no sense for Normal) p(1−p) has zeros at 0 & 1, max at p = 1/2

229 C.I.s for proportions Solution 2: "Conservative" Can check by calculus: p(1−p) ≤ 1/4, so √(p(1−p)/n) ≤ 1/(2√n)

230 C.I.s for proportions Solution 2: "Conservative" Can check by calculus: p(1−p) ≤ 1/4, so √(p(1−p)/n) ≤ 1/(2√n) Thus use the conservative sd 1/(2√n)

231 C.I.s for proportions Solution 2: "Conservative" Can check by calculus: p(1−p) ≤ 1/4, so √(p(1−p)/n) ≤ 1/(2√n) Thus use the conservative margin m = (cutoff) · 1/(2√n)

232 C.I.s for proportions Example: Old Text Problem 8.8

233 C.I.s for proportions Example: Old Text Problem 8.8 Power companies spend time and money trimming trees to keep branches from falling on lines.

234 C.I.s for proportions Example: Old Text Problem 8.8 Power companies spend time and money trimming trees to keep branches from falling on lines. Chemical treatment can stunt tree growth, but too much may kill the tree.

235 C.I.s for proportions Example: Old Text Problem 8.8 Power companies spend time and money trimming trees to keep branches from falling on lines. Chemical treatment can stunt tree growth, but too much may kill the tree. In an experiment on 216 trees, 41 died.

236 C.I.s for proportions Example: Old Text Problem 8.8 Power companies spend time and money trimming trees to keep branches from falling on lines. Chemical treatment can stunt tree growth, but too much may kill the tree. In an experiment on 216 trees, 41 died. Give a 99% CI for the proportion expected to die from this treatment.

237 C.I.s for proportions Example: Old Text Problem 8.8 Solution: Class example 12, part 1 http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg12.xls

238 C.I.s for proportions Class e.g. 12, part 1 Sample Size, n

239 C.I.s for proportions Class e.g. 12, part 1 Sample Size, n Data Count, X

240 C.I.s for proportions Class e.g. 12, part 1 Sample Size, n Data Count, X Sample Prop., p̂ Check Normal Approximation

242 C.I.s for proportions Class e.g. 12, part 1 Sample Size, n Data Count, X Sample Prop., p̂ Best Guess Margin of Error

244 C.I.s for proportions Class e.g. 12, part 1 Sample Size, n Data Count, X Sample Prop., p̂ Best Guess Margin of Error (Recall 99% level & 2 tails…)

245 C.I.s for proportions Class e.g. 12, part 1 Sample Size, n Data Count, X Sample Prop., p̂ Best Guess Margin of Error Conservative Margin of Error

246 C.I.s for proportions Class e.g. 12, part 1 Best Guess CI: [0.121, 0.259]

247 C.I.s for proportions Class e.g. 12, part 1 Best Guess CI: [0.121, 0.259] Conservative CI: [0.102, 0.277]
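One plausible set of Excel formulas behind these two intervals (99% level, so the 0.995 cutoff; n = 216, X = 41, p̂ = 41/216 ≈ 0.19):

=41/216                                                  sample proportion, about 0.19
=NORMINV(0.995, 0, 1)*SQRT((41/216)*(1-41/216)/216)      best-guess margin, about 0.069
=NORMINV(0.995, 0, 1)*(1/(2*SQRT(216)))                  conservative margin, about 0.088

So the best-guess CI is roughly 0.19 ± 0.069 = [0.121, 0.259] and the conservative CI is roughly 0.19 ± 0.088 = [0.102, 0.277].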

248 C.I.s for proportions Example: Old Text Problem 8.8 Solution: Class example 12, part 1 http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg12.xls Note: Conservative is bigger

249 C.I.s for proportions Example: Old Text Problem 8.8 Solution: Class example 12, part 1 http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg12.xls Note: Conservative is bigger Since p̂ ≈ 0.19 is far from 1/2

250 C.I.s for proportions Example: Old Text Problem 8.8 Solution: Class example 12, part 1 http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg12.xls Note: Conservative is bigger Since p̂ ≈ 0.19 is far from 1/2 Big gap

251 C.I.s for proportions Example: Old Text Problem 8.8 Solution: Class example 12, part 1 http://www.stat-or.unc.edu/webspace/courses/marron/UNCstor155-2009/ClassNotes/Stor155Eg12.xls Note: Conservative is bigger Since p̂ ≈ 0.19 is far from 1/2 Big gap So may pay substantial price for being "safe"

252 C.I.s for proportions HW: 8.7 Do both best-guess and conservative CIs: 8.11, 8.13a, 8.19

