Today: Chapter 9. Assignment: 9.2, 9.4, 9.42 (Geo(p) = "geometric distribution"), 9-R9(a,b). Recommended Questions: 9.1, 9.8, 9.20, 9.23, 9.25.

Presentation transcript:

Today: Chapter 9
Assignment: 9.2, 9.4, 9.42 (Geo(p) = "geometric distribution"), 9-R9(a,b)
Recommended Questions: 9.1, 9.8, 9.20, 9.23, 9.25

Estimation
The sample mean and sample variance can be used to estimate the population mean and variance, respectively.
How do we estimate parameters in general?
We will consider two procedures:
– Method of moments
– Maximum likelihood
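For reference, the usual sample estimates are the standard textbook formulas (not shown on the slide):
\[
\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i,
\qquad
S^2 = \frac{1}{n-1}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2 .
\]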

Method of Moments
Suppose X = (X_1, X_2, …, X_n) is a random sample from a population.
Suppose the distribution of interest has k parameters.
The procedure for obtaining the k estimators has 3 steps (a general sketch follows below):
– Compute the first k population moments (the first moment is the mean, the second central moment is the variance, …).
– Set the sample estimates of these moments equal to the corresponding population moments.
– Solve for the population parameters.
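A minimal sketch of the resulting system, in generic notation (not taken from the slides): equate the first k sample moments to their model counterparts,
\[
\frac{1}{n}\sum_{i=1}^{n} X_i^{\,j} \;=\; \mathrm{E}\left[X^{j};\,\theta_1,\dots,\theta_k\right],
\qquad j = 1,\dots,k,
\]
and solve the k equations for the k unknown parameters \(\theta_1,\dots,\theta_k\).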

Example
Suppose X = (X_1, X_2, …, X_n) is a random sample from a population.
Suppose the population is Poisson.
Find the method of moments estimator for the rate parameter.
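A worked sketch of this example: for the Poisson(λ) distribution the first moment is E[X] = λ, so the single moment equation gives
\[
\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i = \lambda
\quad\Longrightarrow\quad
\hat{\lambda}_{\mathrm{MM}} = \bar{X}.
\]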

Example
Suppose X = (X_1, X_2, …, X_n) is a random sample from a population with a given one-parameter pdf.
The mean and variance of X, in terms of the parameter, are given on the slide.
Find the method of moments estimator for the parameter.
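The pdf on the original slide was an embedded image and is not reproduced here; purely as an assumed illustration, take f(x; θ) = θ x^{θ−1} for 0 < x < 1, θ > 0. Then E[X] = θ/(θ+1), and the moment equation gives
\[
\bar{X} = \frac{\theta}{\theta+1}
\quad\Longrightarrow\quad
\hat{\theta}_{\mathrm{MM}} = \frac{\bar{X}}{1-\bar{X}}.
\]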

Example
Suppose X = (X_1, X_2, …, X_n) is a random sample from a population with a given two-parameter pdf.
The mean and variance of X, in terms of the parameters, are given on the slide.
Find the method of moments estimators for the parameters.
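Again, as an assumed illustration only (the slide's pdf is not reproduced), take X ~ Gamma(α, β) with mean αβ and variance αβ². Equating the sample mean and sample variance to these gives
\[
\bar{X} = \alpha\beta,
\qquad
\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2 = \alpha\beta^2
\quad\Longrightarrow\quad
\hat{\beta} = \frac{\hat{\sigma}^2}{\bar{X}},
\qquad
\hat{\alpha} = \frac{\bar{X}^2}{\hat{\sigma}^2}.
\]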

Maximum Likelihood
Suppose X = (X_1, X_2, …, X_n) is a random sample from a Ber(p) population.
What is the distribution of the count of the number of successes?
What is the likelihood for the data?
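A sketch of the standard answers (the slide leaves them for the lecture): the count of successes Y = X_1 + … + X_n has a Binomial(n, p) distribution, and the likelihood of the observed sample x_1, …, x_n is
\[
L(p) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i}
     = p^{\sum x_i}\,(1-p)^{\,n-\sum x_i}.
\]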

Example
Suppose X = (X_1, X_2, …, X_{10}) is a random sample from a Ber(p) population.
Suppose 6 successes are observed.
What is the likelihood for the experiment?
If p = 0.2, what is the probability of observing these data?
If p = 0.5, what is the probability of observing these data?
If p = 0.6, what is the probability of observing these data?
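A small computational sketch (the numeric values below are not from the slides): treating the number of successes as Binomial(10, p), the probability of the observed data is C(10, 6) p^6 (1 − p)^4, which can be compared across the candidate values of p.

from math import comb

n, y = 10, 6  # 10 Bernoulli trials, 6 observed successes

def prob_of_data(p):
    # Binomial probability of exactly y successes in n trials
    return comb(n, y) * p**y * (1 - p)**(n - y)

for p in (0.2, 0.5, 0.6):
    print(f"p = {p}: P(Y = 6) = {prob_of_data(p):.4f}")

# p = 0.2: P(Y = 6) = 0.0055
# p = 0.5: P(Y = 6) = 0.2051
# p = 0.6: P(Y = 6) = 0.2508

The largest value occurs at p = 0.6, the sample proportion 6/10, which previews the maximum likelihood idea on the next slide.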

Maximum Likelihood Estimators
Maximum likelihood estimators are those that result in the largest likelihood for the observed data.
More specifically, a maximum likelihood estimator (MLE) is the parameter value that maximizes the likelihood function (stated formally below).
Since the log transformation is monotonically increasing, any value that maximizes the likelihood also maximizes the log-likelihood.
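In symbols (a standard statement of the definition, not copied from the slide):
\[
\hat{\theta} = \arg\max_{\theta} L(\theta;\,x_1,\dots,x_n)
             = \arg\max_{\theta} \ell(\theta;\,x_1,\dots,x_n),
\qquad
\ell(\theta) = \log L(\theta).
\]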

Example
Suppose X = (X_1, X_2, …, X_n) is a random sample from a population.
Suppose the population is Poisson.
Find the MLE for the rate parameter.
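A worked sketch: for Poisson(λ) the log-likelihood and its maximizer are
\[
\ell(\lambda) = \Big(\sum_{i=1}^{n} x_i\Big)\log\lambda - n\lambda - \sum_{i=1}^{n}\log(x_i!),
\qquad
\frac{d\ell}{d\lambda} = \frac{\sum_i x_i}{\lambda} - n = 0
\;\Longrightarrow\;
\hat{\lambda}_{\mathrm{MLE}} = \bar{x},
\]
which agrees with the method of moments estimator in this case.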

Example
Suppose X = (X_1, X_2, …, X_n) is a random sample from a population.
Suppose the population has a given pdf f(x; θ).
Find the MLE for θ.
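The slide's pdf was an embedded image; as an assumed illustration, take the same family used in the earlier method-of-moments sketch, f(x; θ) = θ x^{θ−1} for 0 < x < 1, θ > 0. Then
\[
\ell(\theta) = n\log\theta + (\theta-1)\sum_{i=1}^{n}\log x_i,
\qquad
\frac{d\ell}{d\theta} = \frac{n}{\theta} + \sum_{i=1}^{n}\log x_i = 0
\;\Longrightarrow\;
\hat{\theta}_{\mathrm{MLE}} = -\,\frac{n}{\sum_{i=1}^{n}\log x_i},
\]
which is positive because each log x_i < 0.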

Example
Suppose X = (X_1, X_2, …, X_n) is a random sample from a normal population, N(μ, σ²).
Find the MLEs for μ and σ².
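A sketch of the standard derivation (the algebra is left to the lecture): the log-likelihood is
\[
\ell(\mu,\sigma^2) = -\frac{n}{2}\log\!\left(2\pi\sigma^2\right)
                     - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\left(x_i-\mu\right)^2 ,
\]
and setting the partial derivatives with respect to μ and σ² to zero gives
\[
\hat{\mu} = \bar{x},
\qquad
\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2 .
\]
Note that the MLE of σ² divides by n rather than n − 1, so it is not the usual unbiased sample variance.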
