Estimation Maximum Likelihood Estimates Industrial Engineering

Discrete Case: Suppose we have hypothesized a discrete distribution from which our data arise, one with some unknown parameter θ. Let p(x; θ) denote the probability mass function for this distribution. The likelihood function is L(θ) = p(x₁; θ)·p(x₂; θ)⋯p(xₙ; θ).

Discrete Case: Since L(θ) is just the joint probability of the observed data, we want to choose the estimate θ̂ that maximizes this joint probability mass function: L(θ̂) ≥ L(θ) for all possible θ.
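
The slides do not pin the discrete case to a particular distribution, so as one possible illustration (the Poisson sample below is an assumption, not from the slides) the following sketch evaluates L(θ) on a grid of candidate values and picks the maximizer.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical discrete sample; the slides do not give one.
x = np.array([2, 0, 3, 1, 2, 4, 1, 2])

# Candidate values of the unknown parameter theta (the Poisson mean).
thetas = np.linspace(0.1, 6.0, 600)

# L(theta) = product of the pmf values p(x_i; theta) over the sample.
likelihood = np.array([poisson.pmf(x, t).prod() for t in thetas])

theta_hat = thetas[likelihood.argmax()]
print(f"grid MLE: {theta_hat:.3f}, sample mean: {x.mean():.3f}")  # Poisson MLE is x-bar
```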

Continuous Case: Suppose we have a set of nine observations x₁, x₂, …, x₉ whose underlying distribution is exponential (in this case with rate parameter λ = 2.0): 0.053, 0.112, 0.178, 0.255, 0.347, 0.458, 0.602, 0.805, 1.151.

Continuous Case: Our objective is to estimate the true but unknown parameter λ from these observations. The likelihood function is L(λ) = f(x₁; λ)·f(x₂; λ)⋯f(xₙ; λ).

MLE (Exponential): L(λ) = f(x₁; λ)·f(x₂; λ)⋯f(xₙ; λ) = λe^(−λx₁) · λe^(−λx₂) ⋯ λe^(−λxₙ) = λⁿ e^(−λ Σxᵢ).
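
As a quick numerical check (a sketch using the nine observations above; the evaluation point λ = 2.0 is just the stated true value), the product of the individual densities and the simplified form λⁿ e^(−λ Σxᵢ) agree:

```python
import numpy as np

x = np.array([0.053, 0.112, 0.178, 0.255, 0.347,
              0.458, 0.602, 0.805, 1.151])  # the nine observations
lam = 2.0

# Product of the individual exponential densities f(x_i; lambda) = lambda * exp(-lambda * x_i)
L_product = np.prod(lam * np.exp(-lam * x))

# Simplified closed form: lambda^n * exp(-lambda * sum(x_i))
L_closed = lam**len(x) * np.exp(-lam * x.sum())

print(L_product, L_closed)  # identical up to floating-point rounding
```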

MLE (Exponential): [plot of the likelihood function L(λ) for the nine observations against candidate values of λ]

MLE (Exponential): We can use the plot to graphically solve for the best estimate of λ. Alternatively, we can find the maximum analytically by using calculus. Specifically, we set ∂L(λ)/∂λ = 0.
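
A minimal plotting sketch of this graphical approach, assuming numpy and matplotlib are available:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([0.053, 0.112, 0.178, 0.255, 0.347,
              0.458, 0.602, 0.805, 1.151])
n = len(x)

lam = np.linspace(0.1, 6.0, 500)
L = lam**n * np.exp(-lam * x.sum())   # likelihood L(lambda) for the exponential model

plt.plot(lam, L)
plt.axvline(lam[L.argmax()], linestyle="--")   # graphical estimate of lambda-hat
plt.xlabel("lambda")
plt.ylabel("L(lambda)")
plt.title("Exponential likelihood for the nine observations")
plt.show()

print("graphical estimate:", lam[L.argmax()])
```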

Log Likelihood: The natural log is a monotonically increasing function. Consequently, maximizing the log of the likelihood function is the same as maximizing the likelihood function itself. Define Ln(θ) = ln L(θ).

MLE (Exponential): L(λ) = f(x₁; λ)·f(x₂; λ)⋯f(xₙ; λ) = λⁿ e^(−λ Σxᵢ), so the log likelihood is Ln(λ) = n ln(λ) − λ Σxᵢ.
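
To see the convenience of the log in practice (a sketch, assuming numpy), the log likelihood n ln(λ) − λ Σxᵢ peaks at the same grid point as L(λ) itself, since ln is monotone:

```python
import numpy as np

x = np.array([0.053, 0.112, 0.178, 0.255, 0.347,
              0.458, 0.602, 0.805, 1.151])
n = len(x)

lam = np.linspace(0.1, 6.0, 500)
L = lam**n * np.exp(-lam * x.sum())      # likelihood
Ln = n * np.log(lam) - lam * x.sum()     # log likelihood

# Both are maximized at the same grid point, since ln is monotone increasing.
print(lam[L.argmax()], lam[Ln.argmax()])
```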

MLE (Exponential): Ln(λ) = n ln(λ) − λ Σxᵢ, so setting the derivative to zero gives ∂Ln(λ)/∂λ = n/λ − Σxᵢ = 0.
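
The first-order condition can also be checked symbolically; this sketch (assuming sympy, with S standing in for Σxᵢ) differentiates the log likelihood and solves for λ:

```python
import sympy as sp

lam, n, S = sp.symbols("lambda n S", positive=True)  # S stands for sum of the x_i

log_lik = n * sp.log(lam) - lam * S       # Ln(lambda) = n ln(lambda) - lambda * sum(x_i)
score = sp.diff(log_lik, lam)             # dLn/dlambda = n/lambda - S

print(score)                              # n/lambda - S
print(sp.solve(sp.Eq(score, 0), lam))     # [n/S], i.e. lambda-hat = n / sum(x_i)
```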

MLE (Exponential): From n/λ − Σxᵢ = 0 we get n/λ = Σxᵢ, so λ̂ = n / Σxᵢ.

MLE (Exponential): Equivalently, λ̂ = 1/x̄.

  i    xᵢ
  1   0.053
  2   0.112
  3   0.178
  4   0.255
  5   0.347
  6   0.458
  7   0.602
  8   0.805
  9   1.151

Sum = 3.961, x̄ = 0.440, so λ̂ = 1/0.440 ≈ 2.27.
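
Working the same numbers directly in code (no assumptions beyond the listed data):

```python
x = [0.053, 0.112, 0.178, 0.255, 0.347, 0.458, 0.602, 0.805, 1.151]

total = sum(x)            # 3.961
x_bar = total / len(x)    # 0.440
lam_hat = 1.0 / x_bar     # MLE of the exponential rate, n / sum(x_i)

print(f"sum = {total:.3f}, x-bar = {x_bar:.3f}, lambda-hat = {lam_hat:.2f}")
```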

Experimental Data: Suppose we wish to make some estimates of the time to fail for a new power supply. 40 units are randomly selected and tested to failure. The failure times are recorded as follows, with sample mean x̄ = 1.19.

Failure Data: [table of the 40 recorded failure times]
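
The 40 recorded failure times do not appear in this transcript, so the sketch below substitutes simulated placeholder lifetimes (an assumption, generated with mean 1.19) purely to show how the estimator derived above, or scipy's built-in exponential fit, would be applied to such failure data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder failure times: 40 simulated exponential lifetimes with mean 1.19
# (stand-ins for the power-supply data, which is not reproduced here).
failure_times = rng.exponential(scale=1.19, size=40)

# MLE by the closed form derived above: lambda-hat = n / sum(x_i) = 1 / x-bar
lam_hat = 1.0 / failure_times.mean()

# Equivalent fit via scipy (floc=0 pins the location at zero; scale = 1/lambda)
loc, scale = stats.expon.fit(failure_times, floc=0)

print(f"lambda-hat (closed form) = {lam_hat:.3f}")
print(f"lambda-hat (scipy fit)   = {1/scale:.3f}")
```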