Fundamentals of Data Analysis Lecture 12 Methods of parametric estimation.

Program for today: Definitions; Maximum likelihood method; Least squares method.

Definitions
Estimator - any function of the sample used to estimate an unknown parameter of the general population.
Unbiased estimator - an estimator whose expected value is equal to the estimated parameter, i.e. an estimator that estimates the distribution parameter without bias.
Efficient estimator - an estimator whose variance is as small as possible.
Consistent estimator - an estimator that converges stochastically to the estimated parameter; it obeys the law of large numbers, so using larger samples improves the accuracy of the estimate.
Sufficient estimator - an estimator that gathers all the information about the estimated parameter contained in the sample.
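To make these properties concrete, the following sketch (a hypothetical illustration in Python, not part of the original lecture) simulates repeated sampling from a normal population and checks empirically that the sample mean is an unbiased estimator of the population mean, while the variance estimator that divides by n is biased and the one that divides by n - 1 is not.

import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, trials = 5.0, 2.0, 20, 10_000

means = np.empty(trials)
var_biased = np.empty(trials)      # divides by n
var_unbiased = np.empty(trials)    # divides by n - 1

for t in range(trials):
    x = rng.normal(mu, sigma, size=n)
    means[t] = x.mean()
    var_biased[t] = x.var(ddof=0)
    var_unbiased[t] = x.var(ddof=1)

print("average sample mean      :", means.mean())         # close to mu = 5
print("average variance (ddof=0):", var_biased.mean())    # below sigma^2 = 4 (biased)
print("average variance (ddof=1):", var_unbiased.mean())  # close to sigma^2 = 4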

Definitions
Point estimation - a method of estimating an unknown parameter in which the value of the estimator computed from an n-element random sample is taken as the value of that parameter.
Interval estimation - estimation that consists in constructing a confidence interval for the parameter. A confidence interval is a random interval, determined by the distribution of the estimator, which covers the true value of the parameter with a probability fixed in advance; it is usually written in the form P(a < Θ < b) = 1 - α.
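As a quick illustration (a minimal sketch with made-up data, not from the original slides), the following Python code computes a point estimate of a population mean from a sample and the corresponding confidence interval based on the normal approximation.

import numpy as np
from scipy import stats

x = np.array([4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.1])  # hypothetical sample
alpha = 0.05                                             # 1 - alpha = 95% confidence level

mean = x.mean()                          # point estimate of the population mean
se = x.std(ddof=1) / np.sqrt(len(x))     # standard error of the mean
z = stats.norm.ppf(1 - alpha / 2)        # critical value of the standard normal distribution

low, high = mean - z * se, mean + z * se
print(f"point estimate: {mean:.3f}")
print(f"confidence interval: ({low:.3f}, {high:.3f})")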

Maximum likelihood method This method makes it possible to find estimators of the unknown parameters of population distributions whose functional form is known. Estimates obtained by maximum likelihood have many desirable properties. The three most important for practical purposes are:
1. for a large number of measurements the estimator is normally distributed;
2. the variance of the estimator, which measures the accuracy with which the true value is determined, is the smallest that can be achieved in the given situation (the estimator is optimal);
3. the estimator obtained by this method does not depend on whether the likelihood is maximized for the estimated parameter itself or for any function of it.

Maximum likelihood method The maximum likelihood estimate of the parameter vector is obtained by maximizing the likelihood function. The likelihood contribution of an observation is the probability of observing that datum. The form of the likelihood function depends on whether the data are completely observed or possibly truncated and/or censored. In many cases it is more convenient to maximize the logarithm of the likelihood function.

Maximum likelihood method In the case where there is no truncation and no censoring and each observation x_i, for i = 1, ..., n, is recorded, the likelihood function is
L(Θ) = f(x_1; Θ) · f(x_2; Θ) · ... · f(x_n; Θ) for a continuous variable, or
L(Θ) = p(x_1; Θ) · p(x_2; Θ) · ... · p(x_n; Θ) for a discrete variable,
where f(x_i; Θ) denotes the probability density function and p(x_i; Θ) the probability function, while Θ may be a single parameter or a vector.

Maximum likelihood method The corresponding log-likelihood function is
ln L(Θ) = ln f(x_1; Θ) + ln f(x_2; Θ) + ... + ln f(x_n; Θ).

Maximum likelihood method The algorithm for finding the maximum likelihood estimator of the parameter Θ is as follows:
1. find the likelihood function L for the given distribution of the population;
2. calculate the logarithm of the likelihood function;
3. using the necessary condition for an extremum, solve the equation
d ln L / dΘ = 0,
obtaining the estimator Θ*;
4. check the sufficient condition for a maximum:
d² ln L / dΘ² < 0 at Θ = Θ*.
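The same steps can be carried out numerically when the equation d ln L / dΘ = 0 has no convenient closed-form solution. The sketch below (an assumed example with exponentially distributed data, not part of the original lecture) minimizes the negative log-likelihood of an exponential distribution and compares the result with the analytic estimator 1 / mean(x).

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
x = rng.exponential(scale=1 / 2.5, size=500)   # sample with true rate parameter 2.5

def neg_log_likelihood(lam):
    # minus ln L(lambda) for the exponential density f(x; lambda) = lambda * exp(-lambda * x)
    return -(len(x) * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")
print("numerical MLE:", res.x)
print("analytic  MLE:", 1.0 / x.mean())   # the two values should agree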

Maximum likelihood method The maximum likelihood method also yields likelihood intervals for the appropriate confidence levels. Solving the equation
ln L(Θ) = ln L(Θ*) - a
with respect to Θ for a = 0.5, 2 and 4.5 gives intervals corresponding to confidence levels of 68%, 95% and 99.7%, respectively.
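A sketch of how such an interval can be found numerically (again with an assumed exponential sample, not from the original slides): the code scans parameter values and keeps those whose log-likelihood lies within a = 0.5 of the maximum, which gives the 68% interval.

import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1 / 2.5, size=500)   # assumed sample, true rate parameter 2.5

def log_likelihood(lam):
    return len(x) * np.log(lam) - lam * x.sum()

lam_hat = 1.0 / x.mean()                                   # maximum likelihood estimate
grid = np.linspace(0.5 * lam_hat, 1.5 * lam_hat, 10_000)   # parameter values to scan
inside = grid[log_likelihood(grid) >= log_likelihood(lam_hat) - 0.5]   # a = 0.5

print("68% likelihood interval: (%.3f, %.3f)" % (inside.min(), inside.max()))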

Maximum likelihood method Example The general population has a two-point (zero-one) distribution with an unknown parameter p. Find the maximum likelihood estimator of the parameter p from an n-element simple sample. The probability function of a single observation is
p(x_i; p) = p^x_i (1 - p)^(1 - x_i), with x_i equal to 0 or 1.

Maximum likelihood method Example Therefore, the likelihood function is
L(p) = p^m (1 - p)^(n - m),
where m is the number of successes in the sample, and its logarithm is
ln L = m ln(p) + (n - m) ln(1 - p).

Maximum likelihood method Example The derivative of this expression,
d ln L / dp = m/p - (n - m)/(1 - p),
is equal to zero if
p* = m/n.

Maximum likelihood method Example The second derivative of the logarithm,
d² ln L / dp² = -m/p² - (n - m)/(1 - p)²,
is less than zero at p*, which means that the likelihood function has a maximum at that point and p* = m/n is the maximum likelihood estimator of the parameter p.
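As a quick numerical check (a sketch with made-up numbers, not from the original slides), the code below maximizes ln L = m ln(p) + (n - m) ln(1 - p) directly and confirms that the maximum is attained at p* = m/n.

import numpy as np
from scipy.optimize import minimize_scalar

n, m = 40, 27   # hypothetical sample: 40 trials, 27 successes

def neg_log_likelihood(p):
    return -(m * np.log(p) + (n - m) * np.log(1 - p))

res = minimize_scalar(neg_log_likelihood, bounds=(1e-9, 1 - 1e-9), method="bounded")
print("numerical maximum  :", res.x)
print("analytic  p* = m/n :", m / n)   # the two values should coincide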

Maximum likelihood method Exercise The speed of sound in air measured with two different methods is: v_1 = 340 ± 9 m/s, v_2 = 350 ± 18 m/s. Find the best estimate of the speed of sound. Note: the best estimate is a weighted average of these results.
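For independent measurements with Gaussian errors, maximizing the joint likelihood leads to the inverse-variance weighted mean, which is one way to approach this exercise. The following sketch applies that formula to the two measurements above.

import numpy as np

v = np.array([340.0, 350.0])    # measured values, m/s
sigma = np.array([9.0, 18.0])   # their standard uncertainties, m/s

w = 1.0 / sigma**2                        # inverse-variance weights
v_best = np.sum(w * v) / np.sum(w)        # weighted mean (maximum likelihood estimate)
sigma_best = 1.0 / np.sqrt(np.sum(w))     # uncertainty of the weighted mean

print(f"best estimate: {v_best:.1f} +/- {sigma_best:.1f} m/s")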

Least squares method At the base of the method of least squares lies the principle that the degree of disagreement between the actual values y and the calculated values Y is measured by the sum of the squared deviations: Σ (y - Y)² = minimum.

Least squares method Exercise Fit a parabolic equation to the experimental data presented in the table, whose columns list, for each point i, the values x_i, y_i, x_i², x_i·y_i, x_i³, x_i⁴ and x_i²·y_i, together with their sums Σ.
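A sketch of how such a fit can be carried out (with hypothetical data, since the numerical values of the original table are not reproduced in the transcript): the normal equations of the least squares parabola y = a0 + a1*x + a2*x^2 use exactly the sums listed in the table columns.

import numpy as np

# hypothetical experimental data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 4.9, 10.2, 17.8, 27.5, 40.1])

n = len(x)
# sums corresponding to the table columns
Sx, Sx2, Sx3, Sx4 = x.sum(), (x**2).sum(), (x**3).sum(), (x**4).sum()
Sy, Sxy, Sx2y = y.sum(), (x * y).sum(), (x**2 * y).sum()

# normal equations for y = a0 + a1*x + a2*x^2
A = np.array([[n,   Sx,  Sx2],
              [Sx,  Sx2, Sx3],
              [Sx2, Sx3, Sx4]])
b = np.array([Sy, Sxy, Sx2y])

a0, a1, a2 = np.linalg.solve(A, b)
print(f"y = {a0:.3f} + {a1:.3f} x + {a2:.3f} x^2")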

Thank you for your attention!