OVERVIEW OF LEAST-SQUARES, MAXIMUM LIKELIHOOD AND BLUP PART 2

LIKELIHOOD-BASED INFERENCE

1. The likelihood function
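The likelihood is the joint density of the observed data viewed as a function of the parameters. A minimal R sketch (the data vector and the normal model are illustrative assumptions, not the slides' wheat data):

```r
## Log-likelihood of an i.i.d. normal sample as a function of (mu, sigma2)
## (illustrative data; not from the slides)
y <- c(4.1, 3.8, 5.2, 4.7, 4.4)
loglik <- function(mu, sigma2, y) {
  n <- length(y)
  -n/2 * log(2 * pi * sigma2) - sum((y - mu)^2) / (2 * sigma2)
}
## The sample mean maximizes the log-likelihood over mu:
loglik(mean(y), var(y), y) > loglik(mean(y) + 0.5, var(y), y)   # TRUE
```

Because the sum of squares is minimized at the sample mean, any other value of mu gives a strictly smaller log-likelihood.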

3. Fisher information contained in the likelihood
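Fisher information measures the curvature of the log-likelihood around its maximum. For the normal-mean model with known variance it is I(mu) = n/sigma2, which a numerical second derivative reproduces (the model and values below are illustrative assumptions):

```r
## Fisher information for mu in N(mu, sigma2), sigma2 known: I(mu) = n/sigma2
## Numerical check against the curvature of the log-likelihood
n <- 50; sigma2 <- 2; mu <- 1
loglik <- function(mu, y) -sum((y - mu)^2) / (2 * sigma2)  # terms free of mu dropped
set.seed(1)
y <- rnorm(n, mean = mu, sd = sqrt(sigma2))
h <- 1e-4
curvature <- -(loglik(mu + h, y) - 2 * loglik(mu, y) + loglik(mu - h, y)) / h^2
c(analytic = n / sigma2, numerical = curvature)   # both approximately 25
```

Here the log-likelihood is exactly quadratic in mu, so the observed and expected information coincide and do not depend on the data.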

5. Asymptotic Properties of Maximum Likelihood Estimators
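A key asymptotic property is that the MLE's sampling variance approaches the inverse Fisher information. A simulation sketch for an exponential sample, where the MLE of the rate is 1/mean(y) with asymptotic variance lambda^2/n (lambda, n, and the number of replicates are illustrative choices):

```r
## Asymptotic normality of the MLE: for Exponential(rate = lambda) data,
## lambda_hat = 1/mean(y) has asymptotic variance lambda^2/n.
set.seed(42)
lambda <- 2; n <- 500; reps <- 2000
mle <- replicate(reps, 1 / mean(rexp(n, rate = lambda)))
## Empirical variance of the MLE across simulations vs. the asymptotic value
c(empirical = var(mle), asymptotic = lambda^2 / n)
```

The two variances agree closely at this sample size; a histogram of `mle` would also look approximately normal, as the theory predicts.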

BEST LINEAR UNBIASED ESTIMATION AND BEST LINEAR UNBIASED PREDICTION

C. Alternative method for computing BLUE (GLS) estimator of the vector of fixed effects: known V
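With V known, the BLUE (GLS) estimator of the fixed effects is beta_hat = (X' V^{-1} X)^{-1} X' V^{-1} y. A small sketch (X, V, and y below are illustrative values, not the slides' data):

```r
## GLS (BLUE) estimator of the fixed effects with known covariance V:
##   beta_hat = (X' V^{-1} X)^{-1} X' V^{-1} y
set.seed(7)
n <- 6
X <- cbind(1, 1:n)                      # intercept + trend
V <- 0.5^abs(outer(1:n, 1:n, "-"))      # AR(1)-type correlation, assumed known
beta <- c(2, 0.5)
y <- as.vector(X %*% beta) + as.vector(t(chol(V)) %*% rnorm(n))
Vinv <- solve(V)
beta_gls <- solve(t(X) %*% Vinv %*% X, t(X) %*% Vinv %*% y)
beta_gls   # close to the true (2, 0.5) up to sampling noise
```

Solving the linear system directly with `solve(A, b)` avoids explicitly forming the inverse of X'V^{-1}X, which is both faster and numerically safer.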

EXAMPLE: BLUP OF 599 WHEAT LINES

IS BLUP OVERFITTING?

FITCOR<-cbind(y,MATPAIR)
cor(FITCOR)

              Y     BLUPR     BLUP1     BLUP2     BLUP3     BLUP4     BLUP5
Y     1.0000000 0.7391247 0.7384688 0.7159003 0.7275751 0.7295317 0.7287069
BLUPR 0.7391247 1.0000000 0.9963397 0.9508149 0.9841684 0.9863276 0.9852552
BLUP1 0.7384688 0.9963397 1.0000000 0.9444558 0.9812627 0.9814759 0.9806144
BLUP2 0.7159003 0.9508149 0.9444558 1.0000000 0.9348012 0.9338615 0.9328132
BLUP3 0.7275751 0.9841684 0.9812627 0.9348012 1.0000000 0.9688824 0.9680241
BLUP4 0.7295317 0.9863276 0.9814759 0.9338615 0.9688824 1.0000000 0.9979560
BLUP5 0.7287069 0.9852552 0.9806144 0.9328132 0.9680241 0.9979560 1.0000000

NO EVIDENCE OF OVERFITTING, BUT THE BLUPS ARE CLOSELY CORRELATED

####HOW ABOUT TRAINING MEAN-SQUARED ERROR?
mse<-numeric(6)
for (i in 1:6){
  mse[i]<-sum((y-MATPAIR[,i])**2)/n
}
mse
[1] 0.6877218 0.6866927 0.6834385 0.6869435 0.6875240 0.6875065
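Training MSE cannot by itself reveal overfitting, since a flexible predictor fits its own training data well by construction; the standard check is a held-out split with refitting on the training rows only. A minimal self-contained sketch using ridge-type shrinkage as a stand-in for BLUP, on simulated data (the slides' `MATPAIR` predictions would need to be refit within each split; all names and values here are illustrative):

```r
## Held-out check for overfitting: fit on training rows, evaluate on test rows.
## Ridge shrinkage stands in for BLUP; data are simulated, not the wheat lines.
set.seed(3)
n <- 100; p <- 20
Z <- matrix(rnorm(n * p), n, p)
u <- rnorm(p, sd = 0.5)
y <- as.vector(Z %*% u) + rnorm(n)
train <- 1:70; test <- 71:100
lambda <- 1                               # shrinkage parameter (assumed value)
## Ridge coefficients estimated from the training rows only
b <- solve(crossprod(Z[train, ]) + lambda * diag(p),
           crossprod(Z[train, ], y[train]))
mse_train <- mean((y[train] - Z[train, ] %*% b)^2)
mse_test  <- mean((y[test]  - Z[test, ]  %*% b)^2)
c(train = mse_train, test = mse_test)     # overfitting shows up as test >> train
```

If the gap between test and training MSE is small, as the near-identical training MSEs across the six BLUP variants above suggest, shrinkage is doing its job of guarding against overfitting.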