Ch9 Random Function Models (II)

Geo597 Geostatistics: Ch9 Random Function Models (II)

Spatial Stochastic Process
A spatial stochastic process (referred to as a "random function" in the textbook) is a set of random variables that have spatial locations and whose dependence on each other is specified by some probabilistic mechanism. Here x denotes a location along a line, and V(x) is the random variable at that location, taking the value 0 or 1; V(0) is the random variable at the initial location x = 0. The mechanism is:
V(0) = 0 with probability 1/2, V(0) = 1 with probability 1/2
V(x+1) = V(x) with probability 3/4, V(x+1) = 1 - V(x) with probability 1/4
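
As a concrete illustration, here is a minimal simulation sketch of this mechanism (Python with NumPy; the helper name simulate_v and the seed are our own, not from the text):

```python
import numpy as np

def simulate_v(n_locations, p_stay=0.75, rng=None):
    """One realization of the binary process V(x).

    V(0) is 0 or 1 with probability 1/2 each; each later value equals its
    neighbor with probability p_stay and flips with probability 1 - p_stay.
    """
    rng = np.random.default_rng() if rng is None else rng
    v = np.empty(n_locations, dtype=int)
    v[0] = rng.integers(0, 2)                      # V(0): 0 or 1, prob 1/2 each
    for x in range(1, n_locations):
        stay = rng.random() < p_stay               # stay with prob 3/4
        v[x] = v[x - 1] if stay else 1 - v[x - 1]  # otherwise flip
    return v

# Example: one realization of V(0), ..., V(19)
print(simulate_v(20, rng=np.random.default_rng(0)))
```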

Spatial Stochastic Process ...
Like random variables, stochastic processes also have possible outcomes, called "realizations".
The random variables: V(0), V(1), V(2), V(3), …, V(xi)
The pairs of random variables: (V(0),V(1)), (V(1),V(2)), (V(2),V(3)), …, (V(x),V(x+1))

Spatial Stochastic Process ...
The possible outcomes of (V(x),V(x+1)) and their corresponding probabilities (the joint distribution) are {(0,0), (0,1), (1,0), (1,1)} with:
P{(0,0)} = 1/2 * 3/4 = 3/8
P{(0,1)} = 1/2 * 1/4 = 1/8
P{(1,0)} = 1/2 * 1/4 = 1/8
P{(1,1)} = 1/2 * 3/4 = 3/8
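
These joint probabilities can be reproduced directly from the mechanism above; a small sketch using exact fractions (our own illustration, not from the text):

```python
from fractions import Fraction

half = Fraction(1, 2)
p_stay, p_flip = Fraction(3, 4), Fraction(1, 4)

# Joint distribution of (V(x), V(x+1)): the first value is 0 or 1 with
# probability 1/2; the second stays with probability 3/4 or flips with 1/4.
joint = {}
for first in (0, 1):
    joint[(first, first)] = half * p_stay        # pair keeps the same value
    joint[(first, 1 - first)] = half * p_flip    # pair changes value
print(joint)   # (0,0) and (1,1): 3/8 each; (0,1) and (1,0): 1/8 each
```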

Spatial Stochastic Process ...
What about the possible outcomes of the pairs (V(x),V(x+2)) and their corresponding probabilities? Let's begin with the triples (V(x),V(x+1),V(x+2)), whose possible outcomes are {(0,0,0), (0,0,1), (0,1,0), (0,1,1), (1,0,0), (1,0,1), (1,1,0), (1,1,1)}:
(0,0,0) and (1,1,1): p = 1/2 * 3/4 * 3/4 = 9/32 each
(0,0,1) and (1,1,0): p = 1/2 * 3/4 * 1/4 = 3/32 each
(1,0,0) and (0,1,1): p = 1/2 * 1/4 * 3/4 = 3/32 each
(0,1,0) and (1,0,1): p = 1/2 * 1/4 * 1/4 = 1/32 each

Spatial Stochastic Process ...
Now, summing over the middle value gives the joint distribution of (V(x),V(x+2)):
P{(V(x),V(x+2)) = (0,1)} = 3/32 + 3/32 = 6/32
P{(V(x),V(x+2)) = (1,0)} = 3/32 + 3/32 = 6/32
P{(V(x),V(x+2)) = (1,1)} = 9/32 + 1/32 = 10/32
P{(V(x),V(x+2)) = (0,0)} = 9/32 + 1/32 = 10/32
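
The same marginalization over the middle value can be done programmatically; a sketch (the helper step_prob is hypothetical, introduced only for this illustration):

```python
from fractions import Fraction
from itertools import product

half = Fraction(1, 2)
p_stay, p_flip = Fraction(3, 4), Fraction(1, 4)

def step_prob(a, b):
    """P{V(x+1) = b | V(x) = a} under the stay/flip mechanism."""
    return p_stay if a == b else p_flip

# Sum the triple probabilities over the middle value to get (V(x), V(x+2)).
pair_h2 = {}
for v0, v1, v2 in product((0, 1), repeat=3):
    p = half * step_prob(v0, v1) * step_prob(v1, v2)
    pair_h2[(v0, v2)] = pair_h2.get((v0, v2), Fraction(0)) + p
print(pair_h2)   # (0,0) and (1,1): 10/32 each; (0,1) and (1,0): 6/32 each
```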

Stationarity
Define the pair of random variables separated by distance h: (V(x),V(x+h)). As h increases, the probability of the two values being the same decreases, asymptotically approaching 1/2:
P{(V(x),V(x+1)) = (0,0) or (1,1)} = 3/8 + 3/8 = 3/4 = 24/32
P{(V(x),V(x+2)) = (0,0) or (1,1)} = 10/32 + 10/32 = 20/32
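
One way to see this decay toward 1/2 is to raise a 2x2 stay/flip transition matrix to the power h; this matrix formulation is our own shorthand for the mechanism above, not notation from the text:

```python
import numpy as np

# Rows index V(x), columns index V(x+1); stay with probability 3/4.
P = np.array([[0.75, 0.25],
              [0.25, 0.75]])

for h in range(1, 7):
    Ph = np.linalg.matrix_power(P, h)
    # P{V(x) = V(x+h)} = sum over v of P{V(x) = v} * P{V(x+h) = v | V(x) = v}
    p_same = 0.5 * Ph[0, 0] + 0.5 * Ph[1, 1]
    print(h, round(p_same, 4))   # 0.75 (=24/32), 0.625 (=20/32), ... toward 0.5
```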

Stationarity ...
The univariate probability law of V(x) does not depend on the location x; 0 and 1 have an equal probability of occurring at all locations. Similarly, the bivariate probability law of (V(x), V(x+h)) does not depend on the location x, but only on the separation h; all pairs of random variables separated by a particular distance h have the same joint probability distribution.

Stationarity ...
This independence of the univariate and bivariate probability laws from the location x is referred to as "stationarity". A spatial process is stationary if its statistical properties, such as the mean and variance, are independent of absolute location:
E{V(x)} = E{V(x+h)}, Var{V(x)} = Var{V(x+h)} for all x and h under the stationarity assumption.
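
A quick empirical check of this, using many independent realizations of V(x) (purely illustrative; the parameters and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
n_real, n_loc, p_stay = 20_000, 10, 0.75

# Many independent realizations of V(0), ..., V(9), simulated in parallel.
v = np.empty((n_real, n_loc))
v[:, 0] = rng.integers(0, 2, size=n_real)
for x in range(1, n_loc):
    flip = rng.random(n_real) < (1 - p_stay)
    v[:, x] = np.where(flip, 1 - v[:, x - 1], v[:, x - 1])

# The mean and variance should be (about) the same at every location x.
print(np.round(v.mean(axis=0), 3))   # all close to 0.5
print(np.round(v.var(axis=0), 3))    # all close to 0.25
```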

Stationarity ...
Stationarity also implies that the covariance between two sites depends only on their relative locations, that is, on the distance and direction between them, but not on their absolute locations.

Parameters of Random Functions
If the random function (spatial stochastic process) is stationary, the univariate parameters, the expected value and the variance, can be used to summarize the univariate behavior of the set of random variables. For a stationary random function,
E{V(x)} = E{V} for all x
Var{V(x)} = Var{V} for all x

Parameters of Random Functions ...
Also assuming stationarity, the covariance and the correlation coefficient (correlogram) between V(x) and V(x+h) depend only on the separation h, not on the location x:
Cv(h) = Cov{V(x), V(x+h)}
ρv(h) = Cv(h) / Cv(0), where Cv(0) = Var{V}

Parameters of Random Functions ...
Variogram: under the same stationarity assumption, the variogram also depends only on the separation h:
γv(h) = (1/2) E{ [V(x) - V(x+h)]^2 }

Parameters of Random Functions ...
Relationship between covariance and variogram:
γv(h) = Cv(0) - Cv(h) = Var{V} - Cv(h)
As h increases, the covariance function and the correlogram eventually reach zero, while the variogram ultimately reaches a maximum value (the sill), which is also the variance of the random function.
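
A sketch of estimating the covariance and variogram from one long simulated realization and checking that the variogram is approximately Cv(0) minus the covariance (the moment estimators used here are standard, but the code itself is our own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p_stay = 100_000, 0.75

# One long realization of V(x): start 0/1 with prob 1/2, stay with prob 3/4.
v = np.empty(n)
v[0] = rng.integers(0, 2)
flips = rng.random(n) < (1 - p_stay)
for x in range(1, n):
    v[x] = 1 - v[x - 1] if flips[x] else v[x - 1]

c0 = v.var()                                      # sill, about 0.25 here
for h in (1, 2, 5, 10):
    head, tail = v[:-h], v[h:]
    cov = np.mean((head - head.mean()) * (tail - tail.mean()))    # Cv(h)
    gamma = 0.5 * np.mean((head - tail) ** 2)                     # variogram
    print(h, round(cov, 3), round(gamma, 3), round(c0 - cov, 3))  # gamma ~ Cv(0) - Cv(h)
```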

Parameters of Random Functions ...
The correlogram and the covariance have the same shape, with the correlogram being scaled to the range between 1 and -1. The variogram can also be obtained by flipping the covariance function upside-down with respect to the horizontal line at Cv(0)/2, since γv(h) = Cv(0) - Cv(h).

Parameters of Random Functions ...
A random function U(x) similar to V(x) but with a greater probability of staying at the same value (Fig 9.6):
U(0) = 0 with probability 1/2, U(0) = 1 with probability 1/2
U(x+1) = U(x) with probability 7/8, U(x+1) = 1 - U(x) with probability 1/8

Parameters of Random Functions ...
Another similar random function W(x), with a smaller probability (1/2) of staying at the same value, so that successive values are independent (Fig 9.7):
W(0) = 0 with probability 1/2, W(0) = 1 with probability 1/2
W(x+1) = W(x) with probability 1/2, W(x+1) = 1 - W(x) with probability 1/2

[Figure: comparison of the three random functions, with stay/flip probabilities 7/8, 1/8 (U), 3/4, 1/4 (V), and 1/2, 1/2 (W)]
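
To see how the three processes differ in spatial continuity, one can compare their sample variograms at a few lags; a sketch (the helper simulate, the lags, and the seed are our own choices for illustration):

```python
import numpy as np

def simulate(n, p_stay, rng):
    """Binary process: start at 0 or 1 with prob 1/2, stay with prob p_stay."""
    z = np.empty(n)
    z[0] = rng.integers(0, 2)
    for x in range(1, n):
        z[x] = z[x - 1] if rng.random() < p_stay else 1 - z[x - 1]
    return z

rng = np.random.default_rng(2)
n = 50_000
for name, p_stay in (("U(x)", 7 / 8), ("V(x)", 3 / 4), ("W(x)", 1 / 2)):
    z = simulate(n, p_stay, rng)
    # Sample variogram at a few lags: it rises faster when continuity is weaker.
    gammas = [round(0.5 * np.mean((z[:-h] - z[h:]) ** 2), 3) for h in (1, 2, 4, 8)]
    print(name, gammas)
```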

When spatial continuity is weak (as for W(x)), estimation requires more samples beyond the closest ones. When spatial continuity is strong (as for U(x)), estimates based on the closest samples are reliable.

Implication of Unbiased Estimates
Suppose we wish to estimate an unknown true value using a weighted linear combination of p available sample values:
v̂ = Σ wi vi, i = 1, ..., p
Question 1: How can we choose the weights wi so that the average estimation error is zero, giving an unbiased estimate?
Average error = (1/k) Σ rj = (1/k) Σ (v̂j - vj), over a set of k estimates

Implication of Unbiased Estimates ...
However, we do not know the true values vj, so the average error (1/k) Σ (v̂j - vj) cannot actually be computed. As a probabilistic solution to this problem, we consider a stationary random function consisting of p+1 random variables: V(x0), V(x1), ..., V(xp), where V(x0) is at the location being estimated and V(x1), ..., V(xp) are at the sample locations.
Each of them has the same expected value E{V}. Any pair of them has a joint distribution that depends only on the separation h, not on their locations, and therefore a covariance Cv(h) that depends only on h.

Implication of Unbiased Estimates ...
All p samples (plus the unknown true value) are viewed as outcomes of these random variables. Each estimate is then also a random variable; some of its parameters are known since it is a linear combination of known random variables:
V̂(x0) = Σ wi V(xi), i = 1, ..., p
The estimation error is also a random variable:
R(x0) = V̂(x0) - V(x0)

Implication of Unbiased Estimates ...
The estimation error is a random variable:
R(x0) = Σ wi V(xi) - V(x0)
Average error = E{R(x0)} = E{ Σ wi V(xi) - V(x0) } = Σ wi E{V(xi)} - E{V(x0)}

Implication of Unbiased Estimates ...
By stationarity, E{V(xi)} = E{V(x0)} = E{V} for every i, so
E{R(x0)} = Σ wi E{V} - E{V} = E{V} ( Σ wi - 1 )
Setting the expected error to zero therefore requires
Σ wi = 1, i = 1, ..., p
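
A Monte Carlo sketch of why the weights must sum to one: with a stationary (constant) mean, the expected error equals the mean times (sum of weights minus 1). The covariance model, weights, and mean below are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
mu = 10.0              # the unknown stationary mean (arbitrary for this demo)
n_trials = 200_000

# Jointly simulate V(x1), V(x2), V(x3) and the unknown V(x0) with a simple
# positive-definite covariance model (0.6**distance) and constant mean mu.
idx = np.arange(4)
cov = 0.6 ** np.abs(idx[:, None] - idx[None, :])
vals = rng.multivariate_normal([mu] * 4, cov, size=n_trials)
v_samples, v_true = vals[:, :3], vals[:, 3]

for w in (np.array([0.5, 0.3, 0.2]),     # weights sum to 1.0 -> unbiased
          np.array([0.5, 0.3, 0.3])):    # weights sum to 1.1 -> biased
    v_hat = v_samples @ w                # weighted linear estimate of V(x0)
    print(round(float(w.sum()), 2), round(float(np.mean(v_hat - v_true)), 3))
# Mean error is about mu * (sum(w) - 1): near 0 in the first case, near 1 in the second.
```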

Implication of Unbiased Estimates ...
This result, referred to as the unbiasedness condition, makes such obvious common sense that it is often taken for granted. It is an important condition that you will see again in the following chapters.