1
Econometric Models The most basic econometric model consists of a relationship between two variables which is disturbed by a random error. We need to use the data available to find estimates of the unknown parameters of this relationship. However, we will not be able to find an exact relationship because of the random error. The nature of the random error u will determine the properties of our estimates of the unknown parameters.
2
Modelling random variables
A random experiment is an experiment whose outcome cannot be predicted with certainty. The sample space is the set of all possible outcomes of a random experiment. An event is a subset of the sample space. Example: Tossing a coin is a random experiment because we don’t know the outcome in advance. The sample space consists of heads and tails, and an event might be ‘the outcome is heads’.
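As a small illustrative sketch (not from the slides), the coin-toss experiment can be written out explicitly; the names sample_space and event are purely illustrative.

```python
# Illustrative sketch: the coin-toss experiment written out as Python sets.
sample_space = {"heads", "tails"}   # all possible outcomes of the experiment
event = {"heads"}                   # the event "the toss comes up heads"

# An event is a subset of the sample space.
print(event.issubset(sample_space))   # True
```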
3
Probability
A Bernoulli trial is a special kind of experiment in which the outcome can be classified as either ‘success’ or ‘failure’. If we observe k successes in n Bernoulli trials, the relative frequency of success is k/n. The probability of a success is the value that the relative frequency converges to as n becomes large. Example: If we repeat a coin toss experiment enough times and ‘success’ is a head, then the relative frequency of successes should converge on ½. If p is the probability of a success in a Bernoulli trial then the probability of failure is q = 1 − p.
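A minimal simulation sketch of this convergence, assuming a fair coin (p = 0.5); the trial counts are arbitrary example values.

```python
import random

# Simulate n Bernoulli trials where 'success' is a head (p = 0.5) and watch
# the relative frequency k/n settle near p as n grows.
p = 0.5
for n in (10, 100, 1_000, 10_000, 100_000):
    k = sum(random.random() < p for _ in range(n))   # number of successes
    print(f"n = {n:>7}: relative frequency = {k / n:.4f}")
```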
4
The Binomial Distribution
Suppose we conduct n Bernoulli trials. The number of successes K is a random variable which follows the Binomial Distribution. The probability of k successes in n trials is given by the expression P(K = k) = [n! / (k!(n − k)!)] p^k (1 − p)^(n − k). If we write down the probabilities for all possible values of k then this gives us the Probability Distribution Function.
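A sketch of this formula in code; the helper name binomial_pmf and the values n = 10, p = 0.5 are just illustrative.

```python
from math import comb

# The binomial formula P(K = k) = [n!/(k!(n-k)!)] p^k (1-p)^(n-k).
def binomial_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Writing down the probability for every possible k gives the probability
# distribution function; the probabilities sum to 1.
n, p = 10, 0.5
dist = {k: binomial_pmf(k, n, p) for k in range(n + 1)}
print(dist)
print(sum(dist.values()))   # 1.0 (up to rounding)
```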
5
Example: Suppose we toss a coin twice. What is the probability
distribution function for the number of heads observed? For a fair coin the possible values are 0, 1 and 2 heads, with probabilities ¼, ½ and ¼ respectively. This is an example of a discrete random variable because it can only take on a finite number of possible values. More generally we are interested in continuous random variables, where there is an infinite number of possible values.
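A sketch of where those probabilities come from, enumerating the sample space of two fair coin tosses directly (illustrative code, not from the slides).

```python
from itertools import product
from collections import Counter

# Enumerate the sample space of two fair coin tosses and count heads.
outcomes = list(product("HT", repeat=2))           # HH, HT, TH, TT
heads = Counter(o.count("H") for o in outcomes)    # heads in each outcome

# Each of the four outcomes has probability 1/4, so the distribution of the
# number of heads K is:
for k in sorted(heads):
    print(f"P(K = {k}) = {heads[k] / len(outcomes)}")   # 0.25, 0.5, 0.25
```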
6
If the number of trials is large enough then we can get a good
approximation to the binomial probabilities by using the normal curve.
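A sketch of this approximation with assumed example values (n = 100, p = 0.5, and the interval 40 to 60); the continuity correction of 0.5 is a standard refinement.

```python
from math import comb, erf, sqrt

# Exact binomial probability vs. the normal-curve approximation.
n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))

def binomial_pmf(k):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_cdf(x):
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

exact = sum(binomial_pmf(k) for k in range(40, 61))   # P(40 <= K <= 60)
approx = normal_cdf(60.5) - normal_cdf(39.5)          # area under the normal curve
print(exact, approx)                                  # both roughly 0.965
```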
7
The Normal Distribution
The normal distribution is of interest in itself because many distributions become approximately normal in large samples. However, it differs from the binomial in that it is a continuous distribution, and this requires us to introduce the idea of the probability density function or PDF. For the random variable X, the PDF is defined as the function f(x) such that P(a ≤ X ≤ b) = ∫ f(x) dx over the interval [a, b], for any a ≤ b.
8
The Normal Distribution
The normal distribution has a bell-shaped PDF. The area under this curve between any two points gives the probability that X lies in this interval.
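A sketch of the area-equals-probability idea for the standard normal, approximating the integral numerically and comparing it with the exact value from the CDF; the interval [-1, 1] and the step count are arbitrary example choices.

```python
from math import exp, pi, sqrt, erf

def pdf(x):
    return exp(-x**2 / 2) / sqrt(2 * pi)          # standard normal density

def cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))           # standard normal CDF

a, b, steps = -1.0, 1.0, 10_000
width = (b - a) / steps
area = sum(pdf(a + (i + 0.5) * width) * width for i in range(steps))
print(area, cdf(b) - cdf(a))   # both about 0.6827 = P(-1 <= X <= 1)
```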
9
The probability density function for a normal distribution
is defined as f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²)), where μ is the mean and σ is the standard deviation. The standard normal distribution is the distribution with mean equal to zero and standard deviation equal to one, in which case the PDF simplifies to f(z) = (1 / √(2π)) exp(−z² / 2).
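A short sketch checking that setting μ = 0 and σ = 1 in the general density does reduce it to the simplified standard normal density; the evaluation points are arbitrary.

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    # N(mu, sigma^2) density
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

def standard_normal_pdf(z):
    # the simplified form with mu = 0, sigma = 1
    return exp(-z**2 / 2) / sqrt(2 * pi)

for z in (-2.0, 0.0, 1.5):
    print(normal_pdf(z, mu=0.0, sigma=1.0), standard_normal_pdf(z))   # equal
```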
10
The z transformation or ‘standardising’ a normal distribution
It is possible to transform any normal distribution into the standard normal distribution (mean = 0, standard deviation = 1) as follows: z = (x − μ) / σ. The advantage of this transformation is that we only have to tabulate the standard normal distribution to be able to look up critical values and/or P-values for test statistics.
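A simulation sketch of the transformation, assuming made-up parameters μ = 50 and σ = 10: standardised values should have mean near 0 and standard deviation near 1.

```python
import random
import statistics

# Draw observations from an assumed N(mu, sigma^2) and standardise them
# with z = (x - mu) / sigma.
random.seed(0)
mu, sigma = 50.0, 10.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]
zs = [(x - mu) / sigma for x in xs]
print(statistics.mean(zs), statistics.pstdev(zs))   # close to 0 and 1
```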
11
Example: If X is normally distributed with mean μ and standard deviation σ, and we wish to calculate the probability that a random observation of X is positive, then P(X > 0) = P(Z > (0 − μ)/σ) = P(Z > −μ/σ), which can be read off the standard normal tables.
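A sketch of this calculation with assumed parameters (μ = 1, σ = 2 are illustrative values only, not the ones from the original slide).

```python
from math import erf, sqrt

# Suppose X ~ N(mu, sigma^2); then P(X > 0) = P(Z > (0 - mu)/sigma).
def standard_normal_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 1.0, 2.0
z = (0.0 - mu) / sigma                  # standardise the cut-off point 0
print(1 - standard_normal_cdf(z))       # P(X > 0) is about 0.6915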
12
The area under the PDF to the right of a particular value is
known as the P-Value. For example, for the standard normal distribution the P-Value associated with a value z is P(Z > z) = 1 − P(Z ≤ z), which can be looked up from the standard normal tables.
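A sketch of that upper-tail calculation; z = 1.96 is just an example test-statistic value.

```python
from math import erf, sqrt

# The P-Value of a test statistic z under the standard normal distribution
# is the upper-tail area P(Z > z).
def standard_normal_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

z = 1.96
print(1 - standard_normal_cdf(z))   # about 0.025
```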