1
Lecture 1.31: Criteria for optimal reception of radio signals.
Maximum Likelihood Detector; Probability of Error.
2
Bayesian Decision Theory
Bayesian decision theory is a fundamental statistical approach to the problem of pattern classification. It quantifies the tradeoffs between various classification decisions using probability and the costs that accompany these decisions. Assume all relevant probability distributions are known (later we will learn how to estimate these from data). Can we exploit prior knowledge in our signal classification problem?
Is the sequence of signals predictable? (statistics)
Is each class equally probable? (uniform priors)
What is the cost of an error? (risk, optimization)
3
Prior Probabilities
The state of nature is prior information. Model it as a random variable ω:
ω = ω1: the event that the next signal is a "zero"
Category 1: "zero"; category 2: "one"
P(ω1) = probability of category 1
P(ω2) = probability of category 2
P(ω1) + P(ω2) = 1
Exclusivity: ω1 and ω2 share no basic events
Exhaustivity: the union of all outcomes is the sample space (either ω1 or ω2 must occur)
If all incorrect classifications have an equal cost:
Decide ω1 if P(ω1) > P(ω2); otherwise, decide ω2 (see the sketch below).
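A minimal Python sketch of this prior-only rule; the category names and prior values are illustrative assumptions, not from the slides:

```python
# Prior-only Bayes decision: with no observation, pick the most probable class.
priors = {"zero": 0.6, "one": 0.4}  # assumed example values; must sum to 1

def decide_by_prior(p):
    """Return the category with the largest prior probability."""
    return max(p, key=p.get)

print(decide_by_prior(priors))  # -> 'zero'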
4
Probability Functions
A probability density function is denoted in lowercase and represents a function of a continuous variable. px(x|ω), often abbreviated as p(x), denotes a probability density function for the random variable x. Note that px(x|ω) and py(y|ω) can be two different functions. P(x|ω) denotes a probability mass function, and must obey the following constraints:
0 ≤ P(x|ω) ≤ 1 and Σx P(x|ω) = 1
Probability mass functions are typically used for discrete random variables, while densities describe continuous random variables (the latter must be integrated).
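A short sketch verifying these constraints numerically; the pmf values and the use of a standard Gaussian density are illustrative assumptions:

```python
import numpy as np

# Probability mass function: each value in [0, 1], values summing to 1.
pmf = np.array([0.2, 0.5, 0.3])
assert np.all((pmf >= 0) & (pmf <= 1)) and np.isclose(pmf.sum(), 1.0)

# Probability density function: non-negative, integrating to 1.
x = np.linspace(-10.0, 10.0, 10001)
pdf = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard Gaussian density
assert np.isclose(np.trapz(pdf, x), 1.0, atol=1e-6)
print("constraints satisfied")
```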
5
Bayes Formula
Suppose we know both P(ωj) and p(x|ωj), and we can measure x. How does this influence our decision? The joint probability of finding a pattern that is in category ωj and that this pattern has a feature value of x is:
p(ωj, x) = P(ωj|x) p(x) = p(x|ωj) P(ωj)
Rearranging terms, we arrive at Bayes formula:
P(ωj|x) = p(x|ωj) P(ωj) / p(x)
where, in the case of two categories:
p(x) = p(x|ω1) P(ω1) + p(x|ω2) P(ω2)
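A minimal sketch of this computation for two classes; the Gaussian class-conditional densities, prior values, and signal levels are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

priors = np.array([0.5, 0.5])           # P(w1), P(w2), assumed values
means, sigma = np.array([-1.0, 1.0]), 1.0  # assumed class-conditional Gaussians

def posterior(x):
    """P(w_j | x) = p(x | w_j) P(w_j) / p(x), with p(x) the total evidence."""
    likelihoods = norm.pdf(x, loc=means, scale=sigma)  # p(x | w_j)
    joint = likelihoods * priors                       # p(w_j, x)
    return joint / joint.sum()                         # normalize by evidence

print(posterior(0.3))  # posterior probabilities of the two categories at x = 0.3
```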
6
Posterior Probabilities
Bayes formula can be expressed in words as:
posterior = (likelihood × prior) / evidence
By measuring x, we can convert the prior probability P(ωj) into a posterior probability P(ωj|x). The evidence p(x) can be viewed as a scale factor and is often ignored in optimization applications.
7
Bayes Decision Rule
Decision rule: for an observation x, decide ω1 if P(ω1|x) > P(ω2|x); otherwise, decide ω2.
Probability of error: P(error|x) = min[ P(ω1|x), P(ω2|x) ]
The average probability of error is given by:
P(error) = ∫ P(error|x) p(x) dx
If for every x we ensure that P(error|x) is as small as possible, then the integral is as small as possible. Thus, Bayes decision rule minimizes P(error).
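A sketch evaluating this error integral numerically for the same illustrative two-Gaussian setup as above (all parameter values are assumptions):

```python
import numpy as np
from scipy.stats import norm

priors = np.array([0.5, 0.5])              # assumed priors
means, sigma = np.array([-1.0, 1.0]), 1.0  # assumed class-conditional Gaussians

# P(error | x) = min(P(w1|x), P(w2|x)); averaging over p(x) gives P(error).
x = np.linspace(-8, 8, 20001)
likelihoods = norm.pdf(x[:, None], loc=means, scale=sigma)  # one column per class
joint = likelihoods * priors                                # p(w_j, x)
evidence = joint.sum(axis=1)                                # p(x)
p_err_given_x = joint.min(axis=1) / evidence
print(np.trapz(p_err_given_x * evidence, x))  # average probability of error ~0.159
```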
8
Detection
The matched filter reduces the received signal to a single variable z(T), after which detection of the symbol is carried out. The concept of the maximum likelihood detector is based on statistical decision theory. It allows us to formulate a decision rule that operates on the data and optimizes the detection criterion.
9
Likelihood
Suppose the data x have density p(x|θ). The likelihood L(θ) = p(x|θ) is the probability of the observed data, viewed as a function of the parameters θ.
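A minimal sketch of a likelihood as a function of a parameter; the Gaussian model, the observations, and the grid search are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

data = np.array([0.9, 1.3, 0.7])  # assumed observations

def likelihood(theta, sigma=1.0):
    """L(theta) = product over i of p(x_i | theta) for a Gaussian model."""
    return np.prod(norm.pdf(data, loc=theta, scale=sigma))

thetas = np.linspace(-2, 4, 601)
best = thetas[np.argmax([likelihood(t) for t in thetas])]
print(best)  # maximum-likelihood estimate, here ~ mean(data)
```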
10
Probabilities Review
P[s0], P[s1]: a priori probabilities. These probabilities are known before transmission.
P[z]: probability of the received sample.
p(z|s0), p(z|s1): conditional pdfs of the received signal z, conditioned on the class si.
P[s0|z], P[s1|z]: a posteriori probabilities. After examining the sample, we refine our previous knowledge.
P[s1|s0], P[s0|s1]: wrong decisions (errors).
P[s1|s1], P[s0|s0]: correct decisions.
11
How to Choose the Threshold?
Maximum likelihood ratio test and maximum a posteriori (MAP) criterion:
Decide s1 if P(s1|z) > P(s0|z); otherwise, decide s0.
The problem is that the a posteriori probabilities are not known. Solution: use Bayes' theorem:
P(si|z) = p(z|si) P(si) / p(z)
so the test becomes: decide s1 if p(z|s1) P(s1) > p(z|s0) P(s0); otherwise, decide s0.
For equally likely antipodal signals this means that if the received sample is positive, we decide s1(t) was sent; otherwise, we decide s0(t) was sent.
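A minimal sketch of the MAP test after applying Bayes' theorem; the signal levels, noise spread, and priors are illustrative assumptions:

```python
from scipy.stats import norm

# MAP test: decide s1 if p(z|s1) P(s1) > p(z|s0) P(s0), else decide s0.
a0, a1, sigma = -1.0, 1.0, 1.0   # assumed antipodal levels and noise spread
P_s0, P_s1 = 0.5, 0.5            # assumed equal priors

def map_decide(z):
    score1 = norm.pdf(z, loc=a1, scale=sigma) * P_s1
    score0 = norm.pdf(z, loc=a0, scale=sigma) * P_s0
    return "s1" if score1 > score0 else "s0"

print(map_decide(0.2))  # equal priors + antipodal levels: a sign test -> 's1'
```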
12
Likelihoods of s0 and s1
13
Likelihood Ratio Test
MAP criterion: when the two signals, s0(t) and s1(t), are equally likely, i.e., P(s0) = P(s1) = 0.5, the decision rule becomes
Λ(z) = p(z|s1) / p(z|s0) ≷ 1   (decide s1 if > 1, s0 if < 1)
This is known as the maximum likelihood ratio test because we select the hypothesis that corresponds to the signal with the maximum likelihood. In terms of the Bayes criterion, it implies that the costs of both types of error are the same.
14
Likelihood Ratio Test (cont'd)
Substituting the Gaussian pdfs
p(z|si) = (1 / (σ0 √(2π))) exp( -(z - ai)² / (2σ0²) ),  i = 0, 1
into the likelihood ratio gives
Λ(z) = exp( z(a1 - a0)/σ0² - (a1² - a0²)/(2σ0²) ) ≷ 1
15
Likelihood Ratio Test (cont'd)
Hence, taking the log of both sides gives
ln Λ(z) = z(a1 - a0)/σ0² - (a1² - a0²)/(2σ0²) ≷ 0
which simplifies to
z ≷ (a1 + a0)/2   (assuming a1 > a0)
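A sketch checking that the log-likelihood-ratio test and the simple threshold test agree; the signal levels and noise variance are illustrative assumptions:

```python
# Log-likelihood ratio for equal-variance Gaussians reduces to a threshold test:
# decide s1 if z > gamma0 = (a1 + a0) / 2 (for a1 > a0).
a0, a1, sigma0 = 0.0, 2.0, 1.0   # assumed values
gamma0 = (a1 + a0) / 2

def llr(z):
    """ln[p(z|s1)/p(z|s0)] for Gaussian densities N(a_i, sigma0^2)."""
    return (z * (a1 - a0) - (a1**2 - a0**2) / 2) / sigma0**2

z = 1.3
assert (llr(z) > 0) == (z > gamma0)  # the two forms of the test agree
print(gamma0)                        # -> 1.0
```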
16
Types of Error
Type I error: rejecting H0 when H0 is true.
Type II error: accepting H0 when H0 is false.
17
Probability of Error
An error will occur if:
s1 is sent but s0 is decided: P(e|s1) = ∫ from -∞ to γ0 of p(z|s1) dz
s0 is sent but s1 is decided: P(e|s0) = ∫ from γ0 to ∞ of p(z|s0) dz
The total probability of error is the sum of these errors weighted by the priors:
P(e) = P(s1) P(e|s1) + P(s0) P(e|s0)
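A sketch evaluating this total error probability for Gaussian conditional pdfs; the signal levels, noise spread, and priors are illustrative assumptions:

```python
from scipy.stats import norm

a0, a1, sigma0 = 0.0, 2.0, 1.0   # assumed signal levels and noise spread
P_s0 = P_s1 = 0.5                # assumed equal priors
gamma0 = (a1 + a0) / 2           # optimum threshold for equal priors

p_e_given_s1 = norm.cdf(gamma0, loc=a1, scale=sigma0)  # area below the threshold
p_e_given_s0 = norm.sf(gamma0, loc=a0, scale=sigma0)   # area above the threshold
print(P_s1 * p_e_given_s1 + P_s0 * p_e_given_s0)       # ~0.1587 = Q(1)
```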
18
Likelihood Ratio Test (cont'd)
Hence the test reduces to
z ≷ γ0 = (a1 + a0)/2
where comparing z against γ0 is the minimum-error criterion and γ0 is the optimum threshold. For antipodal signals, s1(t) = -s0(t), so a1 = -a0 and γ0 = 0.
19
Probability of Error (cont'd)
If the signals are equally probable, P(s0) = P(s1) = 0.5, then by symmetry
PB = ∫ from γ0 to ∞ of p(z|s0) dz
Hence, the probability of bit error PB is the probability that an incorrect hypothesis is made. Numerically, PB is the area under the tail of either of the conditional distributions, p(z|s1) or p(z|s0).
20
Probability of Error (cont'd)
The above integral cannot be evaluated in closed form; it is expressed through the Q-function. Hence,
PB = Q( (a1 - a0) / (2σ0) ),  where  Q(x) = (1/√(2π)) ∫ from x to ∞ of exp(-u²/2) du
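A minimal sketch of this result; Q(x) is computed through the standard identity Q(x) = erfc(x/√2)/2, and the signal levels below are illustrative assumptions:

```python
import math

def Q(x):
    """Gaussian tail probability: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

a0, a1, sigma0 = 0.0, 2.0, 1.0      # assumed values
print(Q((a1 - a0) / (2 * sigma0)))  # P_B = Q(1) ~ 0.1587
```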
21
Co-error Function
Q(x) is called the complementary error function or co-error function. It is a commonly used symbol for the probability under the tail of the Gaussian pdf. An approximation for Q(x), valid for x > 3, is:
Q(x) ≈ (1 / (x√(2π))) exp(-x²/2)
Q(x) is also presented in tabular form; a numeric comparison follows below.
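A sketch comparing the exact Q-function with this tail approximation, assuming the slide's approximation is the standard large-x formula shown above:

```python
import math

def Q_exact(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

def Q_approx(x):
    """Tail approximation, reasonable for x > 3: exp(-x^2/2) / (x * sqrt(2*pi))."""
    return math.exp(-x**2 / 2) / (x * math.sqrt(2 * math.pi))

for x in (3.0, 4.0, 5.0):
    print(x, Q_exact(x), Q_approx(x))  # the two agree to within ~10% and improve with x
```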
22
Co-error Table
23
Important Observation
To minimize PB, we need to maximize (a1 - a0)/(2σ0), or equivalently (a1 - a0)²/σ0², where (a1 - a0) is the difference of the desired signal components at the filter output at t = T. The square of this difference signal divided by the noise variance is the instantaneous power of the difference signal relative to the noise, i.e., a signal-to-noise ratio (see the sketch below).
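A small sketch showing PB falling monotonically as this SNR-like argument grows; the argument values are illustrative:

```python
import math

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

# P_B = Q((a1 - a0) / (2*sigma0)) decreases as the difference-signal SNR grows.
for arg in (0.5, 1.0, 2.0, 3.0):
    print(arg, Q(arg))
```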
24
Neyman-Pearson Criterion
Consider a two-class problem. The following four probabilities can be computed:
Probability of detection (hit)
Probability of false alarm
Probability of miss
Probability of correct rejection
We do not know the prior probabilities, so Bayes-optimal classification is not possible. However, we can require that the probability of false alarm stay below a specified level α. Based on this constraint (the Neyman-Pearson criterion) we can design a classifier; a sketch follows below. Observation: maximizing the probability of detection and minimizing the probability of false alarm are conflicting goals (in general).
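A minimal Neyman-Pearson sketch for two Gaussian classes: fix the false-alarm rate at α, derive the threshold from the noise-only tail, and read off the detection rate. The class parameters and α are illustrative assumptions:

```python
from scipy.stats import norm

a0, a1, sigma = 0.0, 2.0, 1.0   # assumed class means and common spread
alpha = 0.05                    # assumed false-alarm constraint

gamma = norm.isf(alpha, loc=a0, scale=sigma)    # P(z > gamma | s0) = alpha
P_detect = norm.sf(gamma, loc=a1, scale=sigma)  # resulting detection probability
print(gamma, P_detect)                          # ~1.645, ~0.639
```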
25
Receiver Operating Characteristics
ROC is a plot of probability of false alarm vs. probability of detection. (Figure: ROC curves for two classifiers.) The area under the ROC curve is a measure of performance. It is also used to find an operating point for the classifier.
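A sketch tracing the ROC curve by sweeping the decision threshold for two Gaussian classes; the class parameters are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

a0, a1, sigma = 0.0, 2.0, 1.0             # assumed Gaussian classes
thresholds = np.linspace(-4, 6, 201)
pfa = norm.sf(thresholds, loc=a0, scale=sigma)  # probability of false alarm
pd = norm.sf(thresholds, loc=a1, scale=sigma)   # probability of detection

# Area under the ROC curve (reverse so pfa increases along the integration axis).
auc = np.trapz(pd[::-1], pfa[::-1])
print(auc)  # for these classes: ~0.921
```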