Appendix A: Probability Theory
Consider the deterministic process x = f(z), where x is the observable variable and z is an unobservable variable, e.g., F = f(a), where F is force and a is acceleration. In practice the process inevitably involves uncertainty, which may result from error, imprecision, noise, incompleteness, distortion, etc. Probability theory provides a mathematical framework for dealing with processes under uncertainty.
A.1 Probabilistic Models
-- A probabilistic model is a mathematical description of an uncertain situation. Two important ingredients of a probabilistic model: the sample space and the probability law.
Random experiment: an experiment whose outcome is not predictable with certainty
Sample space S: the set of all possible outcomes
Event E: any subset of S
Probability p: frequency, degree of belief, or degree of difficulty
Probability law P(E): the proportion of time that the outcome is in E
Example: A coin is tossed 3 times, where the probability of a head on an individual toss is p.
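As a minimal sketch of these ingredients (assuming Python and, purely for illustration, a fair coin with p = 0.5), the sample space of three tosses and the probability law of an event can be enumerated directly:

```python
from itertools import product

p = 0.5  # probability of a head on one toss (assumed value for illustration)

# Sample space S: all 2**3 = 8 sequences of three tosses.
S = list(product("HT", repeat=3))

def prob(outcome, p):
    """Probability of one outcome, assuming independent tosses."""
    result = 1.0
    for toss in outcome:
        result *= p if toss == "H" else 1 - p
    return result

# Event E: "at least two heads" -- a subset of S.
E = [w for w in S if w.count("H") >= 2]
P_E = sum(prob(w, p) for w in E)
```

For the fair coin, the event "at least two heads" contains 4 of the 8 equally likely outcomes, so P(E) = 1/2.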
A.1.1 Axioms of Probability
i) Nonnegativity: P(E) >= 0 for every event E.
ii) Normalization: P(S) = 1.
iii) Additivity: if E and F are disjoint events, P(E U F) = P(E) + P(F).
A.1.2 Conditional Probability
P(E|F) = P(E n F) / P(F), provided P(F) > 0 -- the proportion of the probability of F that is shared with E.
Bayes' formula: Let F1, F2, ..., Fn partition S. Then
P(Fj|E) = P(E n Fj) / P(E) = P(E|Fj) P(Fj) / P(E).
Total probability theorem:
P(E) = sum_i P(E|Fi) P(Fi),
so Bayes' formula becomes
P(Fj|E) = P(E|Fj) P(Fj) / sum_i P(E|Fi) P(Fi).
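A minimal numeric sketch of the total probability theorem and Bayes' formula (the prior and likelihood numbers below are assumed, chosen only for illustration):

```python
# A partition F1, F2, F3 of S with prior probabilities P(F_i),
# and the conditional probabilities P(E | F_i) -- all assumed numbers.
prior = [0.5, 0.3, 0.2]        # P(F_i), sums to 1
likelihood = [0.1, 0.6, 0.4]   # P(E | F_i)

# Total probability theorem: P(E) = sum_i P(E|F_i) P(F_i)
P_E = sum(l * pr for l, pr in zip(likelihood, prior))

# Bayes' formula: P(F_j | E) = P(E|F_j) P(F_j) / P(E)
posterior = [l * pr / P_E for l, pr in zip(likelihood, prior)]
```

The posterior probabilities always sum to 1, since P(E) in the denominator is exactly the sum of the numerators.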
Multiplication rule:
P(E1 n E2 n ... n En) = P(E1) P(E2|E1) P(E3|E1 n E2) ... P(En|E1 n ... n En-1).
A.2 Random Variables
A random variable is a function that assigns a number to each outcome.
Example: A coin is tossed 3 times. Define X as the random variable denoting the net winnings when a head wins $1 and a tail loses $1.
A.2.1 Probability Distribution and Density Functions
The probability distribution function (PDF) of a random variable X is, for a real number a,
F(a) = P(X <= a).
Continuous X: probability density function f, with F(a) = integral of f(x) dx from -infinity to a.
Discrete X: probability mass function p, with p(a) = P(X = a).
Example: A coin is tossed 3 times. Let X be the number of heads; for a fair coin, X takes the values 0, 1, 2, 3 with probabilities 1/8, 3/8, 3/8, 1/8.
A.2.2 Joint Distribution and Density Functions
-- involving multiple random variables
Joint distribution: F(x, y) = P(X <= x, Y <= y).
Marginalizing: sum (or integrate) over the free variable.
Marginal distributions: F_X(x) = F(x, infinity), F_Y(y) = F(infinity, y).
Marginal densities: p_X(x) = sum_y p(x, y) (discrete); f_X(x) = integral of f(x, y) dy (continuous).
A.2.3 Conditional Distributions
Conditional density: p(x|y) = p(x, y) / p(y).
Multiplication rule: p(x, y) = p(x|y) p(y) = p(y|x) p(x).
If X and Y are independent, p(x, y) = p(x) p(y), so p(x|y) = p(x).
A.2.4 Bayes' Rule
p(y|x) = p(x|y) p(y) / p(x).
Example: Let y: going on summer vacation, x: having a suntan.
Causal relation p(x|y): the probability that someone who is known to have gone on summer vacation has a suntan.
Bayes' rule inverts this dependency into the diagnostic (predictive) relation p(y|x): the probability that someone who is known to have a suntan has gone on summer vacation.
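A minimal numeric sketch of this inversion (all probabilities below are assumed values, chosen only to make the example concrete):

```python
# Assumed numbers for illustration.
p_y = 0.3              # P(y): prior probability of going on summer vacation
p_x_given_y = 0.9      # P(x|y): causal direction -- suntan given vacation
p_x_given_not_y = 0.2  # P(x|not y): suntan without a vacation

# Total probability: P(x) = P(x|y) P(y) + P(x|not y) P(not y)
p_x = p_x_given_y * p_y + p_x_given_not_y * (1 - p_y)

# Bayes' rule inverts the causal relation into the diagnostic one:
p_y_given_x = p_x_given_y * p_y / p_x
```

With these numbers the diagnostic probability p(y|x) is roughly 0.66, noticeably higher than the prior p(y) = 0.3: observing the suntan makes the vacation more likely.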
A.2.5 Expectation
The expectation (expected value, mean) of X is the average value of X in a large number of experiments:
E[X] = sum_x x p(x) (discrete), E[X] = integral of x f(x) dx (continuous).
Properties: e.g., E[aX + b] = a E[X] + b, and E[X + Y] = E[X] + E[Y].
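These properties can be checked on the coin example above (a minimal sketch; the constants a and b are arbitrary assumed values):

```python
# X = net winnings after 3 fair tosses (+$1 per head, -$1 per tail).
# The number of heads k has probabilities 1/8, 3/8, 3/8, 1/8, and winnings = 2k - 3.
pmf = {-3: 1/8, -1: 3/8, 1: 3/8, 3: 1/8}

def expect(pmf):
    """E[X] = sum over values x of x * p(x)."""
    return sum(x * p for x, p in pmf.items())

E_X = expect(pmf)  # 0 for a fair coin: the game is fair

# Linearity: E[aX + b] = a E[X] + b (a, b arbitrary constants)
a, b = 2.0, 5.0
E_aXb = sum((a * x + b) * p for x, p in pmf.items())
```

Computing E[aX + b] directly from the pmf gives exactly a E[X] + b, as the linearity property predicts.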
A.2.6 Variance -- how much X varies around the expected value:
Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2.
Properties of variance: Var(aX + b) = a^2 Var(X).
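A minimal sketch verifying Var(X) = E[X^2] - (E[X])^2 and Var(aX + b) = a^2 Var(X) on the three-toss example (the constants a and b are assumed values):

```python
# X = number of heads in 3 fair tosses.
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

E_X = sum(x * p for x, p in pmf.items())       # 1.5
E_X2 = sum(x * x * p for x, p in pmf.items())  # 3.0
var_X = E_X2 - E_X ** 2                        # 0.75 = 3 * (1/2) * (1/2)

# Var(aX + b) = a^2 Var(X): the shift b does not change the spread.
a, b = 2.0, 7.0
E_Y = a * E_X + b
var_Y = sum((a * x + b - E_Y) ** 2 * p for x, p in pmf.items())
```

Note that var_Y comes out to a^2 * var_X regardless of b, since adding a constant shifts every value and the mean by the same amount.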
Covariance:
Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X] E[Y].
Properties of covariance:
i) Cov(X, X) = Var(X)
ii) Cov(X, Y) = Cov(Y, X)
iii) Cov(aX, Y) = a Cov(X, Y)
iv) Cov(X + Z, Y) = Cov(X, Y) + Cov(Z, Y)
v) If X and Y are independent, Cov(X, Y) = E[XY] - E[X] E[Y] = 0.
vi) Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
vii) If X1, ..., Xn are independent, the covariance of any two of them is zero, so Var(X1 + ... + Xn) = Var(X1) + ... + Var(Xn).
If the occurrence of X makes Y more likely to occur, the covariance is positive. If the occurrence of X makes Y less likely to occur, the covariance is negative.
Correlation: Corr(X, Y) = Cov(X, Y) / (sqrt(Var(X)) sqrt(Var(Y))), which lies in [-1, 1].
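A minimal sketch computing covariance and correlation from a joint pmf (the joint probabilities are assumed numbers, chosen so that X occurring makes Y more likely):

```python
# Joint pmf of (X, Y) on {0,1} x {0,1}; X = 1 makes Y = 1 more likely.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

cov = E_XY - E_X * E_Y  # positive, as expected for this joint pmf

var_X = sum((x - E_X) ** 2 * p for (x, y), p in joint.items())
var_Y = sum((y - E_Y) ** 2 * p for (x, y), p in joint.items())
corr = cov / (var_X ** 0.5 * var_Y ** 0.5)
```

Here both marginals are Bernoulli(0.5), the covariance is 0.4 - 0.25 = 0.15, and the correlation is 0.6.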
A.2.7 Weak Law of Large Numbers
Let X1, X2, ..., Xn be independent, identically distributed random variables with mean mu and variance sigma^2, and let Xbar = (X1 + ... + Xn) / n. Then for any eps > 0,
P(|Xbar - mu| >= eps) -> 0 as n -> infinity.
By the Chebyshev inequality,
P(|Xbar - mu| >= eps) <= Var(Xbar) / eps^2 = sigma^2 / (n eps^2),
which tends to 0 as n -> infinity.
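The bound can be checked empirically; a minimal sketch with fair-coin indicators (the sample size, tolerance, and random seed below are assumed values):

```python
import random

random.seed(0)

def sample_mean(n):
    """Mean of n fair-coin indicators: mu = 0.5, sigma^2 = 0.25."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

mu, var = 0.5, 0.25
eps = 0.05
n = 10_000

# Chebyshev bound on the sample mean: P(|Xbar - mu| >= eps) <= var / (n * eps^2)
bound = var / (n * eps ** 2)  # 0.01 here -- deviations of 0.05 are already rare
xbar = sample_mean(n)
```

With n = 10,000 the bound is 0.01, and a single run of the experiment lands well within eps of mu, consistent with the weak law.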
A.3 Special Random Variables
A.3.1 Discrete Distributions
Bernoulli Distribution
Random variable X takes the values 0/1. Let p be the probability that X = 1.
Bernoulli density: P(X = x) = p^x (1 - p)^(1 - x), x in {0, 1}.
Binomial Distribution
-- N identical independent Bernoulli trials (0/1); random variable X represents the number of 1s.
Binomial density: P(X = k) = C(N, k) p^k (1 - p)^(N - k), k = 0, 1, ..., N.
Multinomial Distribution
-- N identical independent trials, each taking one of K states (a binomial trial takes one of 2 states, 0/1).
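A minimal sketch of the binomial density (using the standard-library binomial coefficient):

```python
from math import comb

def binomial_pmf(k, N, p):
    """P(X = k) for X ~ Binomial(N, p): C(N, k) p^k (1 - p)^(N - k)."""
    return comb(N, k) * p ** k * (1 - p) ** (N - k)

# Sanity checks: the pmf sums to 1, and N = 3, p = 0.5
# reproduces the three-toss coin example (P(2 heads) = 3/8).
total = sum(binomial_pmf(k, 3, 0.5) for k in range(4))
```

`math.comb` requires Python 3.8+; on older versions the coefficient can be computed as factorial(N) // (factorial(k) * factorial(N - k)).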
Geometric Distribution
-- rv X represents the number of tosses needed for a head to come up for the first time.
Geometric density: P(X = k) = (1 - p)^(k - 1) p, k = 1, 2, ...
Poisson Distribution
-- rv X represents, e.g., i) the number of typos in a book with a total of n words, ii) the number of cars involved in accidents in a city.
Poisson density: P(X = k) = e^(-lambda) lambda^k / k!, k = 0, 1, 2, ...
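A minimal sketch of both densities, including the classical fact that the Poisson approximates a Binomial(n, p) with lambda = n*p when n is large and p is small (the numbers n = 1000, p = 0.002 are assumed values):

```python
from math import comb, exp, factorial

def geometric_pmf(k, p):
    """P(first head on toss k) = (1 - p)^(k - 1) * p, k = 1, 2, ..."""
    return (1 - p) ** (k - 1) * p

def poisson_pmf(k, lam):
    """P(X = k) = e^(-lam) lam^k / k!"""
    return exp(-lam) * lam ** k / factorial(k)

# Poisson(2) vs Binomial(1000, 0.002) at k = 3: the two should nearly agree.
binom_3 = comb(1000, 3) * 0.002 ** 3 * 0.998 ** 997
approx_err = abs(poisson_pmf(3, 2.0) - binom_3)
```

Both pmfs sum (over their supports) to 1, and the Poisson/binomial discrepancy at k = 3 is on the order of 10^-4 for these parameters.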
Beta Distribution
Beta density: f(x) = [Gamma(alpha + beta) / (Gamma(alpha) Gamma(beta))] x^(alpha - 1) (1 - x)^(beta - 1), 0 <= x <= 1.
Dirichlet Distribution
-- the multivariate generalization of the beta distribution to K components q1, ..., qK with qi >= 0 and sum_i qi = 1:
f(q1, ..., qK) = [Gamma(sum_i alpha_i) / prod_i Gamma(alpha_i)] prod_i qi^(alpha_i - 1).
A.3.2 Continuous Distributions
Uniform Distribution -- X takes values over the interval [a, b]:
f(x) = 1 / (b - a) for a <= x <= b, and 0 otherwise.
Normal (Gaussian) Distribution
Normal density: f(x) = (1 / (sqrt(2 pi) sigma)) exp(-(x - mu)^2 / (2 sigma^2)), written X ~ N(mu, sigma^2).
Exponential density: f(x) = lambda e^(-lambda x), x >= 0.
Rayleigh density: f(x) = (x / sigma^2) e^(-x^2 / (2 sigma^2)), x >= 0.
Erlang (gamma) density: f(x) = lambda e^(-lambda x) (lambda x)^(k - 1) / (k - 1)!, x >= 0.
Z-normalization: if X has mean mu and standard deviation sigma, then Z = (X - mu) / sigma has mean 0 and standard deviation 1; in particular, for X ~ N(mu, sigma^2), Z ~ N(0, 1).
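A minimal sketch of z-normalization on a small sample (the data values are assumed, chosen so the population mean and standard deviation come out to round numbers):

```python
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # assumed sample

n = len(data)
mu = sum(data) / n                                   # 5.0
sigma = (sum((x - mu) ** 2 for x in data) / n) ** 0.5  # 2.0 (population sd)

# After z-normalization the sample has mean 0 and standard deviation 1.
z = [(x - mu) / sigma for x in data]
```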
Theorem 1: Let f be a differentiable, strictly increasing or strictly decreasing function defined on an interval I. Let X be a continuous random variable taking values in I with density g. Then Y = f(X) has density
h(y) = g(f^(-1)(y)) |d f^(-1)(y) / dy|.
Proof sketch: Let G and H be the distribution functions of X and Y.
(a) f strictly increasing: H(y) = P(f(X) <= y) = P(X <= f^(-1)(y)) = G(f^(-1)(y)); differentiating gives h(y) = g(f^(-1)(y)) d f^(-1)(y)/dy, and the derivative is positive.
(b) f strictly decreasing: H(y) = P(f(X) <= y) = P(X >= f^(-1)(y)) = 1 - G(f^(-1)(y)); differentiating gives h(y) = -g(f^(-1)(y)) d f^(-1)(y)/dy, and the derivative is negative. Both cases combine into the absolute-value form above.
Theorem 2: Let X1, ..., Xn be independent random variables having the respective normal densities N(mu_i, sigma_i^2). Then X1 + ... + Xn has the normal density N(sum_i mu_i, sum_i sigma_i^2).
Proof sketch: Assume first that n = 2. Since X1 and X2 are independent, the density of X1 + X2 is the convolution of the two normal densities, which works out to a normal density with mean mu_1 + mu_2 and variance sigma_1^2 + sigma_2^2. The general case follows by induction.
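Theorem 2 can be checked by simulation; a minimal sketch (the two means, two standard deviations, sample size, and seed are assumed values):

```python
import random

random.seed(1)

mu1, s1 = 1.0, 2.0    # X1 ~ N(1, 4)
mu2, s2 = -3.0, 1.5   # X2 ~ N(-3, 2.25)
n = 200_000

# Sample the sum X1 + X2 many times.
z = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(n)]

mean_z = sum(z) / n
var_z = sum((x - mean_z) ** 2 for x in z) / n
# Theorem 2 predicts mean mu1 + mu2 = -2 and variance s1^2 + s2^2 = 6.25.
```

The empirical mean and variance of the sum land very close to the theoretical -2 and 6.25.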
Theorem 3 (Central Limit Theorem): Let X1, ..., Xn be independent, identically distributed random variables with mean mu and variance sigma^2. Then as n -> infinity, the distribution of (X1 + ... + Xn - n mu) / (sigma sqrt(n)) tends to the standard normal N(0, 1).
Example: computer generation of a normal distribution -- sum several independent uniform random numbers and standardize the sum.
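A minimal sketch of the classical trick: a Uniform(0, 1) variable has mean 1/2 and variance 1/12, so the sum of 12 of them, minus 6, has mean 0 and variance 1 and is approximately N(0, 1) by the central limit theorem (sample size and seed are assumed values):

```python
import random

random.seed(2)

def approx_gauss():
    """Sum of 12 Uniform(0,1) samples minus 6: mean 0, variance 12 * (1/12) = 1."""
    return sum(random.random() for _ in range(12)) - 6.0

n = 100_000
samples = [approx_gauss() for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
```

The empirical mean and variance come out near 0 and 1; the approximation's tails are truncated at +-6, which is usually acceptable for simple simulations.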
[Figure: input image with gray values of 100, the same image with additive Gaussian noise, and the resulting histogram.]
A.3.3 Chi-Square Distribution
Gamma density: f(x) = lambda e^(-lambda x) (lambda x)^(alpha - 1) / Gamma(alpha), x >= 0.
Chi-square density: the gamma density with lambda = 1/2 and alpha = n/2, where n is the number of degrees of freedom; it is the distribution of the sum of n squared independent N(0, 1) random variables.
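A minimal simulation sketch of the chi-square's characterization as a sum of squared standard normals; a chi-square with k degrees of freedom has mean k and variance 2k (degrees of freedom, sample size, and seed are assumed values):

```python
import random

random.seed(3)

k = 4           # degrees of freedom
n = 100_000

# A chi-square rv with k degrees of freedom is the sum of k squared N(0,1)'s.
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(n)]

mean = sum(samples) / n                            # theory: k = 4
var = sum((x - mean) ** 2 for x in samples) / n    # theory: 2k = 8
```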
A.3.4 t Distribution
t density: with n degrees of freedom,
f(x) = [Gamma((n + 1)/2) / (sqrt(n pi) Gamma(n/2))] (1 + x^2/n)^(-(n + 1)/2);
it is the distribution of Z / sqrt(X/n), where Z ~ N(0, 1) and X is chi-square with n degrees of freedom, independent of Z.
A.3.5 F Distribution
F density: the distribution of (X1/n1) / (X2/n2), where X1 and X2 are independent chi-square random variables with n1 and n2 degrees of freedom.