2
Uncertainty refers to epistemic (relating to knowledge) situations involving imperfect or unknown information. Data might be missing or unavailable. Data might be present but unreliable or ambiguous. The representation of the data may be imprecise or inconsistent. Data may just be the user's best guess.
3
Probability theory provides a sound, principled method for decision making under uncertainty. In propositional logic, we used variables that could take on the logical values True or False. In probability, we use random variables instead. Probabilities can be learned from data. Each possible assignment has a probability in the range [0, 1] indicating how likely the variable is to take that particular value.
4
In probability we use random variables instead of logical variables. There are three types of random variable: 1. Boolean random variables, which can take on the values true or false. 2. Discrete random variables, which can take on a finite number of values. 3. Continuous random variables, which can take on an infinite number of values.
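The three kinds of random variable above can be illustrated with small sampling functions. This is a minimal sketch; the function names and the die example are illustrative, not from the slides.

```python
import random

def boolean_rv(p_true=0.5):
    """Boolean random variable: takes the values True or False."""
    return random.random() < p_true

def discrete_rv():
    """Discrete random variable: finitely many values, e.g. a die roll."""
    return random.choice([1, 2, 3, 4, 5, 6])

def continuous_rv():
    """Continuous random variable: infinitely many values, here in [0, 1)."""
    return random.random()

print(boolean_rv(), discrete_rv(), continuous_rv())
```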
6
1. P(True) = 1; P(False) = 0. True is certain to be true; False is certain not to be true. 2. 0 ≤ P(A) ≤ 1: the probability of any variable being true lies between 0 and 1, inclusive. 3. P(A ∨ B) = P(A) + P(B) for disjoint events A and B.
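The three axioms can be checked on a toy distribution. This is a sketch with illustrative numbers; the Weather variable and its values are assumptions, not from the slides.

```python
# Toy distribution over a discrete random variable "Weather".
P = {"sunny": 0.6, "rain": 0.3, "snow": 0.1}

# Axiom 2: every probability lies between 0 and 1, inclusive.
assert all(0.0 <= p <= 1.0 for p in P.values())

# Axiom 1 (in distribution form): the probabilities of all
# possible values sum to 1.
assert abs(sum(P.values()) - 1.0) < 1e-9

# Axiom 3: "rain" and "snow" are disjoint events, so
# P(rain or snow) = P(rain) + P(snow).
p_rain_or_snow = P["rain"] + P["snow"]
print(p_rain_or_snow)
```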
7
Conditional probability allows us to reason with partial information. P(A) is called the a priori (or prior) probability of A, and P(A|B) is called the a posteriori (or posterior) probability of A given B.
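Conditional probability is defined as P(A|B) = P(A ⋀ B) / P(B). A minimal numeric sketch, with illustrative probabilities (the numbers are assumptions, not from the slides):

```python
# Illustrative probabilities.
p_b = 0.5        # prior probability of B
p_a_and_b = 0.2  # joint probability of A and B

# Conditional probability: P(A|B) = P(A and B) / P(B).
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)  # 0.4
```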
8
Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning; it determines the probability of an event under uncertain knowledge. Bayes' rule specifies how to combine data and prior knowledge. In probability theory, it relates the conditional and marginal probabilities of two random events. Bayes' theorem is named after the British mathematician Thomas Bayes. Bayesian inference is an application of Bayes' theorem and is fundamental to Bayesian statistics.
9
Bayes' theorem can be derived using the product rule and the conditional probability of event A given a known event B: 1. From the product rule we can write: P(A ⋀ B) = P(A|B) P(B). 2. Similarly, for the probability of event B given a known event A: P(A ⋀ B) = P(B|A) P(A). Equating the right-hand sides of both equations, we get: P(A|B) = P(B|A) P(A) / P(B) ... (a) The above equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis of most modern AI systems for probabilistic inference.
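Bayes' rule can be worked through numerically. The sketch below uses the classic diagnostic-test setting; the disease prior, sensitivity, and false-positive rate are hypothetical numbers chosen for illustration, not from the slides.

```python
# Hypothetical numbers for a diagnostic test.
p_d = 0.01              # prior: P(D), probability of having the disease
p_pos_given_d = 0.9     # sensitivity: P(pos | D)
p_pos_given_not_d = 0.05  # false-positive rate: P(pos | not D)

# Marginal probability of a positive test (law of total probability):
# P(pos) = P(pos|D) P(D) + P(pos|not D) P(not D)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' rule: P(D | pos) = P(pos | D) P(D) / P(pos)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # 0.154
```

Note how the posterior (about 15%) is far below the sensitivity (90%) because the prior is small: this is exactly the combination of data and prior knowledge that Bayes' rule specifies.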
10
Following are some applications of Bayes' theorem: It is used to calculate the probability of a robot's next step given the step it has already executed. Bayes' theorem is helpful in weather forecasting. It can solve the Monty Hall problem.
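The Monty Hall result can also be checked empirically. Below is a Monte Carlo sketch of the standard problem setup (three doors, the host always opens a non-winning, non-chosen door); switching wins about 2/3 of the time, in agreement with the Bayesian analysis.

```python
import random

def play(switch, rng):
    """Play one round of Monty Hall; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(0)  # fixed seed for reproducibility
trials = 100_000
win_rate = sum(play(switch=True, rng=rng) for _ in range(trials)) / trials
print(win_rate)  # close to 2/3
```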