Probabilities

Random Number Generators
- Actually pseudo-random
- Seed: the same seed produces the same sequence
  - Often the current time is used as the seed
- Many examples on the web
- Custom random number generators exist and can be used in algorithms
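A minimal Python sketch of the seeding behavior described above (the seed value 42 is arbitrary):

```python
import random

# Two generators seeded with the same value produce the same sequence.
rng1 = random.Random(42)
rng2 = random.Random(42)
seq1 = [rng1.random() for _ in range(5)]
seq2 = [rng2.random() for _ in range(5)]
assert seq1 == seq2  # same seed -> same sequence

# Seeding with the current time (the common default) gives a different
# sequence on each run.
rng3 = random.Random()  # seeded from system time / OS entropy
```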

Actions Based on Probabilities
- Assign a probability to each action; the probabilities must add to 1.
- Roulette wheel method:
  - The range of the random number generator is the size of the roulette wheel.
  - Give each action a section of the roulette wheel proportional to its probability.
  - Generate a random number. (Run the wheel.)

Example
- P(go straight) = 50%
- P(turn left) = 30%
- P(turn right) = 20%
- Use random numbers from 0 to 1.
- Roulette wheel:
  - 0 ≤ go straight < 0.50
  - 0.50 ≤ turn left < 0.80
  - 0.80 ≤ turn right < 1.00
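The roulette wheel for this example can be sketched in Python; the function name `roulette_choice` and the action labels are illustrative, not from the slides:

```python
import random

def roulette_choice(actions, rng=random):
    """Pick an action from a list of (name, probability) pairs.

    The probabilities are assumed to sum to 1; each action owns a
    section of [0, 1) proportional to its probability.
    """
    r = rng.random()          # spin the wheel: r is in [0, 1)
    cumulative = 0.0
    for name, p in actions:
        cumulative += p
        if r < cumulative:
            return name
    return actions[-1][0]     # guard against floating-point round-off

actions = [("go straight", 0.50), ("turn left", 0.30), ("turn right", 0.20)]
chosen = roulette_choice(actions)
```

Over many spins, each action is chosen with roughly its assigned probability.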

What is Probability?
- Often not well defined.
  - What does a weather forecast of 75% rain mean?
  - Calling your shots (dart board example).
- Interpretations:
  - Counting interpretation
  - Frequency interpretation
  - Subjective interpretation

Some Uses of Probability
- Diagnosis
- Prediction
- Explaining away (water sprinkler example)
- Randomized algorithms
  - for CS in general
  - for games and robotics in particular

Expectation Value
- The expected value of a variable is a kind of average value of the variable.
- It is the sum of each outcome's utility times its probability.
- Used in decision theory.
- Utility may be nonlinear.
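A one-line illustration of the sum of utilities times probabilities; the bet itself (win $10 with probability 0.2, lose $2 with probability 0.8) is a made-up example:

```python
# Each outcome is (utility, probability); the probabilities sum to 1.
outcomes = [(10.0, 0.2), (-2.0, 0.8)]

# Expected value = sum of utility * probability over all outcomes.
expected_utility = sum(utility * p for utility, p in outcomes)
# 10 * 0.2 + (-2) * 0.8 = 2.0 - 1.6 = 0.4
```

In decision theory, the action with the highest expected utility is preferred.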

Assigning Subjective Probability
- Fair bet
- Fair price
- Dutch book argument
  - Avoiding a Dutch book leads to the probability rules

Rules 1
- Values: a probability is a real number between 0 and 1.
- Something happens: P(something) = 1
- Not rule: P(not A) = 1 − P(A)

Rules 2
- Or rules: P(A ∪ B)
  - Exclusive events: P(A) + P(B)
  - Not exclusive: P(A) + P(B) − P(A ∩ B)  (no double counting)
- And rules: P(A ∩ B)
  - Independent events: P(A) P(B)
  - Conditional: P(A | B) P(B) = P(B | A) P(A)
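These rules can be checked by counting outcomes; the sample space (one roll of a fair die) and the events A = "even" and B = "greater than 3" are assumed here for illustration:

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die.
omega = set(range(1, 7))
A = {x for x in omega if x % 2 == 0}   # A: roll is even -> {2, 4, 6}
B = {x for x in omega if x > 3}        # B: roll is greater than 3 -> {4, 5, 6}

def p(event):
    """Counting interpretation: favorable outcomes over total outcomes."""
    return Fraction(len(event), len(omega))

# Or rule (not exclusive): P(A or B) = P(A) + P(B) - P(A and B)
or_rule_holds = p(A | B) == p(A) + p(B) - p(A & B)

# And rule (conditional): P(A and B) = P(A | B) P(B),
# where P(A | B) = |A and B| / |B|.
p_A_given_B = Fraction(len(A & B), len(B))
and_rule_holds = p(A & B) == p_A_given_B * p(B)
```

Using `Fraction` keeps the comparisons exact instead of relying on floating-point equality.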

Bayes Theorem
- P(A | B) = P(B | A) P(A) / P(B)
- P(A) is the prior probability of A
- P(A | B) is the posterior probability of A
- P(B) is the prior probability of B; it acts as a normalizing constant
- Monty Hall problem
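The Monty Hall problem mentioned on this slide can be checked by simulation; this sketch assumes the standard rules (the host always opens a door that is neither the player's pick nor the car):

```python
import random

def monty_hall(switch, rng):
    """Play one round of the Monty Hall game; return True if the car is won."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that is neither the player's pick nor the car.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(0)
trials = 20000
stay = sum(monty_hall(False, rng) for _ in range(trials)) / trials
swap = sum(monty_hall(True, rng) for _ in range(trials)) / trials
# stay converges to about 1/3, swap to about 2/3
```

The simulation matches the Bayesian analysis: switching doubles the chance of winning.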

Bayesian Network
- A graph representing probabilistic causal relations between variables.
- Allows efficient Bayesian reasoning in complicated situations.

Simple Example
- Trapped → Locked
- 100 chests
  - 37 trapped, of which 29 were locked
  - 63 not trapped, of which 18 were locked
- Need to find P(trapped | locked)
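From the counts on this slide the answer can be computed directly: 29 + 18 = 47 chests are locked, so P(trapped | locked) = 29/47 ≈ 0.617. A short Python check, which also confirms the same result via Bayes theorem:

```python
# Counts from the slide: 100 chests, 37 trapped (29 of those locked),
# 63 not trapped (18 of those locked).
trapped_locked = 29
untrapped_locked = 18

locked = trapped_locked + untrapped_locked          # 47 locked chests in total
p_trapped_given_locked = trapped_locked / locked    # 29/47, about 0.617

# Same result via Bayes theorem: P(T | L) = P(L | T) P(T) / P(L)
p_bayes = (29 / 37) * (37 / 100) / (47 / 100)
```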