Bayesian Reasoning
A/Prof Geraint Lewis, A/Prof Peter Tuthill
Thomas Bayes (1702-1761), Pierre-Simon Laplace (1749-1827)
“Probability theory is nothing but common sense, reduced to calculation.” (Laplace)

Are you a Bayesian or a Frequentist?
“There are 3 kinds of lies: Lies, Damned Lies, and Statistics” (Benjamin Disraeli) ...and Bayesian Statistics.
Fig 1. A Frequentist Statistician. Fig 2. Bayesian Statistics Conference.

What is Inference?
Deductive inference (logic), Aristotle, 4th century B.C. Major premise: if A is true then B is true (A = A,B in Boolean notation).
Strong syllogisms:
- A is true (minor premise), therefore B is true (conclusion).
- B is false (minor premise), therefore A is false (conclusion).
Inductive inference (plausible reasoning). It is useful to have a concrete example in your head: “When the bough breaks, the cradle will fall.”
Weak syllogisms:
- B is true (minor premise), therefore A is more plausible.
- A is false (minor premise), therefore B is less plausible.

What is Inference?
Deductive logic: from a cause to its effects or outcomes.
Inductive logic: from effects or observations back to possible causes.

What is a Probability?
Frequentists: P(A) = the long-run relative frequency of A occurring in identical repeats of an observation. “A” is restricted to propositions about random variables.
Bayesians: P(A|B) = a real-number measure of the plausibility of proposition A, given (conditional upon) the truth of proposition B. “A” can be any logical proposition. All probabilities are conditional; we must be explicit about what our assumptions B are (there is no such thing as an absolute probability!).

Probability depends on our state of knowledge: the Monty Hall problem, with a prize hidden behind one of three doors A, B, and C.
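As an aside not on the original slide, a minimal Monte Carlo sketch in Python (the function name and trial count are my own choices) shows how the winning probability differs between sticking with the first door and switching after the host opens an empty one:

```python
import random

def monty_hall(trials=100_000):
    """Estimate win probabilities for the 'stay' and 'switch' strategies."""
    stay_wins = switch_wins = 0
    doors = ["A", "B", "C"]
    for _ in range(trials):
        prize = random.choice(doors)
        pick = random.choice(doors)
        # The host opens a door that is neither the contestant's pick nor the prize.
        opened = random.choice([d for d in doors if d != pick and d != prize])
        # Switching means taking the one remaining unopened door.
        switched = next(d for d in doors if d not in (pick, opened))
        stay_wins += (pick == prize)
        switch_wins += (switched == prize)
    return stay_wins / trials, switch_wins / trials

print(monty_hall())  # roughly (0.33, 0.67): knowing what the host did changes the odds
```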

Probability depends on our state of knowledge: an urn holds 7 red and 5 blue balls. On the 1st draw, P(red) = 7/12 and P(blue) = 5/12. What probability should we assign to the 2nd draw?
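A short illustrative calculation (not from the slides; drawing without replacement is assumed) shows how the answer depends on what we know about the 1st draw: with no information about it, the 2nd draw still has P(red) = 7/12, but if we know the 1st draw was red it becomes 6/11:

```python
from fractions import Fraction

red, blue = 7, 5
total = red + blue

# Second draw, given that we know the first draw was red:
p_red2_given_red1 = Fraction(red - 1, total - 1)   # 6/11

# Second draw, with no knowledge of the first draw (marginalise over it):
p_red1 = Fraction(red, total)
p_blue1 = Fraction(blue, total)
p_red2 = p_red1 * Fraction(red - 1, total - 1) + p_blue1 * Fraction(red, total - 1)

print(p_red2_given_red1)  # 6/11
print(p_red2)             # 7/12 -- the same as the first draw
```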

The Desiderata of Bayesian Probability Theory
1. Degrees of plausibility are represented by real numbers (a higher degree of belief is represented by a larger number).
2. With extra evidence supporting a proposition, the plausibility should increase monotonically up to a limit (certainty).
3. Consistency: multiple ways of arriving at a conclusion must all produce the same answer.
(See the book for additional details.)

Logic and Probability
In the certainty limit, where probabilities go to zero (falsehood) or one (truth), the sum and product rules reduce to formal Boolean deductive logic (the strong syllogisms). Bayesian probability is therefore an extension of formal logic into intermediate states of knowledge. Bayesian inference gives a measure of our state of knowledge about nature, not a measure of nature itself.

The two rules underlying probability theory
Sum rule: P(A|B) + P(Ā|B) = 1, where Ā denotes “not A”.
Product rule: P(A,B|C) = P(A|C) P(B|A,C) = P(B|C) P(A|B,C)
[Figure: Venn diagram over all kangaroos, classified by eye colour (blue/brown) and handedness (left/right); the intersection shown is the blue-eyed, left-handed kangaroos.]
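As an illustrative check not in the original slides, both rules can be verified on a small invented joint distribution over the kangaroo example (all numbers below are made up purely for illustration):

```python
# Hypothetical joint probabilities P(eye, hand) for the kangaroo example
# (numbers invented for illustration only).
joint = {
    ("blue", "left"): 0.10,
    ("blue", "right"): 0.20,
    ("brown", "left"): 0.25,
    ("brown", "right"): 0.45,
}

def marginal_eye(eye):
    return sum(p for (e, h), p in joint.items() if e == eye)

def marginal_hand(hand):
    return sum(p for (e, h), p in joint.items() if h == hand)

# Sum rule: P(blue) + P(not blue) = 1
assert abs(marginal_eye("blue") + marginal_eye("brown") - 1.0) < 1e-12

# Product rule: P(blue, left) = P(blue) P(left|blue) = P(left) P(blue|left)
p_blue_left = joint[("blue", "left")]
p_left_given_blue = p_blue_left / marginal_eye("blue")
p_blue_given_left = p_blue_left / marginal_hand("left")
assert abs(marginal_eye("blue") * p_left_given_blue - p_blue_left) < 1e-12
assert abs(marginal_hand("left") * p_blue_given_left - p_blue_left) < 1e-12
```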

Bayes’ Theorem
P(Hi|D,I) = P(Hi|I) P(D|Hi,I) / P(D|I)
Hi = proposition asserting the truth of a hypothesis of interest
I = proposition representing prior information
D = proposition representing the data
P(D|Hi,I) = likelihood: the probability of obtaining the data given that the hypothesis is true
P(Hi|I) = prior: the probability of the hypothesis before the new data
P(Hi|D,I) = posterior: the probability of the hypothesis after the new data
P(D|I) = normalization factor (ensures the probabilities of all hypotheses Hi sum to 1)
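A minimal numerical sketch of the theorem (the hypotheses and probabilities below are toy values of my own, not from the slides): two hypotheses about a coin are updated after observing a single head:

```python
# Toy Bayes' theorem update over two hypotheses (illustrative numbers only).
priors = {"fair": 0.5, "biased": 0.5}       # P(Hi|I)
p_head = {"fair": 0.5, "biased": 0.8}       # likelihoods P(D = head | Hi, I)

evidence = sum(priors[h] * p_head[h] for h in priors)              # P(D|I)
posterior = {h: priors[h] * p_head[h] / evidence for h in priors}  # P(Hi|D,I)

print(posterior)  # {'fair': ~0.385, 'biased': ~0.615}
```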

Example: The Gambler’s coin problem
P(H|D,I) = P(H|I) P(D|H,I) / P(D|I)
Normalization factor: ignore this for now, since we only need the relative merit.
Prior: what do we know about the coin? Assume the pdf of H = P(head) is uniform between 0 and 1.
Likelihood: if the data D give R heads in N tosses, then P(D|H,I) ∝ H^R (1-H)^(N-R). The full distribution, assuming independence of throws, is the binomial distribution; we omit terms not containing H and use a proportionality.

Example: A fair coin? Data: H H T T
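A hedged sketch of how the posterior for this data could be computed numerically, assuming the uniform prior and the likelihood from the previous slide (R = 2 heads in N = 4 tosses; the grid resolution and variable names are my own choices):

```python
import numpy as np

# Grid over the unknown head probability H.
H = np.linspace(0, 1, 1001)
dH = H[1] - H[0]

# Uniform prior on [0, 1].
prior = np.ones_like(H)

# Likelihood for R heads in N tosses (proportional; binomial coefficient omitted).
R, N = 2, 4                      # the data H H T T
likelihood = H**R * (1 - H)**(N - R)

# Posterior, normalised so it integrates to 1 over the grid.
posterior = prior * likelihood
posterior /= posterior.sum() * dH

print(H[np.argmax(posterior)])   # peaks at H = 0.5 for 2 heads in 4 tosses
```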

Example: A fair coin?

The effects of the Prior
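The original slide presumably showed this graphically. As a hedged sketch with priors of my own choosing, the same coin data can be re-analysed under different Beta priors to see how strongly the prior shapes the posterior:

```python
import numpy as np

H = np.linspace(0, 1, 1001)
dH = H[1] - H[0]
R, N = 2, 4
likelihood = H**R * (1 - H)**(N - R)

# Three illustrative priors: uniform, peaked at 0.5, and favouring heads.
priors = {
    "uniform, Beta(1,1)": np.ones_like(H),
    "peaked at 0.5, Beta(10,10)": H**9 * (1 - H)**9,
    "favouring heads, Beta(5,1)": H**4,
}

for name, prior in priors.items():
    post = prior * likelihood
    post /= post.sum() * dH              # normalise the posterior
    mean = (H * post).sum() * dH         # posterior mean of H
    print(f"{name}: posterior mean = {mean:.3f}")
```

With only four tosses the prior still matters: the uniform and peaked priors both give a posterior mean of 0.5, while the prior favouring heads pulls the mean up to about 0.7.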