Programming for Geographical Information Analysis: Advanced Skills

Programming for Geographical Information Analysis: Advanced Skills Online mini-lecture: Introduction to Bayesian statistics and networks Dr Andy Evans

Bayesian Networks Of course, it may be that we see people in one state, and their actions, but have no way of replicating the rulesets in human language. In this case, we can generate a Bayesian Network. These give the probabilities that states will occur together. They allow you to update the probabilities on new evidence, and to chain these “rules” together to make inferences. A slightly more complex way of looking at sequences of probabilistic events is with Bayesian statistics, particularly where two events may be associated. Bayesian statistics are used in a number of fields, not just for modelling proposed causal relationships. Chiefly they’re used for looking at the probability of something happening after something else has occurred. They’re often used for exploring solution spaces: for example, if one solution is found not to be viable, Bayesian statistics may tell you what the probabilities of the other solutions being right therefore are. Bayesian statistics are named after their discoverer, the Reverend Thomas Bayes, an eighteenth-century English mathematician. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Bayes.html

Bayesian Networks The probability of an event A is equal to the probability of A occurring along with some other event B, plus the probability of A occurring while B doesn’t happen (the law of total probability):
P(A) = P(A∩B) + P(A∩Bᶜ)
P(A) = P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ)
where P(A|B) is the probability of A happening given B, and P(Bᶜ) = 1 – P(B) is the probability of B not happening. That is, if we know the chance of an event given another event, and its chance given the other event not happening, we can calculate the chance of the event happening.

Bayesian Networks For example, say we want to predict the likelihood of a Republican (“GOP”) president starting a war. We can calculate the overall likelihood of a war:
P(War) = P(War|GOP)P(GOP) + P(War|Democrat)P(Democrat)
0.70 ≈ 0.66 × 0.64 + 0.76 × 0.36
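The total-probability calculation above can be checked with a few lines of Python (a minimal sketch using the slide’s illustrative numbers; the variable names are our own):

```python
# Law of total probability:
# P(War) = P(War|GOP) * P(GOP) + P(War|Dem) * P(Dem)
p_war_given_gop = 0.66
p_war_given_dem = 0.76
p_gop = 0.64
p_dem = 1 - p_gop  # complement rule: P(Dem) = 1 - P(GOP) = 0.36

p_war = p_war_given_gop * p_gop + p_war_given_dem * p_dem
print(round(p_war, 2))  # 0.7
```

The unrounded value is 0.696, which the slide rounds to 0.70.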

Bayesian Networks Equally, though, the probability of an event happening given some other event is the probability of both events happening, divided by the probability of the other event happening:
P(B|A) = P(B∩A) / P(A)
P(B|A) = P(B)P(A|B) / P(A)
This is Bayes’ theorem: if we know A has happened, we can use it to calculate the probability of B given A.

Bayesian Networks For example, if we have a war, we can use this to calculate the probability that we also have a Republican president:
P(GOP|War) = P(GOP)P(War|GOP) / P(War)
0.60 ≈ (0.64 × 0.66) / 0.70
If a war occurred, we know the probability of that happening, which means we can plug it into the equation to get the probability that we have a GOP president. If we’re having a war, we can then use this updated value as the P(GOP) to calculate the probability of another war:
P(War) = P(War|GOP)P(GOP) + P(War|Democrat)P(Democrat)
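The inversion step can be sketched in Python in the same way (again using the slide’s numbers; the rounded P(War) = 0.70 is carried over from the previous calculation):

```python
# Bayes' theorem: P(GOP|War) = P(GOP) * P(War|GOP) / P(War)
p_gop = 0.64
p_war_given_gop = 0.66
p_war = 0.70  # total probability of war, from the earlier calculation

p_gop_given_war = p_gop * p_war_given_gop / p_war
print(round(p_gop_given_war, 2))  # 0.6
```

Note how the conditional probability has been turned around: we started with P(War|GOP) and finished with P(GOP|War).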

Bayesian statistics Or, more generally:
Pr(Bx|A) = Pr(Bx)Pr(A|Bx) / Σi Pr(Bi)Pr(A|Bi)
In our example: Pr(GOP|War) = Pr(GOP)Pr(War|GOP) / ΣPr(Cause)Pr(War|Cause)
where the Bi are all the potential causes (or things that must happen in order for A to occur), Bx is the specific cause we are testing for, and the denominator sums to Pr(A). That is:
Pr(B|A) = probability of possible cause B given A.
Pr(A|B) = probability of A given possible cause B.
In the above, B|A means “the probability of B conditional on A”. You can view it as saying: if event A has happened, what is the probability of B also having happened? Hence, in our example, B|A equals “if the war has happened, what is the probability that the president is a Republican?”, while A|B equals “if the president is a Republican, what is the probability of a war also having happened?” Note that these are not necessarily the same thing. It’s sometimes easier to think about this using the inverse probabilities. In our example these would be “given there’s a war, what are the chances a Democrat isn’t in power?” versus “what are the chances there’s not peace, given a GOP president is in power?”, i.e. these are asking two very different things.
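The general form can be written as a small Python function (a sketch of our own, assuming the causes are mutually exclusive and exhaustive, as the formula requires):

```python
# General Bayes' rule over a set of mutually exclusive causes:
# Pr(Bx|A) = Pr(Bx)Pr(A|Bx) / sum_i Pr(Bi)Pr(A|Bi)
def bayes(priors, likelihoods, cause):
    """priors: Pr(Bi) for each cause; likelihoods: Pr(A|Bi) for each cause."""
    evidence = sum(priors[c] * likelihoods[c] for c in priors)  # this is Pr(A)
    return priors[cause] * likelihoods[cause] / evidence

priors = {"GOP": 0.64, "Democrat": 0.36}        # Pr(cause)
likelihoods = {"GOP": 0.66, "Democrat": 0.76}   # Pr(War|cause)
print(round(bayes(priors, likelihoods, "GOP"), 2))  # 0.61
```

Because this uses the unrounded denominator Pr(War) = 0.696, it gives 0.61 rather than the 0.60 obtained above with the rounded 0.70.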

Uses Bayesian stats are used to control: Help systems. Amazon recommendations. Given a series of situations and previous responses, what is the likelihood of a response being needed given this situation? Also, other types of systems that learn solutions and predict answers, i.e. AI. Possible uses: Flood gate control. Land use planning. Policy making. For example, see MS Office Helper Not Dead Yet, Wired News, 19 Apr 2001: http://www.wired.com/news/technology/0,1282,43065,00.html

Bayesian Networks In a Bayesian Network the states are linked by probabilities, so: if A then B; if B then C; if C then D. Not only this, but the network can be updated when an event A happens, propagating the new probabilities onwards by using the new probability of B to recalculate the probability of C, etc.
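The forward propagation idea can be sketched for a simple chain A → B → C → D, assuming each event depends only on its immediate predecessor (a simplification of a full Bayesian network; all the numbers here are hypothetical):

```python
# Propagate a probability along a chain A -> B -> C -> D using the law of
# total probability at each step, assuming each event depends only on the
# previous one. Illustrative numbers only.
def propagate(p_event, links):
    """links: list of (P(next|event), P(next|not event)) pairs."""
    for p_given, p_given_not in links:
        p_event = p_given * p_event + p_given_not * (1 - p_event)
    return p_event

links = [(0.9, 0.1),   # P(B|A), P(B|not A)
         (0.8, 0.2),   # P(C|B), P(C|not B)
         (0.7, 0.3)]   # P(D|C), P(D|not C)

p_d_prior = propagate(0.7, links)    # with our prior belief P(A) = 0.7
p_d_after_a = propagate(1.0, links)  # after observing that A has happened
print(round(p_d_prior, 3), round(p_d_after_a, 3))  # 0.538 0.596
```

Observing A raises the probability of D, which is the updating behaviour described above rippling along the chain.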

Expert Systems All these elements may be brought together in an “Expert System”. These are decision trees in which rules and probabilities link states. Forward chaining: you input states, and the system runs through the rules to suggest a most likely course of action. Backward chaining: you input goals, and the system tells you the states you need to achieve to get there. They don’t have to use Fuzzy Sets or Bayesian probabilities, but often do. Software: Free demo and tutorials: http://www.expertise2go.com/ OpenExpert: http://openexpert.org/ Lists of commercial and freeware tools: http://www.kbsc.com/rulebase.html http://www.pcai.com/web/ai_info/expert_systems.html
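Forward chaining can be sketched as a toy rule engine (a generic illustration of our own, not any of the packages listed above; the facts and rules are hypothetical, borrowing the flood-gate example from the uses slide):

```python
# A toy forward-chaining engine: fire any rule whose conditions are all
# known facts, and repeat until no rule adds anything new.
rules = [
    ({"river_rising", "heavy_rain"}, "flood_risk"),
    ({"flood_risk"}, "close_flood_gates"),
]

def forward_chain(facts, rules):
    """Return the input facts plus everything the rules can conclude."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(forward_chain({"river_rising", "heavy_rain"}, rules)))
# ['close_flood_gates', 'flood_risk', 'heavy_rain', 'river_rising']
```

Note how the first rule’s conclusion (“flood_risk”) triggers the second rule: this chaining of rules is what the systems above do, typically with certainties or probabilities attached to each rule.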