Psych 548, Miyamoto, Win '15
Set Up for Students
Your computer should already be turned on and logged in. Open a browser to the Psych 548 website (you can get it from MyUW): http://faculty.washington.edu/jmiyamot/p548/p548-set.htm
Download the zip file p548.zip and unzip it to C:\temp. This creates a subdirectory, C:\temp\p548. The files for today's class are in this directory or one of its subdirectories.
Run R. Run RStudio. Load any pdf handouts for today into Acrobat.
Psychology 548: Bayesian Statistics, Modeling & Reasoning
What is this course about?
Instructor: John Miyamoto
01/05/2015: Lecture 1-1
This PowerPoint presentation may contain macros that were used to create the slides. The macros aren't needed to view the slides; if necessary, you can disable them without any change to the presentation.
Outline
What is Bayesian inference?
Why are Bayesian statistics, modeling & reasoning relevant to psychology?
What is Psych 548 about?
Familiarize students with the setup for using MGH 058
Explain the Psych 548 website
Intro to R
Intro to RStudio
Intro to the R-to-BUGS interface
(Lecture probably ends here)
Bayes Rule – What Is It?
Reverend Thomas Bayes, 1702 – 1761: English Protestant minister & mathematician.
Bayes Rule is fundamentally important to:
♦ Bayesian statistics
♦ Bayesian decision theory
♦ Bayesian models in psychology
Bayes Rule – Why Is It Important?
Bayes Rule is the optimal way to update the probability of hypotheses given data. The term "Bayesian reasoning" covers 3 related concepts:
♦ Concept 1: Bayesian inference is a model of optimal learning from experience.
♦ Concept 2: Bayesian decision theory describes optimal strategies for taking actions in an uncertain environment, e.g., optimal gambling.
♦ Concept 3: Bayesian reasoning represents the uncertainty of events as probabilities in a mathematical calculus.
Concepts 1, 2 & 3 are all consistent with the use of the term "Bayesian" in modern psychology.
Bayesian Issues in Psychological Research
Does human reasoning about uncertainty conform to Bayes Rule? Do humans reason about uncertainty as if they are manipulating probabilities?
♦ These questions are posed with respect to infants & children, as well as adults.
Do neural information processing systems (NIPS) incorporate Bayes Rule? Do NIPS model uncertainties as if they are probabilities?
Four Roles for Bayesian Reasoning in Psychology
1. Bayesian statistics: Analyzing data
♦ E.g., is the slope of the regression of grades on IQ the same for boys as for girls?
♦ E.g., are there group differences in an analysis of variance?
Four Roles for Bayesian Reasoning in Psychology (continued)
1. Bayesian statistics: Analyzing data
2. Bayesian decision theory – a theory of strategic action; how to gamble if you must.
3. Bayesian modeling of psychological processes
4. Bayesian reasoning – Do people reason as if they are Bayesian probability analysts? (At macro & neural levels)
♦ Judgment and decision making – this is a major issue.
♦ Human causal reasoning – is it Bayesian or quasi-Bayesian?
♦ Modeling neural decision making – many proposed models have a strong Bayesian flavor.
Four Roles for Bayesian Reasoning in Psychology (continued)
1. Bayesian statistics: Analyzing data
2. Bayesian decision theory – a theory of strategic action; how to gamble if you must.
3. Bayesian modeling of psychological processes
4. Bayesian reasoning – Do people reason as if they are Bayesian probability analysts? (At macro & neural levels)
Psych 548 focuses on topics (1) and (3), and includes a little bit of (4).
Graphical Representation of Psych 548
[Diagram: Psych 548 spans Bayesian statistics & modeling (R, OpenBUGS, JAGS) and Bayesian models in child & adult psychology & neuroscience.]
Brief History of S, S-Plus, & R
S – statistics program created by Bell Labs (1976 – 1988 – 1999)
S-Plus – commercial statistics program, a refinement of S (1988 – present)
R – free open source statistics program (1997 – present)
♦ Currently the standard computing framework for statisticians worldwide; many contributors to its development.
♦ Excellent general computation; powerful & flexible.
♦ Great graphics.
♦ Multiplatform: Unix, Linux, Windows, Mac.
♦ User must like programming.
[Diagram: ancestry of R (S → S-Plus → R), leading to BUGS, WinBUGS, OpenBUGS, JAGS.]
BUGS, WinBUGS, OpenBUGS & JAGS
Gibbs sampling & the Metropolis-Hastings algorithm: two algorithms for sampling from a hard-to-evaluate probability distribution.
BUGS – Bayesian inference Under Gibbs Sampling (circa 1995)
WinBUGS – free version for Windows (circa 1997); Windows only
OpenBUGS – open source (circa 2006); mainly Windows, runs within a virtual Windows machine on a Mac
JAGS – open source (circa 2007); multiplatform: Windows, Mac, Linux
STAN – open source (circa 2012); multiplatform: Windows, Mac, Linux
In these slides, "BUGS" includes all of these.
Basic Structure of Bayesian Computation
[Diagram: R handles data preparation and analysis of results. JAGS (or OpenBUGS/WinBUGS/Stan) computes an approximation to the posterior distribution and includes diagnostics. Interface packages: rjags & runjags for JAGS; BRugs & R2WinBUGS for OpenBUGS/WinBUGS; rstan for Stan.]
RStudio
Run RStudio. Run R from within RStudio.
Remainder of This Lecture
Take a 5 minute break
Introduce selves
Psych 548: What will we study?
Briefly view the Psych 548 webpage
Introduction to the computer facility in CSSCR
Introduction to R, BUGS (OpenBUGS & JAGS), and RStudio
Introduce selves upon return
Course Goals
Learn the theoretical framework of Bayesian inference.
Achieve competence with R, OpenBUGS and JAGS.
Learn basic Bayesian statistics:
♦ Learn how to think about statistical inference from a Bayesian standpoint.
♦ Learn how to interpret the results of a Bayesian analysis.
♦ Learn basic tools of Bayesian statistical inference – testing for convergence, making standard plots, examining samples from a posterior distribution.
Secondary Goals:
♦ Bayesian modeling in psychology.
♦ Understand arguments about Bayesian reasoning in the psychology of reasoning: the pros and cons of the heuristics & biases movement.
Main Text: Kruschke, Doing Bayesian Data Analysis
Kruschke, J. K. (2014). Doing Bayesian data analysis, second edition: A tutorial with R, JAGS, and Stan. Academic Press.
Excellent textbook – worth the price ($90 from Amazon).
Emphasis on classical statistical test problems from a Bayesian perspective, not so much modeling per se.
♦ Binomial inference problems, ANOVA problems, linear regression problems.
Computational requirements: R & JAGS (or OpenBUGS). A programming editor like RStudio is useful.
Chapter Outline of the Kruschke Textbook
Ch 1 – 4: Basic probability background (pretty easy)
Ch 5 – 8: Bayesian inference with simple binomial models
♦ Conjugate priors, Gibbs sampling & the Metropolis-Hastings algorithm
♦ OpenBUGS or JAGS
Ch 9 – 12: Bayesian approach to hierarchical modeling, model comparison, & hypothesis testing
Ch 13: Power & sample size (omit)
Ch 14: Intro to the generalized linear model
Ch 15 – 17: Intro to linear regression
Ch 18 – 19: Oneway & multifactor ANOVA
Ch 20 – 22: Categorical data analysis: logistic regression, probit regression, Poisson regression
Workbook on Bayesian Graphical Modeling
Lee & Wagenmakers, workbook on Bayesian graphical modeling:
♦ Michael Lee: http://www.socsci.uci.edu/~mdlee/bgm.html
♦ E. J. Wagenmakers: http://users.fmg.uva.nl/ewagenmakers/BayesCourse/BayesBook.html
Equivalent Matlab & R code for the book are available at the Psych 548 website and at Lee's or Wagenmakers's website.
Emphasis is on Bayesian models of psychological processes rather than on theory. Lots of examples.
CSSCR Network & Psych 548 Webpage
Click on /Start /Computer. The path & folder name for your Desktop is C:\users\NetID\Desktop (where "NetID" refers to your NetID).
Double click on MyUW on your Desktop. Find Psych 548 under your courses and double click on the Psych 548 website.
Download the files that are needed for today's class. Save these files to C:\users\NetID\Desktop.
♦ Note that Ctrl-D takes you to your Desktop.
Run R. Run RStudio.
Psych 548 Website
Point out where to download the material for today's class.
Point out pdfs for the textbooks.
Time Permitting: Proceed to Bayes Rule
Bayes Rule
Reverend Thomas Bayes, 1702 – 1761: English Protestant minister & mathematician.
Bayes Rule is fundamentally important to:
♦ Bayesian statistics
♦ Bayesian decision theory
♦ Bayesian models in psychology
Bayes Rule – Explanation
P(H | D) = P(D | H) × P(H) / P(D)
where:
♦ P(H | D) = posterior probability of the hypothesis
♦ P(D | H) = likelihood of the data
♦ P(H) = prior probability of the hypothesis
♦ P(D) = normalizing constant
Bayes Rule – Odds Form
Bayes Rule for H given D: P(H | D) = P(D | H) P(H) / P(D)
Bayes Rule for not-H given D: P(not-H | D) = P(D | not-H) P(not-H) / P(D)
Dividing the first equation by the second gives the odds form of Bayes Rule:
P(H | D) / P(not-H | D) = [P(D | H) / P(D | not-H)] × [P(H) / P(not-H)]
Bayes Rule (Odds Form)
H = a hypothesis, e.g., the hypothesis that the patient has cancer
not-H = the negation of the hypothesis, e.g., the hypothesis that the patient does not have cancer
D = the data, e.g., a + result for a cancer test
Posterior Odds = Likelihood Ratio (diagnosticity) × Prior Odds (base rate)
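The relationship on this slide – posterior odds = likelihood ratio × prior odds – can be sketched in a few lines of Python. This is our own illustration, not course code; the function name is made up, and the numbers are the cancer-test values used later in the lecture.

```python
def posterior_odds(p_d_given_h, p_d_given_not_h, p_h):
    """Odds form of Bayes Rule: posterior odds = likelihood ratio * prior odds."""
    prior_odds = p_h / (1.0 - p_h)                      # P(H) / P(not-H): the base rate
    likelihood_ratio = p_d_given_h / p_d_given_not_h    # diagnosticity of the data
    return likelihood_ratio * prior_odds

# Cancer-test numbers from the slides: P(+|C) = .792, P(+|no C) = .096, P(C) = .01
odds = posterior_odds(0.792, 0.096, 0.01)
print(round(odds, 3))   # prints 0.083, i.e., about 1/12
```

Note that the likelihood ratio here is large (8.25), yet the posterior odds stay small because the prior odds (1/99) are so low.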
Bayesian Analysis of a Medical Test Result (Look at Handout)
QUESTION: A physician knows from past experience in his practice that 1% of his patients have cancer (of a specific type) and 99% of his patients do not have the cancer. He also knows the probabilities of a positive test result (+ result) given cancer and given no cancer. These probabilities are:
P(+ test | Cancer) = .792 and P(+ test | no cancer) = .096
Suppose Mr. X has a positive test result. What is the probability that Mr. X has cancer? Write down your intuitive answer.
(Note to JM: Write estimates on board)
Given Information in the Diagnostic Inference from a Medical Test Result
P(+ test | Cancer) = .792 (true positive rate, a.k.a. hit rate)
P(+ test | no cancer) = .096 (false positive rate, a.k.a. false alarm rate)
P(Cancer) = prior probability of cancer = .01
P(No Cancer) = prior probability of no cancer = 1 - P(Cancer) = .99
Mr. X has a + test result. What is the probability that Mr. X has cancer?
Bayesian Analysis of a Medical Test Result
P(+ test | Cancer) = 0.792 and P(+ test | no cancer) = 0.096
P(Cancer) = prior probability of cancer = 0.01
P(No Cancer) = prior probability of no cancer = 0.99
P(Cancer | + test) = 1 / (12 + 1) = 0.077
Digression: Converting Odds to Probabilities
If X / (1 – X) = Y,
then X = Y(1 – X) = Y – XY,
so X + XY = Y,
so X(1 + Y) = Y,
so X = Y / (1 + Y).
Conclusion: If Y are the odds for an event, then Y / (1 + Y) is the probability of the event.
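The algebra above can be checked numerically in Python; the helper name `odds_to_prob` is ours, not the slides'.

```python
def odds_to_prob(y):
    """Convert odds Y for an event into the probability Y / (1 + Y)."""
    return y / (1.0 + y)

print(odds_to_prob(1.0))              # prints 0.5 (even odds)
print(round(odds_to_prob(1/12), 3))   # prints 0.077 (odds of 1/12)
```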
Bayesian Analysis of a Medical Test Result
P(+ test | Cancer) = 0.792 and P(+ test | no cancer) = 0.096
P(Cancer) = prior probability of cancer = 0.01
P(No Cancer) = prior probability of no cancer = 0.99
P(Cancer | + test) = (1/12) / (1 + 1/12) = 1 / (12 + 1) = 0.077
Continue with the Medical Test Problem
P(Cancer | + result) = (.792)(.01)/(.103) = .077
The posterior odds of cancer are (.077)/(1 – .077) ≈ 1/12, i.e., about 12 to 1 against cancer.
Notice: The test is very diagnostic, but P(cancer | + result) is still low because the base rate is low.
David Eddy found that about 95 out of 100 physicians stated that P(cancer | + result) is about 75% in this case (very close to the 79% likelihood of a + result given cancer).
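The computation on this slide can be verified directly from the definition of Bayes Rule. This is a sketch with our own variable names, using only the numbers given in the problem.

```python
# Bayes Rule applied to the medical-test numbers from the slides
p_cancer = 0.01                  # prior P(Cancer)
p_pos_given_cancer = 0.792       # hit rate, P(+ | Cancer)
p_pos_given_no_cancer = 0.096    # false alarm rate, P(+ | no Cancer)

# Normalizing constant: P(+) = P(+|C)P(C) + P(+|no C)P(no C)
p_pos = p_pos_given_cancer * p_cancer + p_pos_given_no_cancer * (1 - p_cancer)

p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos
print(round(p_pos, 3))                # prints 0.103
print(round(p_cancer_given_pos, 3))   # prints 0.077
```

This reproduces the slide's normalizing constant (.103) and posterior probability (.077), far below the ~75% that Eddy's physicians reported.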
General Characteristics of Bayesian Inference
The decision maker (DM) is willing to specify the prior probability of the hypotheses of interest.
The DM can specify the likelihood of the data given each hypothesis.
Using Bayes Rule, we infer the probability of the hypotheses given the data.
How Does Bayesian Stats Differ from Classical Stats?
Bayesian – common aspects:
♦ Statistical models
♦ Credible intervals – sets of parameters that have high posterior probability
Bayesian – divergent aspects:
♦ Given data, compute the full posterior probability distribution over all parameters
♦ Null hypothesis testing is generally nonsensical; posterior probabilities are meaningful, p-values are half-assed
♦ MCMC approximations to posterior distributions
Classical – common aspects:
♦ Statistical models
♦ Confidence intervals – which parameter values are tenable after viewing the data
Classical – divergent aspects:
♦ No prior distributions in general, so that idea is meaningless or self-deluding
♦ Null hypothesis testing
♦ P-values
♦ MCMC approximations are sometimes useful, but not for computing posterior distributions
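As a concrete illustration of the "full posterior distribution" and "credible interval" points above, here is a small grid approximation, our own sketch rather than course code, that computes a 95% central credible interval for a binomial proportion under a uniform prior. The data (7 successes in 10 trials) are hypothetical.

```python
# Grid approximation to the posterior of a binomial proportion theta.
# Data (hypothetical): 7 successes in 10 trials; prior: uniform on [0, 1].
n_grid = 10001
grid = [i / (n_grid - 1) for i in range(n_grid)]
successes, trials = 7, 10

# Unnormalized posterior = likelihood * prior (the uniform prior is a constant)
post = [t**successes * (1 - t)**(trials - successes) for t in grid]
total = sum(post)
post = [p / total for p in post]     # normalize so the grid masses sum to 1

# Central 95% credible interval, read off the cumulative distribution
cum, lo, hi = 0.0, None, None
for t, p in zip(grid, post):
    cum += p
    if lo is None and cum >= 0.025:
        lo = t
    if hi is None and cum >= 0.975:
        hi = t
print(round(lo, 2), round(hi, 2))
```

In real analyses this grid step is replaced by MCMC sampling (via JAGS, OpenBUGS, or Stan), but the logic, summarizing a full posterior by the region of high posterior probability, is the same.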