1
Bayesian Connections
Charlie Strausser
2
Outline and Goal
Two connected concepts will be explained:
Rule of Bayes (commonly taught in statistics classes)
Bayesian Inference (not typically taught in statistics classes)
Goal: Explain how one can update their view of the world using statistical methods, and illuminate the different statistical methods for uncovering, predicting, and modeling the world.
3
Conditional Probability
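The body of this slide is not included in the transcript; as a minimal reference (the standard definition, not taken from the original slide), conditional probability is defined as

P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0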
4
Rule of Bayes
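The slide body is again missing from the transcript; the Rule of Bayes, which follows directly from the definition of conditional probability and matches the form used on slide 6 below, is

P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}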
5
Solving a Problem Using the Rule of Bayes
A cab was involved in a hit-and-run accident at night. Two cab companies, the Green and the Blue, operate in the city. 85% of the cabs in the city are Green and 15% are Blue. A witness identified the cab as Blue. The court tested the reliability of the witness under the same circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colors 80% of the time and failed 20% of the time. What is the probability that the cab involved in the accident was Blue rather than Green, knowing that this witness identified it as Blue? First Instinct?
6
Let’s Solve
P(G) = .85, P(B) = .15
P(Identified as Blue | B) = .8
P(Identified as Green | G) = .8
P(Identified as Blue) = .8*.15 + .2*.85 = .29 (Total Probability)
We have the Rule of Bayes: P(A|B) = P(B|A)*P(A)/P(B)
So P(B | Identified as Blue) = P(Identified as Blue | B)*P(B)/P(Identified as Blue) = .8*.15/.29 ≈ .414
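A quick Python check of the arithmetic above (an illustrative sketch, not part of the original slides):

# Bayes' rule check for the cab problem.
p_blue = 0.15                  # base rate: 15% of cabs are Blue
p_green = 0.85                 # base rate: 85% of cabs are Green
p_id_blue_given_blue = 0.80    # witness correctly identifies a Blue cab
p_id_blue_given_green = 0.20   # witness mistakes a Green cab for Blue

# Total probability that the witness says "Blue"
p_id_blue = p_id_blue_given_blue * p_blue + p_id_blue_given_green * p_green

# Posterior probability that the cab really was Blue
p_blue_given_id_blue = p_id_blue_given_blue * p_blue / p_id_blue

print(round(p_id_blue, 2))             # 0.29
print(round(p_blue_given_id_blue, 3))  # 0.414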
7
Compare to Intuition
Statistical vs. Causal Base Rates
WYSIATI (What You See Is All There Is)
8
Bayesian Inference
P(A): Prior
P(A|B): Posterior
P(B|A): Likelihood
P(B): Evidence
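To make the terminology concrete, here is a small generic discrete update written in Python (an illustrative sketch; the function and variable names are my own, not from the slides):

def bayes_update(prior, likelihood):
    # prior:      dict mapping hypothesis -> P(hypothesis)
    # likelihood: dict mapping hypothesis -> P(data | hypothesis)
    # Evidence: total probability of the observed data under the prior
    evidence = sum(prior[h] * likelihood[h] for h in prior)
    # Posterior for each hypothesis via Bayes' rule
    return {h: prior[h] * likelihood[h] / evidence for h in prior}

# Reusing the cab example: the hypotheses are the cab's true color
posterior = bayes_update(prior={"Blue": 0.15, "Green": 0.85},
                         likelihood={"Blue": 0.80, "Green": 0.20})
print(posterior)  # {'Blue': ~0.414, 'Green': ~0.586}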
9
Simplified Example: Political Prediction
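One way to flesh out the example (my own illustrative sketch with invented poll numbers, since the slide body is not in the transcript): treat a candidate's support as an unknown proportion, start from a prior, and update it with each new poll.

# Beta-Binomial updating for a candidate's support (assumed example).
# With a Beta(a, b) prior on the support proportion, observing k supporters
# out of n respondents gives a Beta(a + k, b + n - k) posterior.

a, b = 1.0, 1.0   # uniform prior: no initial opinion about the candidate

polls = [(540, 1000), (265, 500), (130, 250)]   # (supporters, sample size), invented
for supporters, n in polls:
    a += supporters
    b += n - supporters

posterior_mean = a / (a + b)
print(f"Posterior mean support: {posterior_mean:.3f}")   # about 0.534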
10
Nate Silver
11
“Yeah, but it’s ****ing poker”
12
Bayesian vs. Frequentist Learning
13
Bayesian vs. Frequentist Inference
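A minimal sketch of the contrast drawn in these two slides (my own toy example, not from the deck): given the same coin-flip data, a frequentist reports the maximum-likelihood point estimate, while a Bayesian combines the data with a prior and reports a posterior distribution.

# Toy contrast: 7 heads in 10 flips of a coin with unknown bias p.
heads, flips = 7, 10

# Frequentist: the maximum-likelihood estimate is the observed frequency
p_mle = heads / flips                   # 0.7

# Bayesian: uniform Beta(1, 1) prior + Binomial likelihood -> Beta posterior
a, b = 1 + heads, 1 + (flips - heads)   # Beta(8, 4) posterior
p_posterior_mean = a / (a + b)          # about 0.667

print(p_mle, p_posterior_mean)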
14
When Genius Failed: Complexity Analysis
Bayesian Inference / Inverse Probability
15
“[Bayesian Inference] seeks to solve problems when you don’t have enough initial information to make a firm conclusion. It’s highly valuable when clues are sketchy. You first form a hypothesis using the best information you have. Then you take that hypothesis and test it against subsequent events. We call those indications and warnings. That’s why it’s called inverse probability. You’re working backwards to test the validity of your original hypothesis. The intelligence community uses inverse probability to prevent terrorist attacks. But you can also apply it to the capital markets. Janet Yellen would say, for example, if you gave her 5 million data points she could predict economic outcomes. But that’s impossible. She’ll never have that data. So it should come as no surprise that the Fed has the worst forecasting record in the world. It had been wrong six years in a row since 2009. We do the exact opposite of what Janet Yellen does. We realize we don’t have the data. We instead develop a hypothesis, but we need to test it constantly against subsequent data. So we continually check our hypothesis with real world data. If it’s consistent with our hypothesis, we keep pursuing it. But if the data doesn’t match, then we have to abandon our initial hypothesis and devise a new one. And the new hypothesis is also subjected to the same tests. It’s not a static model like the Fed uses.”
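The process described in the quote can be caricatured in a few lines of Python (an illustrative sketch; the hypotheses, likelihoods, and threshold are invented): keep a running posterior probability for the working hypothesis, update it as each new observation arrives, and abandon the hypothesis if that probability falls too low.

# Caricature of the "form a hypothesis, test it against subsequent data" loop.
# H is the working hypothesis, ALT a competing one; observations are coarse
# "confirms"/"contradicts" signals with assumed likelihoods under each.
likelihood_h   = {"confirms": 0.7, "contradicts": 0.3}
likelihood_alt = {"confirms": 0.4, "contradicts": 0.6}

p_h = 0.5            # prior probability of the working hypothesis
ABANDON_BELOW = 0.2  # invented threshold for giving up on the hypothesis

observations = ["confirms", "contradicts", "contradicts", "contradicts"]
for obs in observations:
    numerator = likelihood_h[obs] * p_h
    p_h = numerator / (numerator + likelihood_alt[obs] * (1 - p_h))
    print(f"{obs}: P(H) = {p_h:.2f}")
    if p_h < ABANDON_BELOW:
        print("Hypothesis abandoned; form a new one and repeat the loop.")
        break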
16
AI IS ABOUT TO LEARN MORE LIKE HUMANS—WITH A LITTLE UNCERTAINTY
17
Leela Chess Zero
18
Questions? Thoughts on whether AI is “human”?
Was the rise of Bayesian Inference simply because AI was previously not that strong?
Should there be a Bayesian Inference class at W&M?
Does Bayesian logic best describe how humans adapt to new information?
Can you apply Bayesian logic to improve how you react when your beliefs are challenged?