Review of Probabilities and Basic Statistics


Review of Probabilities and Basic Statistics
10-701 Recitation 1: Statistics Intro — 9/18/2018

Overview
- Introduction to Probability Theory
- Random Variables. Independent RVs
- Properties of Common Distributions
- Estimators. Unbiased Estimators. Risk
- Conditional Probabilities / Independence
- Bayes' Rule and Probabilistic Inference

Review: the concept of probability
Sample space Ω – the set of all possible outcomes
Event E ⊆ Ω – a subset of the sample space
Probability measure – maps events to the unit interval: "how likely is it that event E will occur?"
Kolmogorov axioms:
- P(E) ≥ 0 for every event E
- P(Ω) = 1
- For pairwise disjoint events E_1, E_2, …: P(E_1 ∪ E_2 ∪ …) = Σ_{i=1}^∞ P(E_i)

Reasoning with events
Venn diagrams: P(A) = Vol(A) / Vol(Ω)
Event union and intersection: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Properties of event union/intersection:
- Commutativity: A ∪ B = B ∪ A; A ∩ B = B ∩ A
- Associativity: A ∪ (B ∪ C) = (A ∪ B) ∪ C
- Distributivity: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
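The inclusion-exclusion identity above is easy to sanity-check by enumeration. A short Python sketch (an illustration, not part of the slides) for one roll of a fair die:

```python
from fractions import Fraction

# Sample space: one roll of a fair die; each outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # the roll is even
B = {4, 5, 6}   # the roll is at least 4

lhs = P(A | B)                   # P(A ∪ B)
rhs = P(A) + P(B) - P(A & B)     # inclusion-exclusion
print(lhs, rhs)                  # 2/3 2/3
```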

Reasoning with events: DeMorgan's laws
- (A ∪ B)^C = A^C ∩ B^C
- (A ∩ B)^C = A^C ∪ B^C
Proof of law #1, by double containment: show (A ∪ B)^C ⊆ A^C ∩ B^C and A^C ∩ B^C ⊆ (A ∪ B)^C.

Reasoning with events: disjoint (mutually exclusive) events
P(A ∩ B) = 0, so P(A ∪ B) = P(A) + P(B)
Examples: A and A^C; the cells S_1, …, S_6 of a partition
NOT the same as independent events – successive coin flips, for instance, are independent but not disjoint.

Partitions
A partition S_1, …, S_n of the sample space satisfies:
- the events cover the sample space: S_1 ∪ … ∪ S_n = Ω
- the events are pairwise disjoint: S_i ∩ S_j = ∅ for i ≠ j
Event reconstruction: P(A) = Σ_{i=1}^n P(A ∩ S_i)
Boole's inequality: P(⋃_{i=1}^n A_i) ≤ Σ_{i=1}^n P(A_i) for any events A_1, …, A_n
Bayes' rule: P(S_i | A) = P(A | S_i) P(S_i) / Σ_{j=1}^n P(A | S_j) P(S_j)
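A minimal Python sketch of event reconstruction and Bayes' rule over a two-element partition; the two coins and their probabilities are made up for illustration:

```python
from fractions import Fraction

# Partition S = {fair, biased}: which of two coins was picked (illustrative).
prior = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}        # P(S_i)
likelihood = {"fair": Fraction(1, 2), "biased": Fraction(3, 4)}   # P(A | S_i), A = "heads"

# Event reconstruction: P(A) = sum_i P(A | S_i) P(S_i)
p_A = sum(likelihood[s] * prior[s] for s in prior)

# Bayes' rule: P(S_i | A) = P(A | S_i) P(S_i) / P(A)
posterior = {s: likelihood[s] * prior[s] / p_A for s in prior}
print(p_A, posterior)
```

Observing heads shifts belief toward the biased coin: P(biased | heads) = 3/5.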


Random Variables
A random variable associates a value with the outcome of a random event.
Sample space 𝒳: the possible values of the rv X.
Example (event to random variable): draw 2 numbers between 1 and 4, with replacement, and let the r.v. X be their sum. The 16 equally likely outcomes 11, 12, …, 44 induce a probability function on 𝒳:

x       2     3     4     5     6     7     8
P(X=x)  1/16  2/16  3/16  4/16  3/16  2/16  1/16
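The induced probability function can be recovered by brute-force enumeration; a short Python sketch (illustrative):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Draw two numbers from {1, 2, 3, 4} with replacement: 16 equally likely outcomes.
counts = Counter(i + j for i, j in product(range(1, 5), repeat=2))

# Induced probability function on the values of X = sum of the two draws.
pmf = {x: Fraction(c, 16) for x, c in sorted(counts.items())}
print(pmf)
```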

Cumulative Distribution Functions
F_X(x) = P(X ≤ x) for all x ∈ 𝒳
The CDF completely determines the probability distribution of an RV.
A function F(x) is a CDF iff:
- lim_{x→−∞} F(x) = 0 and lim_{x→∞} F(x) = 1
- F(x) is a non-decreasing function of x
- F(x) is right-continuous: for all x_0, lim_{x→x_0, x>x_0} F(x) = F(x_0)

Identically distributed RVs
Two random variables X_1 and X_2 are identically distributed iff for all sets of values A, P(X_1 ∈ A) = P(X_2 ∈ A).
So that means the variables are equal? NO.
Example: toss a coin 3 times and let X_H and X_T represent the number of heads and tails respectively. They have the same distribution, but X_H = 3 − X_T.

Discrete vs. Continuous RVs
Step CDF ⟺ 𝒳 is discrete; probability mass function f_X(x) = P(X = x) for all x.
Continuous CDF ⟺ 𝒳 is continuous; probability density function f_X with F_X(x) = ∫_{−∞}^x f_X(t) dt for all x.

Interval Probabilities
Obtained by integrating the density (the area under the curve) over the interval:
P(x_1 ≤ X ≤ x_2) = ∫_{x_1}^{x_2} f_X(x) dx
This explains why P(X = x) = 0 for continuous distributions:
P(X = x) ≤ lim_{ε→0, ε>0} [F_X(x) − F_X(x − ε)] = 0
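As an illustration, the interval formula can be evaluated numerically; a Python sketch using a midpoint rule on the standard normal density (an illustrative choice of f_X, not part of the slides):

```python
import math

# Standard normal density (illustrative choice of f_X).
def density(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# P(x1 <= X <= x2) via a midpoint-rule approximation of the integral.
def interval_prob(x1, x2, n=10_000):
    h = (x2 - x1) / n
    return h * sum(density(x1 + (k + 0.5) * h) for k in range(n))

p = interval_prob(-1.0, 1.0)
exact = math.erf(1 / math.sqrt(2))   # Phi(1) - Phi(-1) in closed form
print(p, exact)                      # both ~0.6827
```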

Moments
The expected value of a function g of an r.v. X ~ P is defined as E[g(X)] = ∫ g(x) P(x) dx.
nth moment of a probability distribution: μ_n = ∫ x^n P(x) dx; the mean is μ = μ_1.
nth central moment: μ_n′ = ∫ (x − μ)^n P(x) dx; the variance is σ² = μ_2′.
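For a discrete distribution the integrals become sums; a Python sketch computing the moments of a fair die (illustrative):

```python
from fractions import Fraction

# A fair six-sided die as a discrete distribution.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def moment(n):          # n-th raw moment: E[X^n]
    return sum(x**n * p for x, p in pmf.items())

mu = moment(1)          # mean = mu_1

def central_moment(n):  # n-th central moment: E[(X - mu)^n]
    return sum((x - mu)**n * p for x, p in pmf.items())

var = central_moment(2)  # variance = 2nd central moment
print(mu, var)           # 7/2 35/12
```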

Multivariate Distributions
Example: uniformly draw (X, Y) from the set {1,2,3}², and let W = X + Y and V = |X − Y|.
Joint: P((X, Y) ∈ A) = Σ_{(x,y)∈A} f(x, y)
Marginal: f_Y(y) = Σ_x f(x, y)
For independent RVs: f(x_1, …, x_n) = f_{X_1}(x_1) ⋯ f_{X_n}(x_n)

w       2    3    4    5    6        v       0    1    2
P(W=w)  1/9  2/9  3/9  2/9  1/9      P(V=v)  3/9  4/9  2/9
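The marginal pmfs of W and V can be recovered by enumerating the 9 equally likely pairs; a Python sketch (illustrative):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# (X, Y) uniform on {1,2,3}^2: 9 equally likely pairs.
pairs = list(product(range(1, 4), repeat=2))

pW = {w: Fraction(c, 9) for w, c in sorted(Counter(x + y for x, y in pairs).items())}
pV = {v: Fraction(c, 9) for v, c in sorted(Counter(abs(x - y) for x, y in pairs).items())}
print(pW)   # {2: 1/9, 3: 2/9, 4: 1/3, 5: 2/9, 6: 1/9}
print(pV)   # {0: 1/3, 1: 4/9, 2: 2/9}
```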


Properties of Common Distributions: Bernoulli
X = 1 with probability p, X = 0 with probability 1 − p, where 0 ≤ p ≤ 1.
Mean and variance:
E[X] = 1·p + 0·(1 − p) = p
Var X = (1 − p)²·p + (0 − p)²·(1 − p) = p(1 − p)
MLE: the sample mean.
Connections to other distributions:
- If X_1, …, X_n ~ Bern(p) independently, then Y = Σ_{i=1}^n X_i is Binomial(n, p).
- Geometric distribution: the number of Bernoulli trials needed to get one success.
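A small simulation illustrating that the sample mean recovers p; a Python sketch with an arbitrary illustrative p = 0.3:

```python
import random

random.seed(0)
p, n = 0.3, 100_000   # illustrative true parameter and sample size

# Bernoulli(p) sample; the MLE of p is the sample mean.
xs = [1 if random.random() < p else 0 for _ in range(n)]
p_hat = sum(xs) / n
print(p_hat)   # close to 0.3
```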

Properties of Common Distributions: Binomial
P(X = x; n, p) = (n choose x) p^x (1 − p)^(n−x)
Mean and variance:
E[X] = Σ_{x=0}^n x (n choose x) p^x (1 − p)^(n−x) = … = np
Var X = np(1 − p), using Var X = E[X²] − (E[X])²
NOTE: a sum of independent Binomials with the same p is Binomial; conditionals on Binomials are Binomial.
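A Python sketch confirming E[X] = np and Var X = E[X²] − (E[X])² = np(1 − p) directly from the pmf, for illustrative values n = 10, p = 0.3:

```python
from math import comb

n, p = 10, 0.3   # illustrative values

# Binomial pmf: P(X = x) = C(n, x) p^x (1-p)^(n-x)
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * pmf[x] for x in range(n + 1))      # should equal n*p = 3.0
second = sum(x * x * pmf[x] for x in range(n + 1))
var = second - mean**2                            # Var X = E[X^2] - (EX)^2 = 2.1
print(mean, var)
```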

Properties of the Normal Distribution
Operations on normally distributed variables:
- If X_1, X_2 ~ Norm(0, 1) are independent, then X_1 ± X_2 ~ N(0, 2) and X_1 / X_2 ~ Cauchy(0, 1).
- If X_1 ~ Norm(μ_1, σ_1²), X_2 ~ Norm(μ_2, σ_2²) and X_1 ⊥ X_2, then Z = X_1 + X_2 ~ Norm(μ_1 + μ_2, σ_1² + σ_2²).
- If (X, Y) is bivariate normal with variances σ_X², σ_Y² and correlation ρ, then X + Y is still normally distributed: its mean is the sum of the means and its variance is σ_{X+Y}² = σ_X² + σ_Y² + 2ρσ_Xσ_Y.
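A quick simulation of the sum rule for independent normals; the means and variances below are arbitrary illustrative choices:

```python
import random, statistics

random.seed(1)
n = 200_000

# Independent X1 ~ N(1, 2^2) and X2 ~ N(-1, 3^2); their sum should be N(0, 13).
z = [random.gauss(1, 2) + random.gauss(-1, 3) for _ in range(n)]
z_mean, z_var = statistics.mean(z), statistics.variance(z)
print(z_mean, z_var)   # ~0, ~13
```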


Estimating Distribution Parameters
Let X_1, …, X_n be a sample from a distribution parameterized by θ. How can we estimate:
- The mean of the distribution? Possible estimator: X̄ = (1/n) Σ_{i=1}^n X_i
- The median of the distribution? Possible estimator: median(X_1, …, X_n)
- The variance of the distribution? Possible estimator: (1/n) Σ_{i=1}^n (X_i − X̄)²
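The three candidate estimators, sketched in Python on a simulated N(5, 2²) sample (the distribution is an illustrative choice):

```python
import random, statistics

random.seed(2)
# Simulated sample from N(5, 2^2), so true mean = median = 5, variance = 4.
xs = [random.gauss(5, 2) for _ in range(10_000)]

mean_hat = sum(xs) / len(xs)                               # estimator of the mean
median_hat = statistics.median(xs)                         # estimator of the median
var_hat = sum((x - mean_hat) ** 2 for x in xs) / len(xs)   # 1/n variance estimator
print(mean_hat, median_hat, var_hat)   # ~5, ~5, ~4
```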

Bias-Variance Tradeoff
When estimating a quantity θ, we evaluate the performance of an estimator θ̂ by computing its risk – the expected value of a loss function: R(θ, θ̂) = E[L(θ, θ̂)], where L could be:
- mean squared error loss
- 0/1 loss
- hinge loss (used for SVMs)
Bias-variance decomposition: with Y = f(x) + ε,
Err(x) = E[(Y − f̂(x))²] = (E[f̂(x)] − f(x))² + E[(f̂(x) − E[f̂(x)])²] + σ_ε², i.e. bias² + variance + irreducible noise.
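As a concrete example of bias, the 1/n variance estimator has expectation (n−1)/n · σ², which a simulation makes visible; a Python sketch (illustrative, with small samples from N(0, 1)):

```python
import random

random.seed(3)
n, trials = 5, 200_000   # small samples from N(0, 1), so sigma^2 = 1

# Average the 1/n variance estimator over many samples: its expectation
# is (n-1)/n * sigma^2 = 0.8, not sigma^2 = 1 -- the estimator is biased.
total = 0.0
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    m = sum(xs) / n
    total += sum((x - m) ** 2 for x in xs) / n
avg = total / trials
print(avg)   # ~0.8
```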


Conditional Probabilities
Review: conditionals. P(X | Y) = P(X, Y) / P(Y); note that X | Y is a different r.v.
Conditional independence: X ⊥ Y | Z, i.e. X and Y are conditionally independent given Z, iff P(X, Y | Z) = P(X | Z) P(Y | Z).
Properties of conditional independence:
- Symmetry: X ⊥ Y | Z ⟺ Y ⊥ X | Z
- Decomposition: X ⊥ (Y, W) | Z ⇒ X ⊥ Y | Z
- Weak union: X ⊥ (Y, W) | Z ⇒ X ⊥ Y | (Z, W)
- Contraction: (X ⊥ W | Z, Y) and (X ⊥ Y | Z) ⇒ X ⊥ (Y, W) | Z
Can you prove these?


Priors and Posteriors
So far we have introduced the likelihood function P(Data | θ) – the likelihood of the data given the distribution parameter θ:
θ_MLE = argmax_θ P(Data | θ)
What if not all values of θ are equally likely a priori? Then θ itself is distributed according to a prior P(θ). Apply Bayes' rule:
P(θ | Data) = P(Data | θ) P(θ) / P(Data)
θ_MAP = argmax_θ P(θ | Data) = argmax_θ P(Data | θ) P(θ)
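A worked Beta-Bernoulli example contrasting the two estimates; the data counts and the Beta(2, 2) prior are illustrative choices, and the MAP formula below is the standard posterior mode of a Beta:

```python
# Bernoulli likelihood with a Beta(a, b) prior (conjugate), h heads in n flips.
# The posterior is Beta(h + a, n - h + b); its mode gives the MAP estimate.
h, n = 7, 10             # illustrative data
a, b = 2.0, 2.0          # illustrative prior, pulling estimates toward 1/2

theta_mle = h / n                            # argmax of the likelihood
theta_map = (h + a - 1) / (n + a + b - 2)    # mode of Beta(h+a, n-h+b)
print(theta_mle, theta_map)   # 0.7 vs ~0.667: the prior shrinks the estimate
```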

Conjugate Priors
If the posterior distribution P(θ | Data) is in the same family as the prior distribution P(θ), then the prior and the posterior are called conjugate distributions, and P(θ) is called a conjugate prior for the likelihood.
Some examples:

Likelihood                             Conjugate prior
Bernoulli / Binomial                   Beta
Poisson                                Gamma
(MV) Normal with known (co)variance    Normal
Exponential                            Gamma
Multinomial                            Dirichlet

How to compute the parameters of the posterior? I'll send a derivation.

Probabilistic Inference
Problem: You're planning a weekend biking trip with your best friend, Min. Alas, your path to outdoor leisure is strewn with many hurdles. If it happens to rain, your chances of biking are reduced by half, not counting other factors. Independently of this, Min might be able to bring a tent, the lack of which will only matter if you notice the symptoms of a flu before the trip. Finally, the trip won't happen if your advisor is unhappy with your weekly progress report.
Variables:
- O – the outdoor trip happens
- A – the advisor is happy
- R – it rains that day
- T – you have a tent
- F – you show flu symptoms

Probabilistic Inference
How many parameters determine this model?
- P(A | O) ⇒ 1 parameter
- P(R | O) ⇒ 1 parameter
- P(F, T | O) ⇒ 3 parameters
In this problem the values are given; otherwise, we would have had to estimate them.

Probabilistic Inference
The weather forecast is optimistic: the chances of rain are 20%. You've barely slacked off this week, so your advisor is probably happy – let's give it 80%. Luckily, you don't seem to have the flu. What are the chances that the trip will happen?
Think of how you would do this.
- Hint #1: do the variables F and T influence the result in this case?
- Hint #2: the combinations of values for A and R form a partition – use one of the partition formulas we learned.
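A sketch of the partition computation in Python. NOTE: the slide's actual conditional probabilities are not reproduced in this transcript, so the base biking probability given no rain (0.9) is a hypothetical stand-in; only P(R) = 0.2 and P(A) = 0.8 come from the text:

```python
# ASSUMED numbers (the slide's actual values are not in this transcript):
# P(A) = 0.8 and P(R) = 0.2 come from the text; the base probability of
# biking given a happy advisor and no rain (0.9) is hypothetical; rain
# halves it; an unhappy advisor cancels the trip. F and T drop out since
# there are no flu symptoms.
pA, pR = 0.8, 0.2
p_O_given = {(True, True): 0.45, (True, False): 0.90,    # advisor happy
             (False, True): 0.0, (False, False): 0.0}    # advisor unhappy

# The four (A, R) combinations partition the space:
# P(O) = sum over (a, r) of P(O | a, r) P(a) P(r)
pO = sum(p_O_given[(a, r)]
         * (pA if a else 1 - pA)
         * (pR if r else 1 - pR)
         for a in (True, False) for r in (True, False))
print(pO)   # ~0.648 under these assumed numbers
```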


Questions?