CSE 321 Discrete Structures

CSE 321 Discrete Structures Winter 2008 Lecture 21 Probability: Expectation, Analysis of Algorithms

Announcements
Readings:
Probability Theory 6.4 (5.3), Expectation
Advanced Counting Techniques, Ch 7: not covered
Relations, Chapter 8 (Chapter 7)

Highlights from Lecture 20: Bayes' Theorem. Suppose that E and F are events from a sample space S such that p(E) > 0 and p(F) > 0. Then
p(F | E) = p(E | F) p(F) / [ p(E | F) p(F) + p(E | F^c) p(F^c) ]

Testing for disease. The disease is very rare: p(D) = 1/100,000. Testing is accurate: false negative rate p(Y^c | D) = 1%, false positive rate p(Y | D^c) = 0.5%. Suppose you get a positive result; what do you conclude? By Bayes' theorem, p(D | Y) ≈ 1/500.
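The slide's 1/500 estimate can be checked numerically; a minimal sketch of the Bayes computation, using the rates given above:

```python
# Bayes' theorem applied to the rare-disease test on the slide.
p_d = 1 / 100_000        # p(D): prior probability of disease
p_pos_given_d = 0.99     # p(Y | D): positive when diseased (1% false negatives)
p_pos_given_nd = 0.005   # p(Y | D^c): false positive rate

# p(Y) by the law of total probability
p_pos = p_pos_given_d * p_d + p_pos_given_nd * (1 - p_d)

# p(D | Y) = p(Y | D) p(D) / p(Y)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos)  # roughly 0.002, i.e. about 1/500
```

Even with an accurate test, the tiny prior dominates: a positive result still leaves the disease very unlikely.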

Expectation. The expected value of random variable X(s) on sample space S is:
E(X) = Σ_{s ∈ S} p(s) X(s)

Flip a coin until the first head. Expected number of flips?
Probability Space: outcomes H, TH, TTH, TTTH, ..., where the outcome taking i flips has probability 1/2^i.
Computing the expectation: E(X) = Σ_{i ≥ 1} i / 2^i = 2.
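The value 2 can be checked two ways: by truncating the series above, and by a small simulation (not from the slides) of the coin-flipping experiment:

```python
import random

# Monte Carlo check that the expected number of flips to the first head is 2.
def flips_until_head(rng: random.Random) -> int:
    """Flip a fair coin until heads; return the number of flips."""
    flips = 1
    while rng.random() < 0.5:  # tails with probability 1/2
        flips += 1
    return flips

rng = random.Random(0)
trials = 200_000
avg = sum(flips_until_head(rng) for _ in range(trials)) / trials
print(avg)  # close to 2

# Truncated series sum_{i >= 1} i / 2**i
series = sum(i / 2**i for i in range(1, 60))
print(series)  # essentially 2.0
```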

Linearity of Expectation E(X1 + X2) = E(X1) + E(X2) E(aX) = aE(X)

Hashing. H : M → [0..n-1]. If k elements have been hashed to random locations, what is the expected number of elements in bucket j? What is the expected number of collisions when hashing k elements to random locations?

Hashing analysis
Sample space: [0..n-1] × [0..n-1] × ... × [0..n-1]
Random variables:
Xj = number of elements hashed to bucket j
C = total number of collisions
Bij = 1 if element i is hashed to bucket j, 0 otherwise
Cab = 1 if elements a and b are hashed to the same bucket, 0 otherwise
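By linearity of expectation over the indicator variables above, E[Xj] = k/n and E[C] = C(k,2)/n = k(k-1)/(2n). A small simulation (not from the slides) agrees:

```python
import random
from itertools import combinations

# Hash k elements to n uniformly random buckets; estimate E[elements in
# bucket 0] and E[collisions] by averaging over many trials.
def simulate(k: int, n: int, trials: int, rng: random.Random):
    total_in_bucket_0 = 0
    total_collisions = 0
    for _ in range(trials):
        buckets = [rng.randrange(n) for _ in range(k)]
        total_in_bucket_0 += buckets.count(0)
        # a collision is a pair of elements landing in the same bucket
        total_collisions += sum(1 for a, b in combinations(range(k), 2)
                                if buckets[a] == buckets[b])
    return total_in_bucket_0 / trials, total_collisions / trials

rng = random.Random(1)
k, n = 10, 8
avg_bucket, avg_coll = simulate(k, n, 20_000, rng)
print(avg_bucket)  # close to k/n = 1.25
print(avg_coll)    # close to k*(k-1)/(2*n) = 5.625
```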

Counting inversions. Let p1, p2, ..., pn be a permutation of 1...n. The pair (pi, pj) is an inversion if i < j and pi > pj.
4, 2, 5, 1, 3
1, 6, 4, 3, 2, 5
7, 6, 5, 4, 3, 2, 1

Expected number of inversions for a random permutation: each pair of positions is inverted with probability 1/2, so by linearity of expectation the expected number is C(n,2)/2 = n(n-1)/4.
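For small n the expectation n(n-1)/4 can be verified exactly by averaging over all n! permutations; the check below also counts inversions in the slide's first example:

```python
from itertools import permutations, combinations

# Exact check that the average inversion count over all permutations
# of 1..n equals n(n-1)/4.
def inversions(p):
    return sum(1 for i, j in combinations(range(len(p)), 2) if p[i] > p[j])

n = 5
perms = list(permutations(range(1, n + 1)))
exact_avg = sum(inversions(p) for p in perms) / len(perms)
print(exact_avg)  # n(n-1)/4 = 5.0

print(inversions((4, 2, 5, 1, 3)))  # 6 — the first example on the slide
```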

Insertion sort
for i := 1 to n-1 {
    j := i;
    while (j > 0 and A[j-1] > A[j]) {
        swap(A[j-1], A[j]);
        j := j - 1;
    }
}
Example: 4 2 5 1 3

Expected number of swaps for Insertion Sort: each swap removes exactly one inversion, so the number of swaps equals the number of inversions of the input. For a random permutation the expected number of swaps is therefore n(n-1)/4.
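The swaps-equal-inversions argument can be checked directly; a Python rendering of the slide's pseudocode that counts swaps, compared against the inversion count:

```python
from itertools import combinations

# Insertion sort as on the slide; each swap removes exactly one inversion,
# so the swap count equals the inversion count of the input.
def insertion_sort_swaps(a):
    a = list(a)
    swaps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]  # swap(A[j-1], A[j])
            swaps += 1
            j -= 1
    return a, swaps

def inversions(p):
    return sum(1 for i, j in combinations(range(len(p)), 2) if p[i] > p[j])

data = [4, 2, 5, 1, 3]
sorted_a, swaps = insertion_sort_swaps(data)
print(sorted_a, swaps)   # [1, 2, 3, 4, 5] 6
print(inversions(data))  # 6 — same as the swap count
```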

Left to right maxima
max_so_far := A[0];
for i := 1 to n-1
    if (A[i] > max_so_far)
        max_so_far := A[i];
Example: 5, 2, 9, 14, 11, 18, 7, 16, 1, 20, 3, 19, 10, 15, 4, 6, 17, 12, 8

What is the expected number of left-to-right maxima in a random permutation? Position i (0-indexed) holds a left-to-right maximum exactly when A[i] is the largest of the first i+1 elements, which happens with probability 1/(i+1). By linearity of expectation, the expected number is 1 + 1/2 + ... + 1/n = H_n, the nth harmonic number.
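For small n the harmonic-number answer can be verified exactly, averaging over all permutations with rational arithmetic:

```python
from itertools import permutations
from fractions import Fraction

# Exact check that the average number of left-to-right maxima over all
# permutations of 1..n equals the harmonic number H_n = 1 + 1/2 + ... + 1/n.
def left_to_right_maxima(p):
    count, best = 0, 0
    for x in p:
        if x > best:  # new maximum seen scanning left to right
            count += 1
            best = x
    return count

n = 6
perms = list(permutations(range(1, n + 1)))
avg = Fraction(sum(left_to_right_maxima(p) for p in perms), len(perms))
h_n = sum(Fraction(1, i) for i in range(1, n + 1))
print(avg, h_n)  # both 49/20
```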