
Computer simulation Sep. 9, 2013

QUIZ 2
Determine whether the following experiment has a discrete or a continuous outcome: a fair die is tossed and the number of dots on the face is noted. Identify the random experiment, the set of outcomes, and the probability of each possible outcome.

Introduction
Introduce computer simulations (ComSim). Show how to use ComSim to provide counterexamples (ComSim cannot be used to prove theorems) and to simulate the outcomes of a discrete random variable. Give examples of typical ComSim uses in probability. Monte Carlo methods are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results; i.e., they run simulations many times over in order to estimate probabilities heuristically, just as one would by actually playing and recording the results in a real casino: hence the name.

Why Use Computer Simulations?
- To provide counterexamples to proposed theorems.
- To build intuition by experimenting with random numbers.
- To lend evidence to a conjecture (an unproven proposition).

Building intuition using ComSim
If U1 and U2 are the outcomes of two experiments, each a number from 0 to 1, what are the probabilities of X = U1 + U2? The mathematical answer will be given in later lectures. Assume for now that X is equally likely to lie anywhere in the interval [0, 2]. Let's check whether this intuition is correct by carrying out a ComSim: generate values of U1 and U2 and sum them to obtain X; repeat this procedure M times; then build a histogram, which gives the number of outcomes in each bin.
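The procedure above can be sketched in a few lines. The slides' own code is not reproduced in this transcript, so this is a minimal Python sketch (the seed and M = 1000 are illustrative choices):

```python
import random

random.seed(0)

M = 1000  # number of repetitions of the experiment

# One trial: U1, U2 uniform on [0, 1); X = U1 + U2
outcomes = [random.random() + random.random() for _ in range(M)]

# Four bins over [0, 2]: [0, 0.5], (0.5, 1], (1, 1.5], (1.5, 2]
counts = [0, 0, 0, 0]
for x in outcomes:
    counts[min(int(x / 0.5), 3)] += 1
```

Printing `counts` shows far more outcomes in the two middle bins than in the outer ones, contradicting the "equally likely" guess.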

Building intuition using ComSim
Assume the following M = 8 outcomes were obtained: {1.7, 0.7, 1.2, 1.3, 1.8, 1.4, 0.6, 0.4}. Choosing the four bins [0, 0.5], (0.5, 1], (1, 1.5], (1.5, 2], we get the bin counts 1, 2, 3, 2.

Building intuition using ComSim
With M = 1000 outcomes, the histogram makes it clear that the values of X are not equally likely: values near the middle of [0, 2] are more probable.

Building intuition using ComSim
The probabilities are higher near one because there are more ways to obtain these values. X = 2 can be obtained only from U1 = U2 = 1, but X = 1 can be obtained from U1 = U2 = 1/2, or U1 = 1/4 and U2 = 3/4, or U1 = 3/4 and U2 = 1/4, etc.

Building intuition using ComSim
The result can be extended to the addition of three or more experimental outcomes. Define X3 = U1 + U2 + U3 and X4 = U1 + U2 + U3 + U4. The histogram now appears more like a bell-shaped (Gaussian) curve. Conjecture: as we add more outcomes, we obtain a Gaussian.
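The conjecture can be probed numerically. A small Python sketch (parameters are illustrative) draws X3 and X4 and checks that the sample mean and variance of X4 match the theoretical values n/2 = 2 and n/12 = 1/3 for a sum of n = 4 uniforms:

```python
import random

random.seed(1)

M = 5000  # number of realizations

# X_n = U_1 + ... + U_n, with each U_i uniform on [0, 1)
def simulate_sums(n, M):
    return [sum(random.random() for _ in range(n)) for _ in range(M)]

x3 = simulate_sums(3, M)
x4 = simulate_sums(4, M)

# Sample mean and variance of X_4; theory gives 2 and 1/3
mean4 = sum(x4) / M
var4 = sum((x - mean4) ** 2 for x in x4) / M
```

A histogram of `x4` is noticeably more bell-shaped than the triangular histogram of U1 + U2.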

ComSim of Random Phenomena
A random variable (RV) X is a variable whose value is subject to variation due to chance. Discrete: the number of dots on a die; X can take on values in the set {1, 2, 3, 4, 5, 6}. Continuous: the distance of a dart from the center of a dartboard of radius r = 1; {r : 0 ≤ r ≤ 1}. To determine various properties of X we perform a number of experiments (trials), denoted by M. Assume that X takes values {x1, x2, …, xN} with probabilities {p1, p2, …, pN}.

ComSim of Random Phenomena
As an example, if N = 3 we can generate M values of X by using a short code segment. For a continuous RV X that is Gaussian we can use a Gaussian random number generator.
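The code segments shown on the original slides are not included in this transcript. A Python sketch of both steps follows; the values x_i and probabilities p_i below are hypothetical stand-ins for the slide's N = 3 example:

```python
import random

random.seed(0)

M = 1000

# Hypothetical discrete RV with N = 3 values x_i and probabilities p_i
values = [0.0, 1.0, 2.0]
probs = [0.2, 0.5, 0.3]   # must sum to 1

# Inverse-CDF sampling: draw u uniform on [0, 1) and pick the value
# whose cumulative probability interval contains u
def draw_discrete(values, probs):
    u = random.random()
    cum = 0.0
    for v, p in zip(values, probs):
        cum += p
        if u < cum:
            return v
    return values[-1]  # guard against floating-point round-off

x = [draw_discrete(values, probs) for _ in range(M)]

# Continuous Gaussian RV: the library's Gaussian generator
g = [random.gauss(0.0, 1.0) for _ in range(M)]
```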

Determining Characteristics of RV
The probabilities of the outcomes in the discrete case, and the PDF in the continuous case, are complete descriptions of a random phenomenon. Consider a discrete RV, the outcome of a coin toss: let X = 1 if a head is observed, with probability p, and X = 0 if a tail is observed, with probability 1 − p. To determine the probability of a head we could toss the coin a large number of times and estimate p. We can simulate this result using ComSim and estimate p; however, the estimate is not always exact: it comes out slightly larger than the true value of p = 0.4 due to imperfections of the random number generator.
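The coin-toss estimate can be sketched as follows (a minimal Python version, assuming the slide's true value p = 0.4 and an illustrative M):

```python
import random

random.seed(0)

M = 10000
p = 0.4  # true probability of a head (the slide's example value)

# Each toss: X = 1 (head) with probability p, X = 0 (tail) otherwise
tosses = [1 if random.random() < p else 0 for _ in range(M)]

# Relative frequency of heads estimates p
p_hat = sum(tosses) / M
```

For large M the estimate `p_hat` is close to, but generally not exactly equal to, 0.4.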

Probability density function (PDF) estimation
The PDF can be estimated by first forming the histogram and then dividing the number of outcomes in each bin by M to obtain a probability. Then, to obtain the PDF pX(x), recall that the probability of X taking on a value in an interval is the area under the PDF: if a = x0 − Δx/2 and b = x0 + Δx/2, where Δx is small, then

P[a ≤ X ≤ b] = ∫ab pX(x) dx ≈ pX(x0) Δx.

Hence, we need only divide the estimated probability by the bin width Δx.
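This estimator is easy to sketch in Python (the range, bin count, and M below are illustrative choices for a standard Gaussian):

```python
import random

random.seed(0)

M = 10000
lo, hi = -4.0, 4.0   # histogram range (covers nearly all the mass)
nbins = 40
dx = (hi - lo) / nbins  # bin width

samples = [random.gauss(0.0, 1.0) for _ in range(M)]

# Estimated PDF at each bin: (count in bin) / (M * dx)
counts = [0] * nbins
for s in samples:
    if lo <= s < hi:
        counts[int((s - lo) / dx)] += 1
pdf_hat = [c / (M * dx) for c in counts]
```

The estimated curve peaks near 0.4 at x = 0, matching the standard Gaussian density 1/√(2π) ≈ 0.399.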

Probability density function (PDF) estimation
Applying this estimation procedure to a set of simulated outcomes drawn from a Gaussian PDF, we obtain the estimated PDF.

Probability of an interval
To determine P[a ≤ X ≤ b] we generate M realizations of X, count the number of outcomes that fall into the interval [a, b], and divide by M. If we let a = 2 and b = ∞ then, for a standard Gaussian X, numerical integration gives P[X ≥ 2] ≈ 0.0228, and therefore very few realizations can be expected to fall in this interval.
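A Python sketch of this experiment, assuming a standard Gaussian X, compares the relative-frequency estimate with the exact tail probability Q(2) computed via the complementary error function:

```python
import math
import random

random.seed(0)

M = 100000
a = 2.0

# Fraction of M standard-Gaussian realizations falling in [2, infinity)
samples = [random.gauss(0.0, 1.0) for _ in range(M)]
p_hat = sum(1 for s in samples if s >= a) / M

# Exact tail probability: Q(a) = 0.5 * erfc(a / sqrt(2)) ~ 0.0228 for a = 2
p_exact = 0.5 * math.erfc(a / math.sqrt(2.0))
```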

Determining Characteristics of RV
The average (mean) value of X is estimated by the sample mean (1/M) Σ xm over the M realizations. The mean value of a transformed variable, e.g. f(X) = X², is estimated in the same way by averaging f over the realizations, (1/M) Σ xm².
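The slide's formulas (shown as images in the original) amount to simple averages. A Python sketch for a standard Gaussian X, where E[X] = 0 and E[X²] = 1:

```python
import random

random.seed(0)

M = 100000

samples = [random.gauss(0.0, 1.0) for _ in range(M)]

# Sample mean estimates E[X]
mean_hat = sum(samples) / M

# Averaging f(x) = x^2 over the realizations estimates E[X^2]
mean_sq_hat = sum(s * s for s in samples) / M
```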

Multiple random variables
Consider an experiment: the choice of a point in the square {(x, y) : 0 ≤ x ≤ 1, 0 ≤ y ≤ 1} according to some procedure. It yields two RVs, i.e., the vector [X1, X2]T. The procedure may or may not cause the value of x1 to depend on the value of x2. In the dependent case there is a strong dependency: if, for example, x1 = 0.5, then x2 must lie in the interval [0.25, 0.75].

Multiple Random Variables
Consider two random vectors, where each Ui is generated using rand. The results of M = 1000 realizations are shown as scatter diagrams, one with no dependency and one with dependency.
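The slide's vector definitions are not reproduced in this transcript. A Python sketch of one plausible pair of constructions is below; the dependent case uses x2 = x1/2 + u/2, which is an assumption chosen to match the slide's remark that x1 = 0.5 forces x2 into [0.25, 0.75]:

```python
import random

random.seed(0)

M = 1000

# No dependency: both coordinates drawn independently and uniformly
indep = [(random.random(), random.random()) for _ in range(M)]

# Dependency (hypothetical construction): x2 = x1/2 + u/2, so the
# points fall in a diagonal band of width 0.5
dep = []
for _ in range(M):
    u1, u2 = random.random(), random.random()
    dep.append((u1, u1 / 2.0 + u2 / 2.0))
```

Plotting `indep` fills the unit square uniformly; plotting `dep` shows the diagonal band.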

Monte Carlo simulation
Consider a circle inscribed in a unit square. Given that the circle and the square have a ratio of areas of π/4, the value of π can be approximated using a Monte Carlo method:
1. Draw a square on the ground, then inscribe a circle within it.
2. Uniformly scatter some objects of uniform size (grains of rice or sand) over the square.
3. Count the number of objects inside the circle and the total number of objects.
4. The ratio of the two counts is an estimate of the ratio of the two areas, which is π/4. Multiply the result by 4 to estimate π.
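The four steps above translate directly into a simulation (a minimal Python sketch; M is an illustrative choice):

```python
import random

random.seed(0)

M = 100000

# Scatter M points uniformly over the unit square; the inscribed circle
# has center (0.5, 0.5) and radius 0.5, so P[point inside] = pi/4
inside = 0
for _ in range(M):
    x, y = random.random(), random.random()
    if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.25:
        inside += 1

pi_hat = 4.0 * inside / M  # step 4: multiply the count ratio by 4
```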

Digital Communications
In a phase-shift keyed (PSK) digital system a bit is communicated to the receiver by sending
0 : s0(t) = A cos(2πF0t + π)
1 : s1(t) = A cos(2πF0t)
The receiver is shown as a block diagram (multiplier, lowpass filter, sampler).

Digital Communications
The input to the receiver is the noise-corrupted signal x(t) = si(t) + w(t), where w(t) is the channel noise. Ignoring the noise, the output of the multiplier (which multiplies the input by cos(2πF0t)) will be
si(t) cos(2πF0t) = ±A cos²(2πF0t) = ±(A/2)(1 + cos(4πF0t)),
with + for bit 1 and − for bit 0. Recall that cos²θ = (1 + cos 2θ)/2.

Digital Communications
After the lowpass filter, which filters out the (A/2) cos(4πF0t) part of the signal, and the sampler, we have the sample ±A/2. To model the channel noise we assume that the actual observed value is ξ = A/2 + W (when a 1 is sent), where W is a Gaussian RV.

Digital Communications
It is of interest to determine how the error depends on the signal amplitude A. If A is a large positive amplitude, the chance that the noise will cause an error (or equivalently, that ξ ≤ 0) should be small. The probability of error is Pe = P[A/2 + W ≤ 0]; for Gaussian W with standard deviation σ this evaluates to Pe = Q(A/(2σ)).
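The error probability can be estimated by ComSim and compared against the Gaussian tail formula. A Python sketch follows; the amplitude A and noise standard deviation σ are hypothetical values, since the slide does not specify them:

```python
import math
import random

random.seed(0)

M = 100000
A = 1.0       # signal amplitude (hypothetical value)
sigma = 0.5   # noise standard deviation (hypothetical value)

# A transmitted "1" yields the sample A/2 + W; an error occurs when
# the noise drives the sample to zero or below
errors = sum(1 for _ in range(M)
             if A / 2.0 + random.gauss(0.0, sigma) <= 0.0)
pe_hat = errors / M

# Theoretical value: Pe = Q(A/(2 sigma)) = 0.5 * erfc(A/(2 sigma sqrt(2)))
pe_exact = 0.5 * math.erfc(A / (2.0 * sigma * math.sqrt(2.0)))
```

Increasing A (or decreasing sigma) makes both `pe_hat` and `pe_exact` shrink rapidly, as the slide anticipates.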