D. Samet, I. Samet, D. Schmeidler: To switch or not to switch?

D. Samet, I. Samet, D. Schmeidler

Two sums of money, S and 2S, are put in two envelopes. With probability 1/2 the double sum is in the blue envelope… To switch or not to switch?

Two sums of money, S and 2S, are put in two envelopes. With probability 1/2 the double sum is in the blue envelope; with probability 1/2 it is in the red. To switch or not to switch?

To switch or not to switch? An envelope is selected at random and handed to you. You can take the money in the envelope or take the money in the other envelope. The situation is symmetric. Why switch?

To switch or not to switch? An argument for switching. An envelope is selected at random and handed to you. Suppose there is X in the red envelope. In the blue envelope there is 2X or ½X, each with probability 1/2. The expected sum in the blue envelope: ½(2X) + ½(½X) = 1¼X. This is true for any X, so switch, even before you see the amount in the red envelope.

To switch or not to switch? The same argument holds if you receive the blue envelope. No matter which envelope you get, switch without opening it! Mind-blowing!
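The conclusion is absurd, and a simulation of the actual setup makes the absurdity concrete. The following sketch is not from the slides: the amount S and the number of trials are arbitrary illustrative choices. It fixes S, puts S and 2S in the envelopes, hands you one at random, and compares keeping with switching.

```python
import random

def simulate(S=100, trials=100_000):
    """Two-envelope setup: S and 2S are placed in the envelopes and one
    of them is handed over uniformly at random."""
    keep_total = 0
    switch_total = 0
    for _ in range(trials):
        envelopes = [S, 2 * S]
        random.shuffle(envelopes)
        handed, other = envelopes
        keep_total += handed      # payoff if you keep the envelope you got
        switch_total += other     # payoff if you switch
    return keep_total / trials, switch_total / trials

keep_avg, switch_avg = simulate()
print(f"average if you keep:   {keep_avg:.1f}")    # about 1.5 * S
print(f"average if you switch: {switch_avg:.1f}")  # also about 1.5 * S
```

Both averages are about 1.5S, so switching gains nothing. The flaw must lie in the step that treats "2X or ½X, each with probability 1/2" as valid for every observed X, which is exactly what the later slides pin down.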

Literature on the puzzle: John E. Littlewood (Erwin Schrödinger), 1953; Kraitchik, M.; Zabell, S. L.; Nalebuff, B.; Christensen, R. and Utts, J., 1994; Jackson, F., Menzies, P. and Oppy, G.; Castel, P. and Diderik, B.; Sobel, J. H.; Brams, S. J. and Kilgour, D. M.; Chae, K. C.; Broome, J.; Bruss, F. T., 1997; Arntzenius, F. and McCarthy, D.; Merryfield, K. G., Viet, N. and Watson, S.; Scott, A. D. and Scott, M.; McGrew, T., Shier, D. and Silverstein, H. S., 1998; Norton, J. D.; Clark, M. and Shackel, N.; Horgan, T.; Blachman, N. M. and Kilgour, D. M., 2002; Chase, J.; Chalmers, D.

I write two real numbers x < y on two slips of paper and put each in one of the envelopes. Which is the larger one? An envelope is selected at random and you observe the number in it. You can bet that the number is the larger of the two, or the smaller. If you are right I pay you 1; if you are wrong you pay me 1.

What can you guarantee? Always bet that the number you observe is the larger: the probability you are right is 1/2, and your expected payoff is 0. Can you guarantee that your expected payoff is positive?

What can you guarantee? A threshold strategy: fix a number b (for "big"). If the observed number is ≥ b, bet it is the larger; if the observed number is < b, bet it is the smaller.

What can you guarantee? Case 1: b ≤ x < y. Observing either x or y, you bet it is the larger. Your expected payoff is 0.

What can you guarantee? Case 2: x < y < b. Observing either x or y, you bet it is the smaller. Your expected payoff is 0.

What can you guarantee? Case 3: x < b ≤ y. Observing x, you bet it is the smaller; observing y, you bet it is the larger. You gain 1 for sure!
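A small sketch of the case analysis above; it is not from the slides, and the particular numbers passed in are arbitrary. It computes the expected payoff of the threshold strategy for a given pair x < y and threshold b, averaging over which of the two slips is observed.

```python
def threshold_payoff(x, y, b):
    """Expected payoff of the threshold strategy with threshold b when the
    slips hold x < y and each slip is observed with probability 1/2."""
    def payoff(observed, other):
        bet_larger = observed >= b  # the threshold rule
        correct = (observed > other) if bet_larger else (observed < other)
        return 1 if correct else -1
    return 0.5 * payoff(x, y) + 0.5 * payoff(y, x)

print(threshold_payoff(3, 7, b=1))   # Case 1: b <= x < y   ->  0
print(threshold_payoff(3, 7, b=10))  # Case 2: x < y < b    ->  0
print(threshold_payoff(3, 7, b=5))   # Case 3: x < b <= y   ->  1
```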

What can you guarantee? A mixed strategy: choose b from a normal distribution, or from any distribution that assigns positive probability to every non-trivial interval.

What can you guarantee? With probability p, b ≤ x < y: your expected payoff is 0. With probability q, x < y < b: your expected payoff is 0. With probability r > 0, x < b ≤ y: you gain 1 for sure. This mixed strategy therefore guarantees that your expected payoff is positive.
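A sketch of the mixed strategy; not from the slides, and the fixed pair x, y and the standard normal distribution for b are illustrative choices. For any fixed x < y the estimated expected payoff is positive: it approaches r = P(x < b ≤ y), the probability that the random threshold separates the two numbers.

```python
import random

def mixed_strategy_payoff(x, y, trials=200_000):
    """Estimate the expected payoff when the threshold b is drawn from a
    standard normal distribution and one of x, y is observed at random."""
    total = 0
    for _ in range(trials):
        b = random.gauss(0.0, 1.0)  # random threshold
        observed, other = (x, y) if random.random() < 0.5 else (y, x)
        bet_larger = observed >= b
        correct = (observed > other) if bet_larger else (observed < other)
        total += 1 if correct else -1
    return total / trials

# Positive for any fixed x < y; with these values it is roughly 0.27.
print(mixed_strategy_payoff(-0.3, 0.4))
```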

What can you guarantee? You can guarantee that your expected payoff is positive. Really??? Therefore, I cannot guarantee myself an expected payoff of zero.

Why can’t I guarantee 0? P is a probability distribution over pairs of numbers x₁ and x₂ in the red and blue envelopes. The (x₁, x₂) plane is split by the diagonal into the two events x₂ > x₁ and x₁ > x₂.

Why can’t I guarantee 0? Property 1: given any set A of values of x₁, the probability of x₂ > x₁ and the probability of x₁ > x₂ are each 1/2: P(x₂ > x₁ | A) = 1/2 and P(x₁ > x₂ | A) = 1/2.

Why can’t I guarantee 0? Property 2: given any set B of values of x₂, the probability of x₂ > x₁ and the probability of x₁ > x₂ are each 1/2: P(x₂ > x₁ | B) = 1/2 and P(x₁ > x₂ | B) = 1/2.

Why can’t I guarantee 0? Suppose I use a mixed strategy P with these two properties. Then, given any information you get about one of the numbers, the probability that it is the larger is 1/2, so your bet must result in a zero expected payoff. But you can guarantee a positive expected payoff. A contradiction! There is no probability P with these two properties.

Explaining puzzle 1. An assumption is made: no matter which envelope you hold and which number you observe, the probability that you have the bigger number is 1/2. This assumption presupposes a probability distribution P over pairs of sums, which puzzle 2 shows does not exist.

A direct proof. Fix a number c. In the (x₁, x₂) plane, consider the two rectangles A₁ = {x₁ ≤ c < x₂} and A₂ = {x₂ ≤ c < x₁}, and the two triangles Q₁ = {x₁ < x₂ ≤ c} and Q₂ = {x₂ < x₁ ≤ c} inside the square where both numbers are at most c. Conditioning on x₁ ≤ c (Property 1): P(Q₁) + P(A₁) = P(Q₂). Conditioning on x₂ ≤ c (Property 2): P(Q₁) = P(Q₂) + P(A₂). Together these give P(A₁) + P(A₂) = 0, so P(A₁) = P(A₂) = 0 for every c. Taking the union over all rational c yields P(x₂ > x₁) = 0 and P(x₁ > x₂) = 0, contradicting the two properties.

We have shown the following: there is no probability distribution P over pairs of numbers x₁ and x₂ such that (Property 1) given any set A of values of x₁, the probabilities of x₂ > x₁ and of x₂ < x₁ are each 1/2, and (Property 2) given any set B of values of x₂, the probabilities of x₂ > x₁ and of x₂ < x₁ are each 1/2. We now show that the following generalization holds...

There is no probability distribution P over pairs of numbers x₁ and x₂ such that: Property 1: given any set A of values of x₁, the probabilities of x₂ > x₁ and of x₂ < x₁ are each 1/2. Property 2: given any set B of values of x₂, the probabilities of x₂ > x₁ and of x₂ < x₁ are each 1/2. We now allow the events x₂ > x₁, x₂ < x₁, and also x₂ = x₁ to have any probabilities.

There is no probability distribution P over pairs of numbers x₁ and x₂ such that: Property 1: given any set A of values of x₁, the probabilities of x₂ > x₁, x₂ = x₁, and x₂ < x₁ are p, q, r. Property 2: given any set B of values of x₂, the probabilities of x₂ > x₁ and of x₂ < x₁ are each 1/2.

There is no probability distribution P over pairs of numbers x₁ and x₂ such that: Property 1: given any set A of values of x₁, the probabilities of x₂ > x₁, x₂ = x₁, and x₂ < x₁ are p, q, r. Property 2: given any set B of values of x₂, the probabilities of x₂ > x₁, x₂ = x₁, and x₂ < x₁ are p, q, r. This is wrong if any of p, q, r is 1.

There is no probability distribution P over pairs of numbers x₁ and x₂ such that: Property 1: given any set A of values of x₁, the probabilities of x₂ > x₁, x₂ = x₁, and x₂ < x₁ are p, q, r, all ≠ 1. Property 2: given any set B of values of x₂, the probabilities of x₂ > x₁, x₂ = x₁, and x₂ < x₁ are p, q, r, all ≠ 1.

The one-observation theorem. X₁, …, Xₙ are n random variables. Y = f(X₁, …, Xₙ) is their order, e.g. X₇ < X₂ = X₄ < … If Y depends on (X₁, …, Xₙ), then it depends on at least one of the variables.
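To make "their order" concrete, here is a small sketch; it is not from the slides, and representing the order as a chain of inequalities in a string is just one convenient encoding. The order records which variables are smaller, equal, or larger, but not their actual values.

```python
def order_pattern(values):
    """Return the order of the values as a chain such as 'X2 < X1 = X3 < X4'."""
    # Sort the indices by value, then group indices whose values are equal.
    indexed = sorted(enumerate(values, start=1), key=lambda pair: pair[1])
    groups, current = [], [indexed[0]]
    for idx, val in indexed[1:]:
        if val == current[-1][1]:
            current.append((idx, val))
        else:
            groups.append(current)
            current = [(idx, val)]
    groups.append(current)
    return " < ".join(" = ".join(f"X{i}" for i, _ in group) for group in groups)

print(order_pattern([2.5, 0.3, 2.5, 7.1]))  # X2 < X1 = X3 < X4
```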

Interactive epistemology: a reminder. States, partitions, knowledge, common knowledge, common prior, posteriors. [Figure: an example with a common prior 1/10, 2/10, 3/10, 1/10, 2/10, 1/10 over six states and posteriors 0, 2/3, 1/3.]
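As a reminder of how these objects fit together, here is a minimal sketch; the state space, prior, partition, and event below are invented for illustration and are not the ones in the slide's figure. The posterior of an event at a state is the prior probability of the event within the partition cell containing that state.

```python
# Invented example: six states, a common prior, one agent's partition, an event E.
prior = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.1, 4: 0.2, 5: 0.1}
partition = [{0, 1, 2}, {3, 4, 5}]  # the agent cannot distinguish states within a cell
E = {1, 2, 4}

def posterior(state):
    """P(E | the partition cell containing the state)."""
    cell = next(c for c in partition if state in c)
    p_cell = sum(prior[s] for s in cell)
    return sum(prior[s] for s in cell & E) / p_cell

for s in sorted(prior):
    print(s, round(posterior(s), 3))
```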

Agreeing to disagree, that is, having common knowledge of different posteriors for an event, is impossible (Aumann, 1976). Is agreeing to agree possible? (E. Lehrer and D. Samet)

Each point of the plane (1’s profits on one axis, 2’s profits on the other) is a state that specifies the profits of the two firms. Each firm knows only its own profit. 1’s partition: the vertical lines. 2’s partition: the horizontal lines.

For one event E there is no common prior for which the agents agree to agree on nontrivial posteriors for E. For the event E shown here there is one: P, a common prior symmetric w.r.t. the axes. The posteriors of E in each state are 1/2, so it is common knowledge that the posteriors coincide.
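A toy discrete sketch of the point; the 2 × 2 grid, the uniform prior, and the "diagonal" event are invented stand-ins for the slide's continuous example. A prior that is symmetric in the relevant sense makes every posterior of E equal to 1/2 for both firms, so it is common knowledge that the posteriors coincide.

```python
from itertools import product

# States are pairs (profit of firm 1, profit of firm 2) on a 2 x 2 grid.
states = list(product(range(2), repeat=2))
prior = {s: 0.25 for s in states}        # uniform, symmetric prior
E = {(0, 0), (1, 1)}                     # a "diagonal" event

def posterior(state, firm):
    """Posterior of E for a firm that observes only its own profit
    (firm 0 knows the first coordinate, firm 1 the second)."""
    cell = [s for s in states if s[firm] == state[firm]]
    p_cell = sum(prior[s] for s in cell)
    return sum(prior[s] for s in cell if s in E) / p_cell

for s in states:
    print(s, posterior(s, firm=0), posterior(s, firm=1))  # always 0.5 and 0.5
```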

At each state the agents are ignorant of E: they cannot tell whether or not E has occurred. F is a 4-state event. When F is added to the agents’ information, they are still ignorant of E.

At each state the agents are ignorant of E: they cannot tell whether or not E has occurred. There is a finite event F which, after being added to the agents’ information, leaves them still ignorant of E.

At each state the agents are ignorant of E: they cannot tell whether or not E has occurred. [Figure: at the state in F where 1’s profit is maximal, 1 cannot tell not-E and 2 cannot tell E.]

For countable partitions, it is possible to agree to agree on nontrivial posteriors for E if and only if there exists a finite event F which, after being added to the agents’ information, leaves them still ignorant of E.
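The two notions in the theorem can be made operational for finite toy models; the sets below are invented, and the slides' own examples live on infinite state spaces, where the finiteness of F is the whole point. "Adding F to the agents' information" means refining each partition by F, and an agent is "ignorant of E" at every state when no cell of its (refined) partition decides E.

```python
def refine(partition, F):
    """Refine a partition by an event F: each cell splits into the part
    inside F and the part outside F."""
    refined = []
    for cell in partition:
        for piece in (cell & F, cell - F):
            if piece:
                refined.append(piece)
    return refined

def ignorant_of_E_everywhere(partition, E):
    """True iff at every state the agent cannot tell whether or not E:
    every cell meets both E and its complement."""
    return all(cell & E and cell - E for cell in partition)

# Invented toy data: six states, an event E, one agent's partition, a finite F.
E = {0, 2, 4}
partition = [{0, 1}, {2, 3}, {4, 5}]
F = {0, 1, 2, 3}

print(ignorant_of_E_everywhere(partition, E))             # True
print(ignorant_of_E_everywhere(refine(partition, F), E))  # still True
```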

The state space: the four thick lines in the four unit squares of the (1’s profit, 2’s profit) plane, one of the lines shift-rotated to the left by an irrational distance. The prior: the uniform distribution. The posteriors of E: 1/2 in each state. It is possible to agree to agree on nontrivial posteriors of E, yet there is no finite F as required.