
1 Replicator Dynamics

2 Nash makes sense if players are… rational and calculating

3 Such as Auctions…

4 Or Oligopolies…

5 But why would Nash matter for our puzzles?

6 Norms/rights/morality are not chosen after careful consideration at the time they influence our behavior:
We believe we have rights
We feel angry when someone uses a service but doesn’t pay
We kinda like the mug the experimenter gave us
Rather, they are adopted beforehand via:
prestige-biased imitation
reinforcement learning
evolution
Do these dynamic processes lead to Nash?

7 Yes. We’ll confirm this by modeling prestige-biased learning and evolution in games using the canonical model: the replicator dynamic

8 We consider a large population, n, of players. Each period, a player is randomly matched with another player and they play a two-player game

9 Each player is assigned a strategy. Players cannot choose their strategies

10 We can think of this in a few ways, e.g.:
Players are “born with” their mother’s strategy (ignore sexual reproduction)
Players “imitate” others’ strategies
Note that rationality and conscious deliberation don’t enter the picture

11 Suppose there are two strategies, A and B. We start with: some number, n_A, of players assigned strategy A and some number, n_B, of players assigned strategy B

12 We denote the proportion of the population playing strategy A as x_A, so: x_A = n_A / (n_A + n_B)

13 The state of the population is given by x = (x_A, x_B), where x_A ≥ 0, x_B ≥ 0, and x_A + x_B = 1

14 Since players interact with another randomly chosen player in the population, a player’s expected payoff is determined by the payoff matrix and the proportion of each strategy in the population (we could use actual payoffs, but by the law of large numbers and linearity of payoffs, it won’t affect anything)

15 For example, consider the coordination game below, with some given starting frequencies x_A and x_B

16 The payoff for a player who is playing A is f_A = a·x_A + b·x_B. Since f_A depends on x_A and x_B, we write f_A(x)
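As a sketch, the fitness of a strategy is just its payoff-matrix row weighted by the population frequencies. The payoff values a=2, b=0, c=0, d=1 below are assumed example values for illustration (they satisfy the a > c, b < d pattern of the coordination game in these slides).

```python
# Expected payoff (fitness) of a strategy against a randomly drawn opponent:
# the strategy's payoff-matrix row weighted by the population frequencies.
# Payoffs a=2, b=0, c=0, d=1 are assumed example values.

def fitness(row, x):
    """Expected payoff of a strategy with payoff row `row`
    against a population with strategy frequencies `x`."""
    return sum(payoff * freq for payoff, freq in zip(row, x))

A_ROW = [2.0, 0.0]        # A's payoffs vs (A, B)
B_ROW = [0.0, 1.0]        # B's payoffs vs (A, B)

x = [0.5, 0.5]            # half the population plays each strategy
f_a = fitness(A_ROW, x)   # 2*0.5 + 0*0.5 = 1.0
f_b = fitness(B_ROW, x)   # 0*0.5 + 1*0.5 = 0.5
```

Note that f_A is linear in the frequencies, which is why expected payoffs suffice in place of realized ones.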

17 We interpret payoffs as rate of reproduction (fitness)

18 The average fitness, f, of the population is the weighted average of the two fitness values: f = x_A·f_A + x_B·f_B

19 How fast do x_A and x_B grow? Recall x_A = n_A / n. First, we need to know how fast n_A grows. Each individual playing A reproduces at a rate f_A, and there are n_A of them. So: dn_A/dt = n_A·f_A. Next we need to know how fast n grows. By the same logic: dn/dt = n·f. By the quotient rule, and with a little simplification…
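Spelling out the quotient-rule step (reconstructed from the definitions above, writing f for the average fitness):

```latex
\dot{x}_A
  = \frac{d}{dt}\!\left(\frac{n_A}{n}\right)
  = \frac{\dot{n}_A\, n - n_A\, \dot{n}}{n^2}
  = \frac{(n_A f_A)\, n - n_A\, (n f)}{n^2}
  = \frac{n_A}{n}\,\bigl(f_A - f\bigr)
  = x_A\,\bigl(f_A - f\bigr)
```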

20 This is the replicator equation: dx_A/dt = x_A (f_A − f)
dx_A/dt: growth rate of A
x_A: current frequency of strategy A
(f_A − f): own fitness relative to the average

21 This is the replicator equation: dx_A/dt = x_A (f_A − f)
x_A: current frequency of strategy A, because that’s how many As can reproduce
(f_A − f): own fitness relative to the average. This is our key property: more successful strategies grow faster

22 If:
x_A > 0: the proportion of As is non-zero
f_A > f: the fitness of A is above average
Then:
dx_A/dt > 0: A will be increasing in the population

23 We are interested in states where dx_A/dt = 0, since these are places where the proportions of A and B are stationary. We will call such points steady states

24 The steady states are such that x_A = 0, x_A = 1, or f_A = f
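These three cases can be checked numerically: the right-hand side x_A (f_A − f) vanishes at each of them. The payoffs a=2, b=0, c=0, d=1 below are the same assumed example values as before, for which the interior point with f_A = f sits at x_A = 1/3.

```python
# Check the steady states of the replicator equation for assumed
# example payoffs a=2, b=0, c=0, d=1 (illustrative values only).

def rhs(x_a, a=2.0, b=0.0, c=0.0, d=1.0):
    """Right-hand side x_A * (f_A - f) of the replicator equation."""
    x_b = 1.0 - x_a
    f_a = a * x_a + b * x_b
    f_b = c * x_a + d * x_b
    f_bar = x_a * f_a + x_b * f_b
    return x_a * (f_a - f_bar)

# x_A = 0, x_A = 1, and the interior point with f_A = f (1/3 here)
# are all (up to rounding) stationary.
for steady in (0.0, 1.0, 1.0 / 3.0):
    assert abs(rhs(steady)) < 1e-12
```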

25 Recall the payoffs of our (coordination) game:

       A    B
  A    a    b
  B    c    d

with a > c and b < d

26 “Asymptotically stable” steady states = steady states such that the dynamics point toward them

27 What were the pure Nash equilibria of the coordination game?

28
       A       B
  A   a, a    b, c
  B   c, b    d, d

29 Replicator predicts we’ll end up at one of these

30 And the mixed strategy equilibrium is: play A with probability p = (d − b) / ((a − c) + (d − b))
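A quick check of the indifference condition behind that formula, again using the assumed example payoffs a=2, b=0, c=0, d=1: at the mixed point both strategies earn the same expected payoff.

```python
# Mixed equilibrium of a 2x2 coordination game:
# play A with probability p = (d - b) / ((a - c) + (d - b)).
# Payoffs a=2, b=0, c=0, d=1 are assumed example values.

def mixed_equilibrium(a=2.0, b=0.0, c=0.0, d=1.0):
    return (d - b) / ((a - c) + (d - b))

p = mixed_equilibrium()           # 1/3 for these payoffs
f_a = 2.0 * p + 0.0 * (1 - p)     # A's expected payoff at the mixed point
f_b = 0.0 * p + 1.0 * (1 - p)     # B's expected payoff at the mixed point
assert abs(f_a - f_b) < 1e-12     # both strategies do equally well
```

Note this is the same interior point where f_A = f in the replicator dynamic, which is why the mixed equilibrium shows up as the (unstable) interior steady state.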


32 Replicator teaches us:
We end up at Nash (…if we end)
AND not just any Nash (e.g., not mixed Nash in coordination)

33 Let’s generalize this to three strategies:

34 Now… n_i is the number playing strategy i

35 Now… x_i = n_i / n is the proportion playing strategy i

36 The state of the population is x = (x_1, x_2, x_3), where x_1 ≥ 0, x_2 ≥ 0, x_3 ≥ 0, and x_1 + x_2 + x_3 = 1

37 For example, consider the Rock-Paper-Scissors game (win = 1, lose = −1, tie = 0):

        R     P     S
  R     0    -1     1
  P     1     0    -1
  S    -1     1     0

with some given starting frequencies

38 The fitness for a player playing R is f_R = 0·x_R + (−1)·x_P + 1·x_S = x_S − x_P

39 In general, the fitness for players with strategy i is: f_i(x) = Σ_j x_j·π(i, j), where π(i, j) is i’s payoff against j

40 The average fitness, f, of the population is: f = x_R·f_R + x_P·f_P + x_S·f_S

41 Replicator is still: dx_i/dt = x_i (f_i − f)
x_i: current frequency of strategy i
(f_i − f): own fitness relative to the average


45 Notice the interior steady state is not asymptotically stable: the dynamics cycle around it
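The cycling can be seen by simulating the three-strategy replicator dynamic on the Rock-Paper-Scissors payoffs above. The starting frequencies (0.5, 0.3, 0.2) are an assumed example; any interior point off (1/3, 1/3, 1/3) behaves the same way.

```python
# Euler-integrate the three-strategy replicator dynamic on standard
# Rock-Paper-Scissors (win = 1, lose = -1, tie = 0) to see the cycling.

RPS = [
    [0.0, -1.0, 1.0],    # R's payoffs vs (R, P, S)
    [1.0, 0.0, -1.0],    # P's payoffs vs (R, P, S)
    [-1.0, 1.0, 0.0],    # S's payoffs vs (R, P, S)
]

def step(x, payoffs, dt):
    """One Euler step of dx_i/dt = x_i * (f_i - f)."""
    f = [sum(payoffs[i][j] * x[j] for j in range(3)) for i in range(3)]
    f_bar = sum(x[i] * f[i] for i in range(3))
    return [x[i] + dt * x[i] * (f[i] - f_bar) for i in range(3)]

x = [0.5, 0.3, 0.2]               # assumed starting frequencies
for _ in range(50_000):
    x = step(x, RPS, dt=0.001)    # integrate to t = 50

# The trajectory stays inside the simplex but never settles at (1/3, 1/3, 1/3).
assert abs(sum(x) - 1.0) < 1e-9
assert all(xi > 0.0 for xi in x)
assert max(abs(xi - 1.0 / 3.0) for xi in x) > 0.05
```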

46 Now modify the game so that winning pays 2 (losing still costs 1, ties pay 0):

        R     P     S
  R     0    -1     2
  P     2     0    -1
  S    -1     2     0


48 Note that the interior steady state is now asymptotically stable
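The same simulation on the modified payoffs (win = 2, lose = −1, both assumed from the matrix on slide 46) now spirals in to (1/3, 1/3, 1/3) instead of cycling:

```python
# Euler-integrate the replicator dynamic on the modified RPS game
# where a win pays 2 and a loss costs 1.  The interior steady state
# (1/3, 1/3, 1/3) is now asymptotically stable.

RPS_WIN2 = [
    [0.0, -1.0, 2.0],    # R's payoffs vs (R, P, S)
    [2.0, 0.0, -1.0],    # P's payoffs vs (R, P, S)
    [-1.0, 2.0, 0.0],    # S's payoffs vs (R, P, S)
]

def step(x, payoffs, dt):
    """One Euler step of dx_i/dt = x_i * (f_i - f)."""
    f = [sum(payoffs[i][j] * x[j] for j in range(3)) for i in range(3)]
    f_bar = sum(x[i] * f[i] for i in range(3))
    return [x[i] + dt * x[i] * (f[i] - f_bar) for i in range(3)]

x = [0.6, 0.3, 0.1]                  # assumed starting frequencies
for _ in range(50_000):
    x = step(x, RPS_WIN2, dt=0.01)   # integrate to t = 500

# The trajectory converges to the interior steady state.
assert all(abs(xi - 1.0 / 3.0) < 0.001 for xi in x)
```

The difference from the previous block is only the payoff matrix: when wins pay more than losses cost, the population's average fitness rises toward the interior, pulling trajectories inward.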

