An Introduction to Markov Chains


A "Drunkard's Walk": Homer and Marge repeatedly play a gambling game. Each time they play, the probability that Homer wins is 0.4, and the probability that Homer loses is 0.6. They have only $4 between them.

[State diagram: states 0, 1, 2, 3, 4; P(Homer wins) = 0.4, P(Homer loses) = 0.6]
Homer and Marge both start with $2.

A Markov Chain is a mathematical model for a process which moves step by step through various states. In a Markov chain, the probability that the process moves from any given state to any other particular state is always the same, regardless of the history of the process.

A Markov chain consists of states and transition probabilities. Each transition probability is the probability of moving from one state to another in one step. The transition probabilities are independent of the past, and depend only on the two states involved. The matrix of transition probabilities is called the transition matrix.

[State diagram repeated: states 0, 1, 2, 3, 4; P(Homer wins) = 0.4, P(Homer loses) = 0.6]
Homer and Marge both start with $2. The states are the amounts of money Homer can have.
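The transition matrix itself appears only as an image in the original slides, but it follows directly from these rules (rows are current states, columns are next states):

         $0   $1   $2   $3   $4
    $0 [ 1.0  0    0    0    0   ]   (absorbing: Homer is ruined)
    $1 [ 0.6  0    0.4  0    0   ]
    $2 [ 0    0.6  0    0.4  0   ]
    $3 [ 0    0    0.6  0    0.4 ]
    $4 [ 0    0    0    0    1.0 ]   (absorbing: Marge is ruined)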

If P is the transition matrix for a Markov Chain, then the nth power of P gives the probabilities of going from state to state in exactly n steps.

If the vector v represents the initial distribution over states, then the probabilities of winding up in the various states in exactly n steps are given by v times the nth power of P.
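As a quick illustration (my own sketch, not part of the slides; it assumes NumPy), here is this computation for the Homer-and-Marge chain:

```python
import numpy as np

# Transition matrix for the drunkard's walk: states are Homer's dollars, 0-4.
# States 0 (Homer ruined) and 4 (Marge ruined) are absorbing.
P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.6, 0.0, 0.4, 0.0, 0.0],
    [0.0, 0.6, 0.0, 0.4, 0.0],
    [0.0, 0.0, 0.6, 0.0, 0.4],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])

v = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # both start with $2: state 2

for n in (1, 2, 10, 100):
    print(n, v @ np.linalg.matrix_power(P, n))

# As n grows, the probability mass concentrates on the absorbing states:
# state 0 approaches 9/13 ~ 0.6923 (Homer ruined), state 4 approaches 4/13.
```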

When they both start with $2, the probability that Homer is ruined is 9/13. In general, if Homer starts with $x and Marge starts with $(N - x), and P(Homer wins) = p, P(Homer loses) = q, then for p ≠ q the probability that Homer is ruined is

P(ruin) = ((q/p)^x - (q/p)^N) / (1 - (q/p)^N).

(With x = 2, N = 4, p = 0.4, q = 0.6, this gives 9/13, matching the answer above.)

Suppose you bet on red in roulette: P(win) = 18/38 = 9/19 and P(lose) = 10/19. First suppose you and the house each have $10; now suppose you have $10 and the house has $20.

Now suppose you and the house each have $100.
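The computed values on these slides were images and did not survive in the transcript, but the ruin formula above reproduces them; a minimal sketch in plain Python:

```python
def ruin(x, N, p):
    """P(a gambler starting with x hits 0 before N), win probability p != 1/2."""
    r = (1 - p) / p  # q/p
    return (r**x - r**N) / (1 - r**N)

print(ruin(2, 4, 0.4))         # Homer vs. Marge: 9/13 ~ 0.692
print(ruin(10, 20, 9 / 19))    # you $10, house $10: ~ 0.74
print(ruin(10, 30, 9 / 19))    # you $10, house $20: ~ 0.92
print(ruin(100, 200, 9 / 19))  # you $100, house $100: ~ 0.99997
```

Even with equal bankrolls, the house's small edge makes your ruin nearly certain once the stakes are large.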

Andrei Markov (1856-1922).
Paul Ehrenfest, early 1900s: a diffusion model giving a statistical interpretation of the second law of thermodynamics (the entropy of a closed system can only increase); proposed the "urn model" to explain diffusion.
Albert Einstein, 1905: realized that Brownian motion would provide a magnifying glass into the world of the atom. Brownian motion has since been extensively modeled by Markov chains.

Osmosis: particles are separated by a semi-permeable membrane, which they can pass through in either direction. Suppose that there are N black particles inside the membrane and N white particles outside it. Each second, one randomly chosen molecule moves from outside the membrane to inside, and one from inside to outside. There are N + 1 states, given by the number of white molecules inside.
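The slides do not spell out the transition probabilities, but they follow from this description. If k white molecules are inside, the molecule leaving is white with probability k/N, and the molecule entering is white with probability (N - k)/N (the outside then holds N - k white molecules), so:

P(k -> k+1) = ((N - k)/N)^2   (a black leaves, a white enters)
P(k -> k-1) = (k/N)^2         (a white leaves, a black enters)
P(k -> k)   = 2k(N - k)/N^2   (the mix inside is unchanged)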

[State diagram: states 0 through 5 for N = 5 molecules]

[State diagram for general N: states 0 through N]

If this process runs for a while, an interesting question is: how much time, on average, does the process spend in each state? A Markov chain with transition matrix P is said to be regular if P^n has all positive entries for some n. Equivalently, in a regular Markov chain it is possible to get from any state to any other state in exactly n steps.

The Markov chain for our osmosis process is regular. Even starting with all black particles inside, if a white particle entered at every step, the process would pass from zero white molecules inside through all possible states.

For a regular Markov chain, the amount of time the process spends in each state is given by the fixed probability vector: the vector a such that aP = a. Moreover, for any probability vector w, wP^n → a as n → ∞. No matter what the starting state, if the process runs for a long time, the probability of being in a given state is given by a.

In the long run, the fraction of time the process spends in each state is given by the fixed probability vector.

For N particles, the entries of the fixed vector are proportional to the squares of the binomial coefficients C(N, k), so a_k = C(N, k)^2 / C(2N, N). The coefficients are the rows of Pascal's triangle:

1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1

Fixed vector for N = 4: (1/70, 16/70, 36/70, 16/70, 1/70).
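A numerical check (my own sketch; it assumes the transition probabilities derived earlier and NumPy), which also illustrates regularity and the convergence wP^n → a:

```python
import numpy as np
from math import comb

N = 4
S = N + 1  # states 0..N: the number of white molecules inside

# Build the transition matrix from the probabilities derived above.
P = np.zeros((S, S))
for k in range(S):
    up = ((N - k) / N) ** 2   # a black molecule leaves, a white one enters
    down = (k / N) ** 2       # a white molecule leaves, a black one enters
    if k < N:
        P[k, k + 1] = up
    if k > 0:
        P[k, k - 1] = down
    P[k, k] = 1.0 - up - down  # the mix inside is unchanged

# Regularity: some power of P has all positive entries (n = 4 works here).
print(np.all(np.linalg.matrix_power(P, 4) > 0))  # True

# The fixed vector a (aP = a): start anywhere and apply P many times.
a = (np.ones(S) / S) @ np.linalg.matrix_power(P, 200)
print(np.round(a * 70, 6))  # ~ [1, 16, 36, 16, 1]

# Compare with the closed form C(N,k)^2 / C(2N,N):
print([comb(N, k) ** 2 / comb(2 * N, N) for k in range(S)])
```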

Now suppose there are 1000 molecules (500 of each color). The fraction of the time that there are between 225 and 275 black molecules inside is about 0.999. The fraction of the time that there are either fewer than 100 or more than 400 black molecules inside is astronomically small, as the computation below shows.

If the universe is 15 billion years old, the average amount of time that a system with 1000 molecules will have fewer than 100 or more than 400 black molecules inside the membrane is an immeasurably small fraction of a second: effectively zero.
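The exact figures on these last slides did not survive the transcript, but the fixed vector a_k = C(N,k)^2 / C(2N,N) lets us recompute them; a sketch in plain Python (the one-transition-per-second timescale is taken from the slides):

```python
import math

N = 500  # 500 black and 500 white molecules: 1000 in total

def log_binom(n, k):
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def pi(k):
    # Stationary probability of k white (equivalently, black) molecules inside.
    return math.exp(2 * log_binom(N, k) - log_binom(2 * N, N))

p_mid = sum(pi(k) for k in range(225, 276))
p_tail = 2 * sum(pi(k) for k in range(100))  # < 100 or > 400, by symmetry

print(f"P(225 to 275 inside) ~ {p_mid:.4f}")   # ~ 0.999
print(f"P(tails)             ~ {p_tail:.2e}")  # on the order of 1e-86

# Expected time spent in the tails over 15 billion years, one step per second:
seconds = 15e9 * 365.25 * 24 * 3600
print(f"expected tail time   ~ {p_tail * seconds:.2e} seconds")  # far below any measurable instant
```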