
Markov Chains (Part 2)

More Examples and Chapman-Kolmogorov Equations

A Stock Price Stochastic Process

•  Consider a stock whose price either goes up or down every day. Let Xt be a random variable that is:
   –  0 if the stock price goes up on day t, and
   –  1 if the stock price goes down on day t.
•  The probability that the stock price goes up tomorrow, given it goes up today, is 0.7. If the stock goes down today, the probability that it goes up tomorrow is 0.5.
•  Does the stochastic process Xt possess the Markovian property?
•  What is the one-step transition probability matrix?


Stock behavior today   P(up tomorrow)   P(down tomorrow)
Up                     0.7              0.3
Down                   0.5              0.5
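As a quick numerical check, here is a minimal numpy sketch (an addition, not part of the original slides) of this one-step matrix, using the state coding 0 = up, 1 = down given above:

```python
import numpy as np

# One-step transition matrix for the stock chain.
# States: 0 = price goes up on day t, 1 = price goes down on day t.
P = np.array([[0.7, 0.3],   # up today:   P(up tomorrow)=0.7, P(down tomorrow)=0.3
              [0.5, 0.5]])  # down today: P(up tomorrow)=0.5, P(down tomorrow)=0.5

# Each row is a conditional probability distribution, so rows must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```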

A Stock Price Stochastic Process

•  Now, suppose the probability of whether the stock goes up or down tomorrow depends on the stock's behavior both today and yesterday.
•  Intuitively, can we define a stochastic process Xt that possesses the Markovian property?


Stock behavior yesterday   Stock behavior today   P(up tomorrow)
Up                         Up                     0.9
Down                       Up                     0.6
Up                         Down                   0.5
Down                       Down                   0.3

A Stock Price Stochastic Process

•  We can expand the state space to include a little bit of history, and create a Markov chain.
•  Let Xt be a random variable that has four states:
   –  0,0 if the stock price went up yesterday (day t-1) and up today (day t),
   –  1,0 if the stock price went down yesterday (day t-1) and up today (day t),
   –  0,1 if the stock price went up yesterday (day t-1) and down today (day t),
   –  1,1 if the stock price went down yesterday (day t-1) and down today (day t).
•  Intuitively, the stochastic process Xt now satisfies the Markovian property.


A Stock Price Stochastic Process

•  Now, the one-step transition probability matrix is 4x4:

From (t-1,t) \ To (t,t+1)   0,0 (up, up)   1,0 (down, up)   0,1 (up, down)   1,1 (down, down)
0,0 (up, up)                    0.9             0                0.1              0
1,0 (down, up)                  0.6             0                0.4              0
0,1 (up, down)                  0               0.5              0                0.5
1,1 (down, down)                0               0.3              0                0.7
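A minimal sketch of the expanded chain in numpy (again an added illustration, with the state order matching the table above). Each row has only two reachable states, because tomorrow's state must carry today's move as its "yesterday":

```python
import numpy as np

# Expanded states (yesterday, today):
# 0 = (up, up), 1 = (down, up), 2 = (up, down), 3 = (down, down)
P = np.array([
    [0.9, 0.0, 0.1, 0.0],  # from (up, up):     up w.p. 0.9 -> state 0, down w.p. 0.1 -> state 2
    [0.6, 0.0, 0.4, 0.0],  # from (down, up):   up w.p. 0.6 -> state 0, down w.p. 0.4 -> state 2
    [0.0, 0.5, 0.0, 0.5],  # from (up, down):   up w.p. 0.5 -> state 1, down w.p. 0.5 -> state 3
    [0.0, 0.3, 0.0, 0.7],  # from (down, down): up w.p. 0.3 -> state 1, down w.p. 0.7 -> state 3
])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution
```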

Multi-step Transition Probabilities

•  So far, we have only focused on one-step transition probabilities pij.
   –  But these don't directly provide answers to some interesting questions.
•  For example, if it is sunny today, what is the probability that it will be sunny the day after tomorrow?
•  If the stock went down today, what is the probability that it will go down three days later?
   –  These are called multi-step (or n-step) transition probabilities.
   –  In particular, we want to find P(Xt+n = j | Xt = i), which is denoted by pij(n).
•  The Chapman-Kolmogorov (C-K) equation is a formula to calculate n-step transition probabilities.

n-step Transition Probabilities

•  If the one-step transition probabilities are stationary, then the n-step transition probabilities are written P(Xt+n = j | Xt = i) = P(Xn = j | X0 = i) = pij(n) for all t.
•  Interpretation:

[Figure: a sample path of the process moving from state i at time t to state j at time t+n.]

Inventory Example: n-step Transition Probabilities

•  p12(3) = conditional probability that, starting with one camera, there will be two cameras after three weeks.
•  Four ways that could happen:

[Figure: the four sample paths from X0 = 1 to X3 = 2 over three weeks.]

Two-step Transition Probabilities for the Weather Example

•  Intuition: to go from state 0 to state 0 in two steps, we can either
   –  go from 0 to 0 in one step and then go from 0 to 0 in one step, OR
   –  go from 0 to 1 in one step and then go from 1 to 0 in one step.
•  Therefore, p00(2) = P(X2 = 0 | X0 = 0) = p00 p00 + p01 p10
•  In short,

   p00(2) = Σ_{k=0}^{1} p0k pk0

•  You just wrote down your first Chapman-Kolmogorov equation using intuition.
•  Now use the above intuition to write down the other 2-step transition probabilities p01(2), p10(2), p11(2).
•  These four two-step transition probabilities can be arranged in a matrix P(2), called the two-step transition matrix.
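A quick numerical check of this intuition (my sketch, not the slides'), plugging in the weather example's one-step probabilities used later in these notes (p00 = 0.5, p01 = 0.5, p10 = 0.2, p11 = 0.8):

```python
import numpy as np

# One-step probabilities (weather example: state 0 = sunny, 1 = rainy).
p00, p01 = 0.5, 0.5
p10, p11 = 0.2, 0.8
P = np.array([[p00, p01], [p10, p11]])

# p00(2) via the Chapman-Kolmogorov sum: go 0 -> 0 -> 0 or 0 -> 1 -> 0.
p00_2 = p00 * p00 + p01 * p10

# The same number is the (0, 0) entry of the matrix product P @ P.
assert np.isclose(p00_2, (P @ P)[0, 0])
print(p00_2)  # 0.35
```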

Two-step Transition Probabilities for the Weather Example

•  Interpretation: p01(2) is the probability that the weather the day after tomorrow will be rainy if the weather today is sunny.
•  An interesting observation: the two-step transition matrix is the square of the one-step transition matrix!!! That is, P(2) = P².
•  Why? Recall the matrix product, write down P², and confirm that it equals P(2) below:

   P(2) = [ p00(2)  p01(2) ]  =  [ p00 p00 + p01 p10   p00 p01 + p01 p11 ]
          [ p10(2)  p11(2) ]     [ p10 p00 + p11 p10   p10 p01 + p11 p11 ]

Two-step Transition Probabilities for General Markov Chains

•  For a general Markov chain with states 0, 1, …, M, to make a two-step transition from i to j, we go to some state k in one step from i and then go from k to j in one step. Therefore, the two-step transition probability matrix is P(2) = P², with

   pij(2) = Σ_{k=0}^{M} pik pkj

   P(2) = [ p00(2)   p01(2)   ...   p0M(2) ]
          [ p10(2)   p11(2)   ...   p1M(2) ]
          [   ...      ...            ...  ]
          [ pM0(2)   pM1(2)   ...   pMM(2) ]

n-step Transition Probabilities for General Markov Chains

•  For a general Markov chain with states 0, 1, …, M, an n-step transition from i to j means the process goes from i to j in n time steps.
•  Let m be a non-negative integer not bigger than n. The Chapman-Kolmogorov equation is:

   pij(n) = Σ_{k=0}^{M} pik(m) pkj(n-m)

•  Interpretation: if the process goes from state i to state j in n steps, then it must go from state i to some state k in m (less than n) steps, and then go from k to j in the remaining n-m steps.
•  In matrix notation, P(n) = P(m) P(n-m). This implies that the n-step transition matrix is the nth power of the one-step transition matrix (why? substitute m = 1 and see what happens!).
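A minimal numerical check of the C-K equation in matrix form (an added sketch, using the inventory example's one-step matrix that appears later in these slides):

```python
import numpy as np

# One-step matrix of the inventory (camera) example.
P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

# Chapman-Kolmogorov in matrix form: P(n) = P(m) P(n-m) for any 0 <= m <= n.
n, m = 5, 2
Pn = np.linalg.matrix_power(P, n)
Pm_Pnm = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n - m)
assert np.allclose(Pn, Pm_Pnm)
```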

Chapman-Kolmogorov Equations

•  Recall the C-K equation:

   pij(n) = Σ_{k=0}^{M} pik(m) pkj(n-m)   for all i, j, n and 0 ≤ m ≤ n

•  Consider the case when m = 1:

   pij(n) = Σ_{k=0}^{M} pik pkj(n-1),   i.e., P(n) = P · P(n-1)

[Figure: a path from state i at time 0 to state j at time n, passing through some state k after the first step; the first step contributes pik and the remaining n-1 steps contribute pkj(n-1).]

Chapman-Kolmogorov Equations

•  The pij(n) are the elements of the n-step transition matrix, P(n)
•  Note, though, that

   P(n) = P · P(n-1)
        = P · P · P(n-2)
        ...
        = P · P · ... · P = P^n

How to use C-K Equations

•  To answer the following question: what is the probability that, starting in state i, the Markov chain will be in state j after n steps?
   –  First, write down the one-step transition probability matrix.
   –  Then use your calculator to compute the nth power of this one-step transition probability matrix.
   –  Finally, write down the (i, j)th entry of this nth power matrix (see the code sketch below).

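The same recipe in code, as a minimal sketch (the helper n_step_probability is my name, not from the slides), using the weather matrix as reconstructed in the next slide:

```python
import numpy as np

def n_step_probability(P, i, j, n):
    """Return P(X_n = j | X_0 = i): the (i, j) entry of the n-th power of P."""
    return np.linalg.matrix_power(P, n)[i, j]

# Weather example: 0 = sunny, 1 = rainy.
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])
print(n_step_probability(P, 0, 0, 2))  # 0.35 with this matrix: sunny again in two days
```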

Weather Example: n-step Transitions

Two-step transition probability matrix (rows and columns ordered Sunny, Rainy):

   P(2) = [ 0.5  0.5 ]²  =  [ 0.35  0.65 ]
          [ 0.2  0.8 ]      [ 0.26  0.74 ]

Inventory Example: n-step Transitions

Two-step transition probability matrix:

   P(2) = [ 0.080  0.184  0.368  0.368 ]²  =  [ 0.249  0.286  0.300  0.165 ]
          [ 0.632  0.368  0      0     ]      [ 0.283  0.252  0.233  0.233 ]
          [ 0.264  0.368  0.368  0     ]      [ 0.351  0.319  0.233  0.097 ]
          [ 0.080  0.184  0.368  0.368 ]      [ 0.249  0.286  0.300  0.165 ]

Note: even though p12 = 0, p12(2) > 0.

Inventory Example: n-step Transitions

p13(2) = probability that the inventory goes from 1 camera to 3 cameras in two weeks = 0.233 (note: even though p13 = 0)

Question: assuming the store starts with 3 cameras, find the probability there will be 0 cameras in 2 weeks.

   p30(2) = 0.249

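Both entries can be read off P² directly; a minimal numpy check (my addition) of the two numbers:

```python
import numpy as np

P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

P2 = np.linalg.matrix_power(P, 2)
print(round(P2[1, 3], 3))  # p13(2) = 0.233, even though p13 = 0
print(round(P2[3, 0], 3))  # p30(2) = 0.249
```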

(Unconditional) Probability in state j at time n

•  The transition probability pij(n) is a conditional probability, P(Xn = j | X0 = i).
•  How do we "un-condition" the probabilities?
•  That is, how do we find the (unconditional) probability of being in state j at time n, P(Xn = j)?
•  The probabilities P(X0 = i) define the initial state distribution:

   P(Xn = j) = Σ_{i=0}^{M} P(Xn = j | X0 = i) P(X0 = i) = Σ_{i=0}^{M} pij(n) P(X0 = i)

Inventory Example: Unconditional Probabilities

•  If initial conditions were unknown, we might assume it's equally likely to be in any initial state: P(X0=0) = P(X0=1) = P(X0=2) = P(X0=3) = ¼
•  Then, what is the probability that we order (any) cameras in two weeks?

   P(order in 2 weeks) = P(in state 0 at time 2)
                       = P(X0=0) p00(2) + P(X0=1) p10(2) + P(X0=2) p20(2) + P(X0=3) p30(2)
                       = ¼(0.249) + ¼(0.283) + ¼(0.351) + ¼(0.249)
                       = 0.283

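In code, un-conditioning is just a row vector times a matrix power; a minimal sketch (my addition) reproducing the 0.283:

```python
import numpy as np

P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

q0 = np.array([0.25, 0.25, 0.25, 0.25])  # uniform initial state distribution
q2 = q0 @ np.linalg.matrix_power(P, 2)   # distribution of X_2
print(round(q2[0], 3))                   # P(X_2 = 0) = 0.283 -> an order is placed
```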

Steady-State Probabilities

•  As n gets large, what happens? What is the probability of being in any state? (e.g., in the inventory example, what happens as more and more weeks go by?)
•  Consider the 8-step transition probabilities for the inventory example:

   P(8) = P^8 = [ 0.286  0.285  0.264  0.166 ]
                [ 0.286  0.285  0.264  0.166 ]
                [ 0.286  0.285  0.264  0.166 ]
                [ 0.286  0.285  0.264  0.166 ]

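A minimal sketch (my addition) that reproduces this convergence by raising the inventory matrix to the 8th power:

```python
import numpy as np

P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

# By the 8th power the rows are (nearly) identical: the probability of being
# in state j no longer depends on the starting state.
print(np.round(np.linalg.matrix_power(P, 8), 3))
```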

Steady-State Probabilities

•  In the long run (e.g., after 8 or more weeks), the probability of being in state j is

   lim_{n→∞} pij(n) = πj

•  These probabilities are called the steady-state probabilities.
•  Another interpretation is that πj is the fraction of time the process is in state j (in the long run).
•  This limit exists for any "irreducible ergodic" Markov chain. (Next, we will define these terms, then return to steady-state probabilities.)

)(lim