Markov Chains
{Xn : n = 0, 1, 2, ...} is a discrete-time stochastic process.
If Xn = i, the process is said to be in state i at time n.
{i : i = 0, 1, 2, ...} is the state space.
If P(Xn+1 = j | Xn = i, Xn-1 = in-1, ..., X0 = i0) = P(Xn+1 = j | Xn = i) = Pij, the process is said to be a discrete-time Markov chain (DTMC): the next state depends on the history only through the current state.
Pij is the transition probability from state i to state j.
The transition probabilities Pij are collected in the transition matrix P:

        | P00  P01  P02  ... |
    P = | P10  P11  P12  ... |
        | P20  P21  P22  ... |
        |  :    :    :       |

with Pij ≥ 0 for all i, j ≥ 0, and Σ_j Pij = 1 for each i = 0, 1, ... (each row is a probability distribution over the next state).

P: transition matrix
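As a concrete illustration, a DTMC can be simulated directly from its transition matrix by sampling each next state from the current state's row. A minimal sketch (the function name `simulate_dtmc` and the list-of-rows representation are choices made here, not part of the slides):

```python
import random

def simulate_dtmc(P, start, n_steps, rng=random):
    """Simulate n_steps transitions of a discrete-time Markov chain.

    P is the transition matrix as a list of rows (each row sums to 1);
    returns the path [X0, X1, ..., X_{n_steps}] starting from state `start`.
    """
    path = [start]
    state = start
    for _ in range(n_steps):
        u = rng.random()             # uniform draw in [0, 1)
        cum = 0.0
        row = P[state]
        for j, p in enumerate(row):  # inverse-CDF sampling over the row
            cum += p
            if u < cum:
                state = j
                break
        else:
            state = len(row) - 1     # guard against rounding in the row sum
        path.append(state)
    return path
```

For the two-state rain chain of Example 1, `simulate_dtmc([[0.7, 0.3], [0.4, 0.6]], 0, 10)` produces a random 10-step path of rain/no-rain states.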
Example 1: The probability that it will rain tomorrow depends only on whether it rains today or not:

P(rain tomorrow | rain today) = α
P(rain tomorrow | no rain today) = β

State 0 = rain, State 1 = no rain

    P = | α   1-α |
        | β   1-β |
Example 4: A gambler wins $1 with probability p and loses $1 with probability 1-p. She starts with $N and quits if she reaches either $M or $0. Xn is the amount of money the gambler has after playing n rounds.

P(Xn = i+1 | Xn-1 = i, Xn-2 = in-2, ..., X0 = N) = P(Xn = i+1 | Xn-1 = i) = p      (i ≠ 0, M)
P(Xn = i-1 | Xn-1 = i, Xn-2 = in-2, ..., X0 = N) = P(Xn = i-1 | Xn-1 = i) = 1-p    (i ≠ 0, M)

So the transition probabilities are
Pi,i+1 = P(Xn = i+1 | Xn-1 = i) = p for i ≠ 0, M
Pi,i-1 = P(Xn = i-1 | Xn-1 = i) = 1-p for i ≠ 0, M
P0,0 = 1; PM,M = 1 (0 and M are called absorbing states)
Pi,j = 0, otherwise
Random walk: A Markov chain whose state space is the set of integers 0, ±1, ±2, ..., and for which Pi,i+1 = p = 1 - Pi,i-1 for all i, where 0 < p < 1, is said to be a random walk.
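Example 4 can be checked by direct simulation: play the chain from fortune N until it hits one of the absorbing states 0 or M. A sketch under the same setup (the function name `gamblers_ruin` is mine):

```python
import random

def gamblers_ruin(N, M, p, rng=random):
    """Play the gambler's ruin chain from fortune N (0 < N < M) until
    it is absorbed at 0 or M; return the absorbing state reached."""
    i = N
    while i != 0 and i != M:
        i += 1 if rng.random() < p else -1  # win $1 w.p. p, lose $1 w.p. 1-p
    return i
```

For a fair game (p = 1/2), the probability of reaching $M before $0 starting from $N is known to be N/M, so a Monte Carlo estimate over many runs should approximate that value.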
Chapman-Kolmogorov Equations

The n-step transition probabilities are defined by

    P^n_ij = P(X_{n+m} = j | X_m = i),   n ≥ 0, i, j ≥ 0

so that P^1_ij = Pij. The Chapman-Kolmogorov equations state that

    P^{n+m}_ij = Σ_k P^n_ik P^m_kj   for all n, m ≥ 0 and all i, j ≥ 0.
Proof:

    P^{n+m}_ij = P(X_{n+m} = j | X_0 = i)
               = Σ_k P(X_{n+m} = j, X_n = k | X_0 = i)
               = Σ_k P(X_{n+m} = j | X_n = k, X_0 = i) P(X_n = k | X_0 = i)
               = Σ_k P(X_{n+m} = j | X_n = k) P(X_n = k | X_0 = i)      (Markov property)
               = Σ_k P^m_kj P^n_ik = Σ_k P^n_ik P^m_kj
Let P^(n) = [P^n_ij] denote the matrix of n-step transition probabilities. In matrix form the Chapman-Kolmogorov equations read

    P^(n+m) = P^(n) × P^(m)

(Note: if A = [a_ij] and B = [b_ij] are M×M matrices, then A × B = [Σ_{k=1}^{M} a_ik b_kj], the ordinary matrix product.)
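The matrix form suggests computing P^(n) by repeated multiplication, and the identity P^(n+m) = P^(n) × P^(m) can then be checked numerically. A plain-Python sketch (the helper names `mat_mul` and `n_step` are mine):

```python
def mat_mul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition matrix P^(n): P multiplied by itself n times."""
    size = len(P)
    result = [[1.0 if i == j else 0.0 for j in range(size)]  # identity = P^(0)
              for i in range(size)]
    for _ in range(n):
        result = mat_mul(result, P)
    return result
```

On the rain matrix of Example 1, `n_step(P, 4)[0][0]` reproduces the four-day value 0.5749 computed in that example, and `mat_mul(n_step(P, 2), n_step(P, 2))` agrees with `n_step(P, 4)` as Chapman-Kolmogorov requires.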
Example 1 (continued): The probability it will rain tomorrow depends only on whether it rains today or not. What is the probability that it will rain four days from today, given that it is raining today? Let α = 0.7 and β = 0.4.

State 0 = rain, State 1 = no rain. We want P^4_00.

    P = | 0.7  0.3 |
        | 0.4  0.6 |

    P^(2) = P × P = | 0.7  0.3 | × | 0.7  0.3 | = | 0.61  0.39 |
                    | 0.4  0.6 |   | 0.4  0.6 |   | 0.52  0.48 |

    P^(4) = P^(2) × P^(2) = | 0.61  0.39 | × | 0.61  0.39 | = | 0.5749  0.4251 |
                            | 0.52  0.48 |   | 0.52  0.48 |   | 0.5668  0.4332 |

So P^4_00 = 0.5749.
Unconditional probabilities

How do we calculate P(X_n = j)?
Let α_i = P(X_0 = i), i ≥ 0, be the initial distribution (Σ_i α_i = 1). Conditioning on X_0:

    P(X_n = j) = Σ_i P(X_n = j | X_0 = i) P(X_0 = i) = Σ_i P^n_ij α_i
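Numerically, P(X_n = j) is just the row vector (α_i) propagated n times through P. A sketch (the function name `step_distribution` and the 0.4/0.6 initial distribution used in the example below are assumptions, not from the slides):

```python
def step_distribution(alpha, P, n):
    """Distribution of X_n given the initial distribution alpha_i = P(X_0 = i):
    propagate the row vector alpha through the transition matrix P n times."""
    dist = list(alpha)
    for _ in range(n):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist
```

With the assumed α = (0.4, 0.6) and the rain matrix of Example 1, the distribution after four days works out to Σ_i α_i P^4_i0 = 0.4 · 0.5749 + 0.6 · 0.5668 = 0.57004 for state 0.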
Classification of States

State j is accessible from state i if P^n_ij > 0 for some n ≥ 0.
Two states that are accessible to each other are said to communicate (written i ↔ j).
Any state communicates with itself, since P^0_ii = P(X_0 = i | X_0 = i) = 1.
Properties

1. State i communicates with state i, for all i ≥ 0.
2. If state i communicates with state j, then state j communicates with state i.
3. If state i communicates with state j, and state j communicates with state k, then state i communicates with state k.

Proof of transitivity: if i communicates with j and j communicates with k, then there exist some n and m for which P^n_ij > 0 and P^m_jk > 0. Then

    P^{n+m}_ik = Σ_r P^n_ir P^m_rk ≥ P^n_ij P^m_jk > 0,

so k is accessible from i; accessibility of i from k follows by the symmetric argument.
Classification of States (continued)

Two states that communicate are said to belong to the same class.
Two classes are either identical or disjoint (have no communicating states).
A Markov chain is said to be irreducible if it has only one class (all states communicate with each other).
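Accessibility and communication can be computed mechanically from the one-step matrix: j is accessible from i exactly when j is reachable from i in the directed graph that has an edge i → j whenever Pij > 0. A sketch for finite chains (the function names are mine):

```python
from collections import deque

def reachable(P, i):
    """States j accessible from i (P^n_ij > 0 for some n >= 0): breadth-first
    search on the directed graph with an edge s -> j whenever P[s][j] > 0."""
    seen = {i}                      # n = 0: every state is accessible from itself
    queue = deque([i])
    while queue:
        s = queue.popleft()
        for j, p in enumerate(P[s]):
            if p > 0 and j not in seen:
                seen.add(j)
                queue.append(j)
    return seen

def communicating_class(P, i):
    """The class of state i: all j with i <-> j."""
    return {j for j in reachable(P, i) if i in reachable(P, j)}

def is_irreducible(P):
    """A chain is irreducible iff every state is accessible from every state."""
    states = set(range(len(P)))
    return all(reachable(P, i) == states for i in states)
```

For the gambler's ruin chain of Example 4 (with 0 < p < 1), this yields the classes {0}, {1, ..., M-1}, and {M}, so that chain is not irreducible, while the two-state rain chain of Example 1 is irreducible.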