State Space Models and Hidden Markov Models
Mladen Kolar and Rob McCulloch
1. Sequential Data
2. Markov Models
3. State Space Models and Hidden Markov Models
4. Learning HMM from fully observable data
5. Learning parameters with hidden states
6. Possible inference tasks in an HMM
7. The Dishonest Casino from the HMM R-package
8. Stochastic Volatility as an HMM
9. Summary
1. Sequential Data
The underlying assumption we use for our basic learning models (linear regression with regularization, trees, neural nets) is that the rows of our data are exchangeable.

That is, we can permute the rows and the data means the same thing. Our basic example of this is when we have an iid sample from a finite population.

This assumption underlies our usual out-of-sample strategies (e.g. cross validation) for tuning models.
Often, our data has a meaningful ordering.

The most obvious example is time series data, where observations are taken over time:

I daily temperatures
I weekly returns on a financial asset.
I monthly returns on a financial asset.

Can we find a pattern over time ...???

But we could look at many kinds of “orderings”:

I character order within words
I type of word (noun, verb, ...) within sentences.

If the last word was a pronoun, is the next word likely to be a verb ...???
Handwritten character recognition
Structured prediction
Sequential data
I time-series data (speech)
I characters in a sentence
I base pairs along a DNA strand
2. Markov Models
Suppose we observe data
X = (X1,X2, . . . ,Xn)
where the order matters.
X1 is the first one. X2 is the second one.

Then it makes sense to think about the joint distribution of X in terms of the conditionals
P(Xi | X1,X2, . . . ,Xi−1)
We can always write the joint as
P(X1, X2, . . . , Xn) = P(X1) ∏_{i=2}^{n} P(Xi | X1, X2, . . . , Xi−1)

I we get the first one
I and then each subsequent X depends on the past.
Markov models make the simplifying assumption that
P(Xi | X1,X2, . . . ,Xi−1) = P(Xi | Xi−1)
The next one only depends on the past through the current one.
Markov Model
Joint distribution of n arbitrary random variables
P(X1,X2, . . . ,Xn) = P(X1) · P(X2 | X1) · . . . · P(Xn | Xn−1)
= P(X1) ∏_{i=2}^{n} P(Xi | Xi−1)
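As a concrete sketch, the Markov factorization can be evaluated numerically. This is a minimal Python example; the two-state transition matrix and initial distribution are made-up illustrative numbers, not from the slides:

```python
import numpy as np

# Illustrative two-state Markov chain (states 0 and 1).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])   # P[a, b] = P(X_i = b | X_{i-1} = a)
pi = np.array([0.5, 0.5])    # P(X_1)

def joint_prob(xs):
    """P(X_1, ..., X_n) = P(X_1) * prod over i of P(X_i | X_{i-1})."""
    p = pi[xs[0]]
    for prev, cur in zip(xs, xs[1:]):
        p *= P[prev, cur]
    return p

print(joint_prob([0, 0, 1]))  # 0.5 * 0.9 * 0.1 = 0.045
```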
The AR(1) and Random Walk Models
The simplest time series model used in statistics is the AR(1) model.
Yt = β0 + β1Yt−1 + εt
where the εt are iid.
The εt are the part of Yt unpredictable from the past.
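A minimal simulation of an AR(1), assuming illustrative values for β0 and β1 and Gaussian errors (none of these specifics are from the slides):

```python
import numpy as np

# Simulate Y_t = b0 + b1 * Y_{t-1} + eps_t with eps_t iid N(0, 1).
rng = np.random.default_rng(0)
b0, b1, n = 1.0, 0.8, 500
y = np.empty(n)
y[0] = b0 / (1 - b1)            # start at the stationary mean
for t in range(1, n):
    y[t] = b0 + b1 * y[t - 1] + rng.normal()

print(y.mean())  # should be near the stationary mean b0/(1-b1) = 5
```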
The classic special case of the AR(1) model is the random walk model:
Yt = β0 + Yt−1 + εt
The next Y is the previous Y plus a random increment β0 + εt .
Example
[Figure: a simulated random walk of length 100.]
A lot of financial price data looks like a random walk. See A Random Walk Down Wall Street by Burton Malkiel.
Note
The random walk model gives us nonstationary data.

For example, there is no “mean level”.

For this kind of data it is common to difference the data:

Yt − Yt−1 = β0 + εt

The differences have simpler stochastic properties.
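Differencing can be checked numerically: simulating a random walk with drift and differencing it recovers the iid increments β0 + εt. A Python sketch with illustrative numbers:

```python
import numpy as np

# Random walk with drift, then first differences.
rng = np.random.default_rng(42)
b0, n = 0.1, 1000
eps = rng.normal(size=n)
y = np.cumsum(b0 + eps)   # Y_t = Y_{t-1} + b0 + eps_t
dy = np.diff(y)           # Y_t - Y_{t-1} = b0 + eps_t, iid again
print(dy.mean())          # near the drift b0 = 0.1
```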
3. State Space Models and Hidden Markov Models
State space models think of the system we are observing as having a state which evolves in Markov fashion and, at each time, an observation which depends on the state.

The classic example: the state is the position of a rocket, and the observation is a noisy measurement of the position.

In our characters-within-words example, the state is the actual character and the observation is the character image, the digitized written character.
Understanding the HMM Semantics
P(O1 | X1 = x1): probability of an image given the letter is x1.
P(X2 = x2 | X1 = x1): probability that letter x2 will follow letter x1.
The decision about X2 is influenced by all letters.

In our character example:
Xi is the character, which is the state.
Oi is the image of the written character, which is the observation.
In general, we specify a state space model with three components:
The state evolution:
P(Xi | Xi−1)
The observation distribution:
P(Oi | Xi )
To get things rolling we also need the distribution of the initial state:

The initial state:
P(X1)
In our character example we have:

[Figure: the initial, transition, and observation distributions for the character example.]
From the three key distributions:
I initial state distribution
I Markov state evolution
I Observation distribution
We can express the full joint distribution of all the states and all the observations.
HMMs semantics: Joint distribution
P(X1, . . . , Xn, O1, . . . , On) = P(X1) · P(O1 | X1) · ∏_{i=2}^{n} P(Xi | Xi−1) · P(Oi | Xi)

P(X1, . . . , Xn | o1, . . . , on) ∝ P(X1) · P(o1 | X1) · ∏_{i=2}^{n} P(Xi | Xi−1) · P(oi | Xi)
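The joint factorization can be sketched numerically. The 2-state, 3-symbol parameters below are made-up illustrative numbers:

```python
import numpy as np

# P(x, o) = P(x1) P(o1|x1) * prod over i of P(xi|x_{i-1}) P(oi|xi).
pi = np.array([0.6, 0.4])               # P(X_1)
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])              # A[a, b] = P(X_i=b | X_{i-1}=a)
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])         # B[a, o] = P(O_i=o | X_i=a)

def joint(xs, os_):
    p = pi[xs[0]] * B[xs[0], os_[0]]
    for i in range(1, len(xs)):
        p *= A[xs[i - 1], xs[i]] * B[xs[i], os_[i]]
    return p

print(joint([0, 0], [0, 1]))  # 0.6 * 0.5 * 0.7 * 0.4 = 0.084
```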
Hidden Markov Model
A Hidden Markov Model is just a state space model where both Xi and Oi are discrete.
Our character example gives us an HMM.
Perhaps the most basic state space model is
Observation Equation
Ot = µt + vt
State Equation
µt = β0 + β1 µt−1 + wt
where both vt and wt are iid.
A time varying mean level observed with noise.
Note that in this case it is common to use the terms “state equation” and “observation equation”, but the equations are just a way of specifying the conditional distributions.
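A minimal simulation of this time-varying-mean model (Python sketch; the coefficients and noise scales are illustrative assumptions, not from the slides):

```python
import numpy as np

# State:       mu_t = b0 + b1 * mu_{t-1} + w_t
# Observation: O_t  = mu_t + v_t
rng = np.random.default_rng(1)
b0, b1, n = 0.5, 0.9, 300
mu = np.empty(n)
obs = np.empty(n)
mu[0] = b0 / (1 - b1)                    # start at the stationary mean
for t in range(n):
    if t > 0:
        mu[t] = b0 + b1 * mu[t - 1] + 0.2 * rng.normal()  # w_t
    obs[t] = mu[t] + 1.0 * rng.normal()                   # v_t
# obs wanders around the slowly moving level mu, plus observation noise
```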
4. Learning HMM from fully observable data
Applications analyzed using state space models can either have both Oi and Xi observed (fully observed), or have only Oi observed. In this second case the states are latent variables.
Our character example could be fully observed.
Our rocket example would have a latent state.
Our time varying mean would have a latent state.
In the fully observed case, estimation is relatively straightforward.
If we have parametrized models:
P(Xi | Xi−1, θs)
and

P(Oi | Xi, θo)
then
P(X, O | θs, θo) = P(X1) P(O1 | X1) ∏ P(Xi | Xi−1, θs) ∏ P(Oi | Xi, θo)
so the likelihood factorizes into a term for θs and a term for θo .
For an HMM the parameters are the Markov transition probabilities
P(Xi = a | Xi−1 = b)
and the observation probabilities
P(O = o | X = a)
and both of these can be estimated by simple counts.
Learning HMM from fully observable data
Have m data points
Each data point looks like:
I X1 = b, X2 = r, X3 = a, X4 = c, X5 = e
I O1 = image of b, O2 = image of r, O3 = image of a, O4 = image of c, O5 = image of e
Learning HMM from fully observable data
Learn 3 distributions
P(X1 = a) = Count(X1 = a) / m

P(Oi = 54 | Xi = a) = Count(saw letter a and its observation was 54) / Count(saw letter a)

P(Xi = b | Xi−1 = a) = Count(saw a letter b following an a) / Count(saw an a followed by something)
How many parameters do we have to learn?
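The count-based estimators above can be sketched as follows; the two toy (state, observation) sequences are invented for illustration:

```python
from collections import Counter

# Fully observed (states, observations) sequences, made up for the sketch.
seqs = [
    (["b", "r", "a"], ["img_b", "img_r", "img_a"]),
    (["b", "a", "a"], ["img_b", "img_a", "img_a"]),
]
m = len(seqs)

# P(X1 = a) = Count(X1 = a) / m
start = Counter(xs[0] for xs, _ in seqs)
p_start = {a: c / m for a, c in start.items()}

# P(Xi = b | Xi-1 = a) = Count(a followed by b) / Count(a followed by anything)
pairs = Counter((a, b) for xs, _ in seqs for a, b in zip(xs, xs[1:]))
from_a = Counter()
for (a, _b), c in pairs.items():
    from_a[a] += c
p_trans = {(a, b): c / from_a[a] for (a, b), c in pairs.items()}

print(p_start["b"])         # both sequences start with "b" -> 1.0
print(p_trans[("b", "a")])  # of the 2 transitions out of "b", 1 goes to "a"
```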
5. Learning parameters with hidden states
Learning the parameters with latent states is more difficult.
There are many approaches.
For example, the Baum-Welch algorithm takes an EM approach.
6. Possible inference tasks in an HMM
Given an estimated state-space model, there are a variety of interesting questions we can ask.
What are the future outcomes?
What are the past states?
Possible inference tasks in an HMM
Evaluation
Given HMM parameters and an observation sequence {oi}_{i=1}^{5}, find the probability of the observation sequence
P(o1, . . . , o5)
Can be computed using the forward algorithm.
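A Python sketch of the forward algorithm, using casino-style parameters (restated here as assumptions) for concreteness:

```python
import numpy as np

# Forward algorithm: P(o_1,...,o_n) in O(n * K^2) time.
pi = np.array([0.5, 0.5])
A = np.array([[0.99, 0.01],
              [0.02, 0.98]])                     # Fair/Unfair transitions
B = np.vstack([np.full(6, 1 / 6),                # Fair die
               np.array([0.1] * 5 + [0.5])])     # Unfair die

def forward_prob(obs):
    """obs: die outcomes in 1..6; returns P(o_1, ..., o_n)."""
    alpha = pi * B[:, obs[0] - 1]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o - 1]
    return alpha.sum()

print(forward_prob([6, 6, 6]))  # a run of sixes: far more likely than (1/6)^3
```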
Possible inference tasks in an HMM
Decoding
Marginal probability of a hidden variable
P(Xi = a | o1, o2, . . . , on)
Can be computed using the forward-backward algorithm.

I linear in the length of the sequence, because the HMM is a tree
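The forward-backward recursions can be sketched as follows; the same illustrative casino-style parameters are assumed:

```python
import numpy as np

# Smoothing: P(X_t = a | o_1, ..., o_n) via forward (alpha) and backward (beta).
pi = np.array([0.5, 0.5])
A = np.array([[0.99, 0.01], [0.02, 0.98]])
B = np.vstack([np.full(6, 1 / 6), np.array([0.1] * 5 + [0.5])])

def smooth(obs):
    n, K = len(obs), 2
    alpha = np.empty((n, K))
    beta = np.empty((n, K))
    alpha[0] = pi * B[:, obs[0] - 1]
    for t in range(1, n):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t] - 1]
    beta[-1] = 1.0
    for t in range(n - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1] - 1] * beta[t + 1])
    g = alpha * beta
    return g / g.sum(axis=1, keepdims=True)   # row t is P(X_t | o_1..n)

g = smooth([6, 6, 6, 6, 6])
print(g[2])   # middle of a run of sixes: Unfair is the likelier state
```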
Possible inference tasks in an HMM
Viterbi decoding
Most likely trajectory for hidden vars
max_{x1,...,xn} P(X1 = x1, . . . , Xn = xn | o1, . . . , on)
I most likely word that generated the images
I very similar to the forward-backward algorithm
Not the same as decoding
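A Viterbi sketch in Python under the same assumed casino-style parameters (in log space for numerical stability):

```python
import numpy as np

# Viterbi: most likely hidden trajectory argmax_x P(x | o).
pi = np.array([0.5, 0.5])
A = np.array([[0.99, 0.01], [0.02, 0.98]])
B = np.vstack([np.full(6, 1 / 6), np.array([0.1] * 5 + [0.5])])

def viterbi(obs):
    n, K = len(obs), 2
    logd = np.log(pi) + np.log(B[:, obs[0] - 1])
    back = np.zeros((n, K), dtype=int)
    for t in range(1, n):
        cand = logd[:, None] + np.log(A)      # cand[a, b]: come from a, go to b
        back[t] = cand.argmax(axis=0)         # best predecessor for each state b
        logd = cand.max(axis=0) + np.log(B[:, obs[t] - 1])
    path = [int(logd.argmax())]
    for t in range(n - 1, 0, -1):             # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]                         # 0 = Fair, 1 = Unfair

print(viterbi([6, 6, 6, 6, 6, 6, 6, 6]))
```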
Most likely state vs. most likely trajectory

Most likely state at position i:

argmax_a P(Xi = a | o1, o2, . . . , on)

Most likely assignment of the state trajectory:

max_{x1,...,xn} P(X1 = x1, . . . , Xn = xn | o1, . . . , on)
Solution not the same!
x  y  P(x, y)
0  0  0.35
0  1  0.05
1  0  0.30
1  1  0.30
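The point can be verified numerically with the joint distribution P(x, y) from the table (0.35, 0.05, 0.30, 0.30):

```python
import numpy as np

# The marginal argmax disagrees with the joint argmax.
P = np.array([[0.35, 0.05],   # P[x, y]
              [0.30, 0.30]])

best_joint = np.unravel_index(P.argmax(), P.shape)
best_marginal_x = P.sum(axis=1).argmax()   # P(x=0) = 0.40, P(x=1) = 0.60

print(best_joint)       # (0, 0) has the highest joint probability
print(best_marginal_x)  # but x = 1 is the more likely state marginally
```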
7. The Dishonest Casino from the HMM R-package
An unfair casino has two states, Fair and Unfair.

The toss of a die is our outcome, O ∈ {1, 2, 3, 4, 5, 6}, and X ∈ {Fair, Unfair}.
In the Fair state, P(Oi = k | Xi = Fair) = 1/6 , k = 1, 2, 3, 4, 5, 6.
In the Unfair state, P(Oi = 6 | Xi = Unfair) = .5 andP(Oi = k | Xi = Unfair) = .1, k = 1, 2, 3, 4, 5.
Parameters of the model
transProbs: P(Xi | Xi−1)
emissionProbs: P(Oi | Xi)
startProbs: P(X1)
##states
States = c("Fair", "Unfair")
## symbols (outcomes)
Symbols = 1:6
##distributions
transProbs = matrix(c(0.99, 0.01, 0.02, 0.98), c(length(States),
length(States)), byrow = TRUE)
emissionProbs = matrix(c(rep(1/6, 6), c(rep(0.1, 5), 0.5)),
c(length(States), length(Symbols)), byrow = TRUE)
startProbs=c(.5,.5)
##name the rows and columns
row.names(transProbs)=States
colnames(transProbs)=States
row.names(emissionProbs)=States
colnames(emissionProbs)=as.character(Symbols)
transProbs: P(Xi | Xi−1)
emissionProbs: P(Oi | Xi)
startProbs: P(X1)
> States
[1] "Fair" "Unfair"
> Symbols
[1] 1 2 3 4 5 6
> transProbs
Fair Unfair
Fair 0.99 0.01
Unfair 0.02 0.98
> emissionProbs
1 2 3 4 5 6
Fair 0.1666667 0.1666667 0.1666667 0.1666667 0.1666667 0.1666667
Unfair 0.1000000 0.1000000 0.1000000 0.1000000 0.1000000 0.5000000
> startProbs
[1] 0.5 0.5
So, for example, the probability of going from state=Fair to state=Unfair is .01: P(Xi = Unfair | Xi−1 = Fair) = .01.
The probability P(Oi = 6 | Xi = Unfair) = .5.
Simulate from the HMM model:
library(HMM)
hmm = initHMM(States, Symbols, startProbs=startProbs,
transProbs = transProbs, emissionProbs = emissionProbs)
nSim = 2000
set.seed(99)
sim = simHMM(hmm, nSim)
[Figure: the simulated sequence of length 2000. Top panel: state Xi over the index (0 = Fair, 1 = Unfair). Bottom panel: die outcome Oi (1–6), with color indicating Xi (black = Fair, red = Unfair).]
Use Baum-Welch to estimate the model starting at the true values:
e1 = baumWelch(hmm,sim$observation)
print(e1)
$hmm
$hmm$States
[1] "Fair" "Unfair"
$hmm$Symbols
[1] 1 2 3 4 5 6
$hmm$startProbs
Fair Unfair
0.5 0.5
$hmm$transProbs
to
from Fair Unfair
Fair 0.98896977 0.01103023
Unfair 0.01830519 0.98169481
$hmm$emissionProbs
symbols
states 1 2 3 4 5 6
Fair 0.1589834 0.1646821 0.16786499 0.17041688 0.1781368 0.1599158
Unfair 0.1001766 0.1053668 0.08249513 0.09131092 0.1253341 0.4953165
Pretty good!
Start at bad starting values:

itransProbs = matrix(c(0.5, 0.5, 0.5, 0.5), c(length(States),
length(States)), byrow = TRUE)
iemissionProbs = matrix(c(rep(1/6, 6), rep(1/6, 6)),
c(length(States), length(Symbols)), byrow = TRUE)
ihmm3 = initHMM(States, Symbols, transProbs = itransProbs, emissionProbs = iemissionProbs)
e3 = baumWelch(ihmm3,sim$observation)
print(e3)
$hmm
$hmm$States
[1] "Fair" "Unfair"
$hmm$Symbols
[1] 1 2 3 4 5 6
$hmm$startProbs
Fair Unfair
0.5 0.5
$hmm$transProbs
to
from Fair Unfair
Fair 0.5 0.5
Unfair 0.5 0.5
$hmm$emissionProbs
symbols
states 1 2 3 4 5 6
Fair 0.136 0.1415 0.1345 0.1395 0.1575 0.291
Unfair 0.136 0.1415 0.1345 0.1395 0.1575 0.291
Pretty bad.
Start at true transition, but bad emission.
> ihmm4 = initHMM(States, Symbols, transProbs = transProbs, emissionProbs = iemissionProbs)
> e4 = baumWelch(ihmm4,sim$observation)
> print(e4)
$hmm
$hmm$States
[1] "Fair" "Unfair"
$hmm$Symbols
[1] 1 2 3 4 5 6
$hmm$startProbs
Fair Unfair
0.5 0.5
$hmm$transProbs
to
from Fair Unfair
Fair 0.98896977 0.01103023
Unfair 0.01830519 0.98169481
$hmm$emissionProbs
symbols
states 1 2 3 4 5 6
Fair 0.1589834 0.1646821 0.16786499 0.17041688 0.1781368 0.1599158
Unfair 0.1001766 0.1053668 0.08249513 0.09131092 0.1253341 0.4953165
Pretty good!
Use Viterbi to infer states:
estates = viterbi(e4$hmm,sim$observation)
print(table(estates,sim$states))
estates Fair Unfair
Fair 1162 104
Unfair 67 667
State estimation:
[Three plots over index 0-2000: the observed die outcomes (black = wrong estimate, red = correct estimate), the true states (0/1), and the estimated states (0/1).]
It seems to miss the quick back-and-forths.
41
8. Stochastic Volatility as an HMM
An important model in finance is the stochastic volatility model.
We use a state space model for a time-varying variance.
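For concreteness, a common discrete-time stochastic volatility specification (the slides do not spell one out, so treat this as a generic sketch) lets a latent log-variance h_t follow an AR(1):

```latex
y_t = e^{h_t/2}\,\varepsilon_t, \qquad \varepsilon_t \sim N(0,1),
\qquad
h_t = \mu + \phi\,(h_{t-1} - \mu) + \eta_t, \qquad \eta_t \sim N(0,\sigma_\eta^2).
```

Here the hidden state is continuous; the slides make the problem HMM-friendly by discretizing the absolute returns into a few groups.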
42
A fit to some data:
43
Our Data:
A time series plot (x axis is week number) of the weekly returns from an equally weighted portfolio of 50 stocks from the SP500.
[Figure: weekly returns of equally weighted 50 from SP500, plotted against index 0-2500; y ranges roughly −0.06 to 0.06.]
[Figure: absolute weekly returns of equally weighted 50 from SP500, plotted against index 0-2500; y ranges 0.00 to 0.06.]
The volatility varies noticeably! It is very hard to see any pattern in the mean level over time.
44
Let’s model the volatility by discretizing the absolute returns into three groups.
qv = quantile(y, probs=c(.3,.7,1))    # 30%, 70%, and 100% quantiles of |returns|
yc = cut(y, c(0,qv))                  # discretize into three bins
levels(yc) = c("low","mid","high")
hist(y)
for(i in 1:3) abline(v=qv[i], lwd=3)  # mark the cut points
We cut at the 30% and 70% quantiles giving three groups.
yc is our O.
45
Here are the groups.
[Histogram of y (frequency vs. y, 0.00-0.06) with vertical lines at the cut points.]
Our O_i will just be which of the three groups the return is in.
46
Fit the HMM.
Symbols = levels(yc)                  # "low", "mid", "high"
States = c("Lvol","Hvol")
transProbs = matrix(c(.9,.1,.1,.9), nrow=2, byrow=T)
emissionProbs = matrix(c(.5,.4,.1,.1,.3,.6), nrow=2, byrow=T)
hmm = initHMM(States, Symbols, transProbs = transProbs,
              emissionProbs = emissionProbs)      # no startProbs: defaults to uniform
ehmm = baumWelch(hmm, as.character(yc))           # estimate the parameters
estates = viterbi(ehmm$hmm, as.character(yc))     # most likely state sequence
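For reference, baumWelch repeatedly runs the forward-backward recursions; with start probabilities π_j, transition probabilities a_ij, and emission probabilities b_j(·), the forward pass computes

```latex
\alpha_1(j) = \pi_j\, b_j(O_1), \qquad
\alpha_t(j) = b_j(O_t)\sum_i \alpha_{t-1}(i)\, a_{ij}.
```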
47
Initial HMM:
> hmm
$States
[1] "Lvol" "Hvol"
$Symbols
[1] "low" "mid" "high"
$startProbs
Lvol Hvol
0.5 0.5
$transProbs
to
from Lvol Hvol
Lvol 0.9 0.1
Hvol 0.1 0.9
$emissionProbs
symbols
states low mid high
Lvol 0.5 0.4 0.1
Hvol 0.1 0.3 0.6
48
Estimated HMM:
> ehmm
$hmm
$hmm$States
[1] "Lvol" "Hvol"
$hmm$Symbols
[1] "low" "mid" "high"
$hmm$startProbs
Lvol Hvol
0.5 0.5
$hmm$transProbs
to
from Lvol Hvol
Lvol 0.990995061 0.009004939
Hvol 0.006428351 0.993571649
$hmm$emissionProbs
symbols
states low mid high
Lvol 0.4042614 0.4592877 0.1364509
Hvol 0.2293875 0.3595039 0.4111086
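One way to read the estimated transition matrix: a regime with self-transition probability p lasts 1/(1 − p) periods in expectation, so both regimes are highly persistent:

```latex
\frac{1}{0.009004939} \approx 111 \text{ weeks (Lvol)}, \qquad
\frac{1}{0.006428351} \approx 156 \text{ weeks (Hvol)}.
```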
49
Here is the fit.
[Figure: absolute weekly returns (0.00-0.06) vs. index 0-2500, overlaid with the estimated volatility states (0 = Lvol, 1 = Hvol).]
Might be interesting with more groups!
50
9. Summary
Useful for modeling sequential data with few parameters, using discrete hidden states that satisfy the Markov assumption.
Lots of applications:
Speech, OCR, finance
Representation:
- initial prob, transition prob, emission prob
- parameter sharing: only need to learn 3 distributions
51
Summary
Algorithms for inference and learning in HMMs:
- Computing the marginal likelihood of the observed sequence: forward algorithm
- Predicting a single hidden state: forward-backward
- Predicting an entire sequence of hidden states: Viterbi
- Learning HMM parameters:
  - hidden states observed: simple counting
  - otherwise: Baum-Welch algorithm (an instance of an EM algorithm)
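As an illustrative sketch of the Viterbi recursion (function and variable names are mine, not the HMM R-package's), computed in log-space to avoid underflow:

```python
import math

def viterbi(obs, start, trans, emit):
    """Most probable hidden state path for a discrete HMM.

    obs   : list of observation indices
    start : start[i]    = P(state_0 = i)
    trans : trans[i][j] = P(state_{t+1} = j | state_t = i)
    emit  : emit[i][k]  = P(obs = k | state = i)
    """
    n = len(start)
    # delta[j]: best log-probability of any path ending in state j at time t
    delta = [math.log(start[j]) + math.log(emit[j][obs[0]]) for j in range(n)]
    back = []  # back[t][j]: best predecessor of state j at step t
    for o in obs[1:]:
        ptr, new = [], []
        for j in range(n):
            scores = [delta[i] + math.log(trans[i][j]) for i in range(n)]
            best = max(range(n), key=lambda i: scores[i])
            ptr.append(best)
            new.append(scores[best] + math.log(emit[j][o]))
        delta, back = new, back + [ptr]
    # backtrack from the best final state
    path = [max(range(n), key=lambda j: delta[j])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Using the slides' initial HMM: states (Lvol, Hvol), symbols (low, mid, high)
path = viterbi([0, 0, 2, 2],
               [0.5, 0.5],
               [[0.9, 0.1], [0.1, 0.9]],
               [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
# path == [0, 0, 1, 1]: two Lvol steps followed by two Hvol steps
```

The sticky transition matrix (0.9 on the diagonal) is what makes the decoded path smooth, which is also why the casino fit misses quick back-and-forths.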
52