ARX Model estimation
Transcript of ARX Model estimation
ARMAX for System Identification
Dr. Robert Oates
(Based on notes written by Prof. William Harwin)
1
Summary
Systems Identification
Least Squares Linear Regression
Example – How many Harrys are
Pottering?
ARMAX Models and Least Squares Linear
Regression
Common ARMAX Variants
2
WARNING – MATHS!
I tend to avoid using equations in
presentations
It is totally unavoidable here
3
Systems Identification and Modelling
[Block diagram: the input (u) drives both the real system (whose output is corrupted by noise) and the system model; the model's output is subtracted from the real output (y) to give the modelling error (e), which an adaptive component uses to reparameterise the model.]
4
Systems Identification and State Estimation
There is an obvious connection between:
◦ Identifying unknown parameters that relate input to output (Systems ID)
ARMAX
ARMA
ARX
◦ Identifying unknown parameters that represent a system’s state (State Estimation)
Particle Filters
Kalman Filters
5
Systems Identification and State Estimation
In fact, both sets of algorithms use Gauss'
Least Squares Linear Regression
This is an algorithm designed to minimise
the errors for a given function
6
Least Squares Linear Regression
Notation
◦ y – The matrix of observed outputs of a system
◦ U – The inputs to the system
◦ θ – The system model parameters
◦ ŷ – The estimate of y based on a system model and the inputs to the system
7
Least Squares Linear Regression
Calculating the estimate of the output
\hat{y} = U\theta
8
Least Squares Linear Regression
Calculating the model error:
e = y - \hat{y}
Calculating the sum-of-squares error (SSE):
SSE = e^T e = \sum_{i=1}^{n} e_i^2
9
Least Squares Linear Regression
Substituting in the definition of e:
SSE = (y - \hat{y})^T (y - \hat{y})
Substituting in the definition of \hat{y} and transposing the 1st bracket:
SSE = (y^T - \theta^T U^T)(y - U\theta)
10
Least Squares Linear Regression
Expanding and rearranging:
SSE = y^T y - y^T U (U^T U)^{-1} U^T y + (\theta - (U^T U)^{-1} U^T y)^T U^T U (\theta - (U^T U)^{-1} U^T y)
11
Least Squares Linear Regression
Expanding and rearranging:
SSE = K + x^T U^T U x
where
K = y^T y - y^T U (U^T U)^{-1} U^T y
x = \theta - (U^T U)^{-1} U^T y
12
Least Squares Linear Regression
But wait!
If x = \theta - (U^T U)^{-1} U^T y is the only thing we can change, the smallest possible value of the error is when x = 0, i.e.
\theta = (U^T U)^{-1} U^T y
13
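The closed form above is easy to try directly. The sketch below is my own illustration (not from the slides), using only standard-library Python: for a two-parameter model the 2×2 inverse in the normal equation can be written out by hand.

```python
# Least squares via the normal equation theta = (U^T U)^{-1} U^T y,
# specialised to a two-parameter model so the 2x2 inverse is explicit.

def lse_2param(U, y):
    # Entries of the 2x2 matrix U^T U and the 2-vector U^T y.
    s00 = sum(u[0] * u[0] for u in U)
    s01 = sum(u[0] * u[1] for u in U)
    s11 = sum(u[1] * u[1] for u in U)
    t0 = sum(u[0] * yi for u, yi in zip(U, y))
    t1 = sum(u[1] * yi for u, yi in zip(U, y))
    # Invert U^T U (2x2 closed form) and multiply by U^T y.
    det = s00 * s11 - s01 * s01
    return ((s11 * t0 - s01 * t1) / det,
            (s00 * t1 - s01 * t0) / det)

# Data lying exactly on the line y = 2x + 1, with U rows [x, 1]:
U = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0), (3.0, 1.0)]
y = [1.0, 3.0, 5.0, 7.0]
theta = lse_2param(U, y)  # recovers (2.0, 1.0)
```

With noise-free data the estimate recovers the generating parameters exactly; with noisy data it returns the SSE-minimising ones.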
LSE Example
Mr Gauss, meet Mr Potter
The Office for National Statistics keeps track of all the baby names for certain years (1998, 1999, 2007, 2008, 2009)
In 1997 "Harry Potter and the Philosopher's Stone" was released
Assuming that this caused exponential growth in the number of "Harrys", can we use the data to predict 2009's result?
14
Building the Model
Let's assume that the number is growing exponentially and use the following model:
y = e^{U\theta}, i.e. \log_e(y) = U\theta
Let's also use two model parameters, making
\theta = [b_1 \; b_0]^T
15
Building the Model
As the dimensionality of U and \theta has to match, we pad U with 1s:
U = \begin{bmatrix} 2008 & 1 \\ 2007 & 1 \\ 1999 & 1 \\ 1998 & 1 \end{bmatrix}
16
Building the Model
To use the standard formulation of the LSE we'll linearise the data by taking logs:
y = \begin{bmatrix} 6008 \\ 5851 \\ 4914 \\ 4761 \end{bmatrix}, \quad y' = \log_e(y) = \begin{bmatrix} 8.7008477 \\ 8.674368 \\ 8.499844 \\ 8.468213 \end{bmatrix}
17
Calculating the Parameters
Using the equation
\theta = (U^T U)^{-1} U^T y'
we can calculate the values of b_0 and b_1 that minimise the error:
\theta = \begin{bmatrix} 0.02269839 \\ -36.8790619 \end{bmatrix}
18
Example R Code
19
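The R code on this slide has not survived in the transcript. A standard-library Python equivalent of the fit and the 2009 prediction (variable names are mine) might look like this:

```python
import math

# Fit log(harrys) = b1*year + b0 by least squares, then predict 2009.
years = [2008, 2007, 1999, 1998]
harrys = [6008, 5851, 4914, 4761]
logs = [math.log(h) for h in harrys]   # linearise by taking logs

n = len(years)
st = sum(years)                         # sum of years
stt = sum(t * t for t in years)         # sum of squared years
sy = sum(logs)                          # sum of log counts
sty = sum(t * y for t, y in zip(years, logs))

# Two-parameter normal equations solved with an explicit 2x2 inverse.
det = n * stt - st * st
b1 = (n * sty - st * sy) / det          # slope, approx 0.0227
b0 = (stt * sy - st * sty) / det        # intercept, approx -36.88

# Undo the log transform to get the predicted count for 2009.
prediction = math.exp(b1 * 2009 + b0)   # approx 6136
```

The prediction of about 6136 matches the slide; the actual 2009 figure quoted later is 6143.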
Predicting the Future
Using the optimal model parameters we can estimate how many "Harrys" will be born in 2009:
\hat{y}_{2009} = e^{[2009 \; 1]\theta} \approx 6136
Actual number of Harrys from 2009: 6143 (from the Office for National Statistics)
20
ARMAX and Recursive Least Squares
For modelling generic systems ARMAX is the standard
◦ AR – AutoRegressive: the current output has a relationship to the previous values of the output
◦ MA – Moving Average: the noise model used
◦ X – eXogenous inputs: the system relies not only on the current value of the input, but the history of inputs
21
ARMAX and Recursive Least Squares
An ARMAX Model:
y_i = a_1 y_{i-1} + a_2 y_{i-2} + \dots + a_{s_a} y_{i-s_a} + b_1 u_{i-1} + b_2 u_{i-2} + \dots + b_{s_b} u_{i-s_b} + c_1 e_{i-1} + c_2 e_{i-2} + \dots + c_{s_c} e_{i-s_c} + e_i
22
ARMAX and Recursive Least Squares
A single observation can be written as
y_i = \varphi_i^T \theta
where
\varphi_i = [y_{i-1} \; y_{i-2} \; \dots \; u_{i-1} \; u_{i-2} \; \dots \; e_{i-1} \; e_{i-2} \; \dots]^T
\theta = [a_1 \; a_2 \; \dots \; b_1 \; b_2 \; \dots \; c_1 \; c_2 \; \dots]^T
Collecting all of these instances together gives:
y = \Phi\theta + e
23
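To make the stacking concrete, here is a small hypothetical example of my own (not from the slides): a noise-free first-order ARX system is simulated, the regressor rows φ_i = [y_{i-1}, u_{i-1}] are collected into Φ, and the batch least squares solution recovers the true parameters.

```python
# Identify a first-order ARX model y[i] = a1*y[i-1] + b1*u[i-1]
# by stacking rows phi_i = [y[i-1], u[i-1]] into Phi and solving
# theta = (Phi^T Phi)^{-1} Phi^T y.

a1_true, b1_true = 0.5, 2.0

# Simulate the system (noise-free, so the estimate is exact).
u = [1.0, -0.5, 0.3, 0.8, -1.0, 0.6, 0.2, -0.4]
y = [0.0]
for i in range(1, len(u)):
    y.append(a1_true * y[i - 1] + b1_true * u[i - 1])

Phi = [(y[i - 1], u[i - 1]) for i in range(1, len(u))]
targets = y[1:]

# Normal equations for the two parameters (explicit 2x2 inverse).
s00 = sum(p[0] * p[0] for p in Phi)
s01 = sum(p[0] * p[1] for p in Phi)
s11 = sum(p[1] * p[1] for p in Phi)
t0 = sum(p[0] * t for p, t in zip(Phi, targets))
t1 = sum(p[1] * t for p, t in zip(Phi, targets))
det = s00 * s11 - s01 * s01
a1_hat = (s11 * t0 - s01 * t1) / det   # recovers 0.5
b1_hat = (s00 * t1 - s01 * t0) / det   # recovers 2.0
```

With the noise (MA) terms included as well, the same construction applies but the e terms in φ_i must be estimated, which the recursive formulation below addresses.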
ARMAX and Recursive Least Squares
Using the normal LSE for an ARMAX model:
\theta = (\Phi^T \Phi)^{-1} \Phi^T y
But there are problems:
◦ Every time a new observation is added we have to recalculate the entire thing
◦ We're performing a matrix inversion on a big matrix – VERY costly
24
ARMAX and Recursive Least Squares
Recursion to the rescue! Can we rephrase the update into a recursive form?
\theta = (\Phi^T \Phi)^{-1} \Phi^T y
becomes
\theta_n = \theta_{n-1} + \text{Correction}
25
ARMAX and Recursive Least Squares
Kalman of "Kalman Filters" fame provides us with:
\theta_n = \theta_{n-1} + K_n \epsilon_n
where:
K_n is the Kalman Gain
\epsilon_n = y_n - \hat{y}_n
26
ARMAX and Recursive Least Squares
But there's still a problem!
New equations:
\hat{y}_n = \varphi_n^T \theta_{n-1}
\theta_n = \theta_{n-1} + K_n (y_n - \hat{y}_n)
K_n = P_n \varphi_n
P_n = (P_{n-1}^{-1} + \varphi_n \varphi_n^T)^{-1}
We're still performing a massive matrix inversion
27
ARMAX and Recursive Least Squares
However, there is the Matrix Inversion Lemma, which allows us to convert this inversion into a more manageable form:
P_n = (P_{n-1}^{-1} + \varphi_n \varphi_n^T)^{-1} = P_{n-1} - \frac{P_{n-1} \varphi_n \varphi_n^T P_{n-1}}{1 + \varphi_n^T P_{n-1} \varphi_n}
28
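A minimal sketch of the resulting algorithm (my own illustration, not from the slides): in the single-parameter case P and K are plain numbers, so the lemma's update reduces to ordinary division and the recursion is easy to see.

```python
# Recursive least squares for a one-parameter model y = theta*u + e.
# Scalar case: P and the Kalman gain K are numbers, so the matrix
# inversion lemma reduces to a division.

theta = 0.0   # initial parameter estimate
P = 1e6       # large initial "covariance": little trust in theta

# Noise-free (u, y) pairs generated with theta_true = 3.
data = [(1.0, 3.0), (2.0, 6.0), (0.5, 1.5), (-1.0, -3.0)]

for u, y in data:
    K = P * u / (1.0 + u * P * u)        # Kalman gain
    theta = theta + K * (y - u * theta)  # correct with the prediction error
    P = P - K * u * P                    # lemma form: no explicit inversion
```

Each new observation updates the estimate in constant time, instead of refitting the whole batch.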
Noise
The previous model assumes we know
what the noise terms are
This is plainly ridiculous!
However, we already compute an
estimate of the noise
We can simply substitute this in for e_n
29
Variants of ARMAX
There are two important variants of
ARMAX
◦ ARMAX with Forgetting
◦ Instrumental ARMAX
30
ARMAX With Forgetting
For when the system’s properties are slowly
shifting
Allows less emphasis to be placed on older
observations
All equations stay the same except for P
Introduce a "forgetting factor" (λ) between 0 and 1:
P_n = \frac{1}{\lambda}\left(P_{n-1} - \frac{P_{n-1} \varphi_n \varphi_n^T P_{n-1}}{\lambda + \varphi_n^T P_{n-1} \varphi_n}\right)
31
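A scalar sketch of forgetting in action (my own illustration): the true parameter jumps half way through the data, and discounting old observations lets the estimate follow it.

```python
# RLS with a forgetting factor (scalar, one-parameter sketch):
# observations are discounted geometrically by lam, so the estimate
# can track a parameter that drifts or jumps over time.

lam = 0.9     # forgetting factor, between 0 and 1
theta = 0.0
P = 1e6

# The true parameter jumps from 2 to 5 half way through (u = 1 throughout).
samples = [(1.0, 2.0)] * 30 + [(1.0, 5.0)] * 30

for u, y in samples:
    K = P * u / (lam + u * P * u)
    theta = theta + K * (y - u * theta)
    P = (P - K * u * P) / lam   # dividing by lam keeps P from collapsing

# theta ends near 5, having forgotten the early "theta = 2" evidence.
```

With lam = 1 this reduces to ordinary RLS, which would average the two regimes instead of tracking the jump.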
Instrumental ARMAX
ARMAX assumes that the noise is not changing in proportion to the input
Not true for many systems
Introduce an instrumental variable that allows us to avoid the results becoming skewed by a false assumption of independence
32
Instrumental Variables - Example
Smoking is correlated with poor health
But can we be sure that this is a causal relationship?
◦ What if there is something which causes
both?
◦ What if poor health causes smoking!?
33
Instrumental Variables - Example
[Diagram: Smoking and Health, with "Watching Television Adverts" and "Changing The Taxes on Smoking" shown as outside influences on Smoking.]
34
Instrumental Variables and
Estimators
[Diagram: Output = F(Input, Noise); when the noise itself depends on the input, an instrumental variable correlated with the input but not with the noise stands in for the input.]
35
Instrumental ARMAX
Two of the standard ARMAX Equations
change
36
Instrumental ARMAX
Two equations change:
K_n = P_n z_n
P_n = P_{n-1} - \frac{P_{n-1} z_n \varphi_n^T P_{n-1}}{1 + \varphi_n^T P_{n-1} z_n}
37
Instrumental ARMAX
All that remains is to find values for z that
are not correlated with the noise, but are
correlated with the input
A popular choice is an estimate of the
current output which uses old model
parameters – so unaffected by the latest
noise
38
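A tiny numerical illustration of the idea (my own construction, not from the slides): when the input is contaminated by the noise, ordinary least squares is biased, while the instrumental estimate built from z recovers the true parameter.

```python
import random

# y = theta*u + e, but u is correlated with the noise e, so the
# ordinary least squares estimate sum(u*y)/sum(u*u) is biased.
# An instrument z (correlated with u, independent of e) fixes it.
random.seed(0)

theta_true = 2.0
n = 4000
z = [random.gauss(0.0, 1.0) for _ in range(n)]   # the instrument
e = [random.gauss(0.0, 1.0) for _ in range(n)]   # noise
u = [zi + 0.8 * ei for zi, ei in zip(z, e)]      # input, contaminated by e
y = [theta_true * ui + ei for ui, ei in zip(u, e)]

# Ordinary least squares: biased, because E[u*e] != 0.
theta_ols = sum(ui * yi for ui, yi in zip(u, y)) / sum(ui * ui for ui in u)

# Instrumental variable estimate: consistent, because E[z*e] = 0.
theta_iv = sum(zi * yi for zi, yi in zip(z, y)) / sum(zi * ui for zi, ui in zip(z, u))
```

Here theta_ols lands noticeably above 2 while theta_iv sits close to it, which is exactly the skew the instrumental variant of ARMAX is designed to avoid.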
Summary
LSE is an effective tool for estimating
parameters that you can’t directly
measure
ARMAX is a generalised model for
discrete, time-varying systems
Many different variants and techniques
available to address a variety of problems
39