TIME SERIES ANALYSIS · 08/03/2016 · J.MURO
LECTURE 8: ARIMA MODELS: SEASONAL MODELS (1).
Seasonal models.
• Concepts recap.
• What type of information suggests building a seasonal model?
Seasonal Processes.
• Univariate Time Series Analysis (TSA).
• Objective: forecasting.
• Wold (1938):
◦ Any stationary process can be uniquely represented as
Y(t) ≡ Y_t = D_t + X_t,
◦ where D_t is linearly deterministic (usually the mean) and X_t is a stochastic, uncorrelated MA(∞) process.
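As an illustrative sketch (not from the slides), the Wold representation can be checked numerically for a stationary AR(1): Y_t = φY_{t-1} + ε_t has MA(∞) weights ψ_j = φ^j, so a truncated MA sum reproduces the recursion.

```python
# Sketch: Wold's MA(inf) representation of a stationary AR(1).
# All names here are illustrative, not from the lecture.
import random

random.seed(0)
phi, n = 0.6, 300
eps = [random.gauss(0.0, 1.0) for _ in range(n)]

# Recursive AR(1), started at zero.
y = [0.0] * n
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

def ma_inf(t, k=50):
    # psi_j = phi**j are the MA(inf) weights; truncate after k terms.
    return sum(phi**j * eps[t - j] for j in range(min(k, t)))

# The truncated MA representation matches the recursion closely,
# since phi**50 is negligible.
assert abs(y[n - 1] - ma_inf(n - 1)) < 1e-6
```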
Seasonal information.
• Real socioeconomic TS data are collected at a sub-annual frequency: they capture the information of annual TS variables, but for shorter periods. Example: unemployment (EPA vs. INEM).
• Seasonal TS show a seasonal pattern: variations in the level of the series that recur, at the same period, every year. A repeated pattern dependent on the seasons.
• Examples: quarterly data; monthly data.
Inbound Tourism (Spanish monthly data)
[Figure: line plot of monthly inbound tourism to Spain, 1996–2001 (quarters I–IV); vertical axis from 2,000,000 to 11,000,000.]
8.1. ARIMA(p,d,q)×(P,D,Q)s models.
• TS models for quarterly or monthly data present non-seasonal (regular) and seasonal components.
• We will use multiplicative processes.
• For simplicity of exposition, we first analyze "pure" seasonal processes.
Formal expression:
Φs(L)[1 − L^s]^D Y_t = Θs(L) ε_t
where:
Φs(L) = 1 − φ1 L^s − φ2 L^(2s) − ... − φP L^(Ps)
Θs(L) = 1 + θ1 L^s + θ2 L^(2s) + ... + θQ L^(Qs)
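A minimal simulation sketch of the simplest case of this expression, a pure seasonal AR(1)_4 with D = 0, i.e. (1 − φL^4)Y_t = ε_t (parameter values chosen for illustration):

```python
# Sketch: simulate (1 - phi*L^4) Y_t = eps_t, i.e. Y_t = phi*Y_{t-4} + eps_t.
import random

random.seed(1)
s, phi, n = 4, 0.7, 400
eps = [random.gauss(0.0, 1.0) for _ in range(n)]

y = [0.0] * n
for t in range(s, n):
    # each observation depends on the value one year (s periods) earlier
    y[t] = phi * y[t - s] + eps[t]

# |phi| < 1, so the process is stationary: the simulated path stays bounded.
assert max(abs(v) for v in y) < 50
```

The seasonal lag operator L^s links each quarter only to the same quarter of previous years, so the series behaves like s interleaved AR(1) processes.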
8.1.1. Modelling steps.
�Box-Jenkins (1970).
�Steps:
◦ Identification;
◦ Estimation;
◦ Diagnostic checking (validation); and
◦ Forecasting.
Modelling strategy (steps)
[Flowchart, Box-Jenkins (1970): Identification → Estimation → Diagnostic checking → is the model adequate? Yes → Forecasting; No → back to Identification.]
Identification
• Is the time series mean stationary in its seasonal component (D value)?
• Is the time series variance stationary?
• What is the order of the seasonal autoregressive process (P value)?
• What is the order of the seasonal moving average process (Q value)?
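The P and Q questions above are answered by inspecting the sample ACF/PACF at seasonal lags. A sketch of the sample ACF computation (a hypothetical helper, not from the slides):

```python
# Sketch of the identification step: sample ACF, inspected at seasonal lags.
# Spikes that tail off only at lags s, 2s, 3s, ... point to a seasonal AR
# component; a sharp cutoff at lag s*Q points to a seasonal MA component.

def sample_acf(y, max_lag):
    """Sample autocorrelations r_1 ... r_max_lag."""
    n = len(y)
    mean = sum(y) / n
    c0 = sum((v - mean) ** 2 for v in y) / n  # sample variance
    acf = []
    for k in range(1, max_lag + 1):
        ck = sum((y[t] - mean) * (y[t + k] - mean) for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf

# Example: a quarterly pattern repeated 25 times -> dominant spike at lag 4.
y = [1.0, 2.0, 3.0, 10.0] * 25
r = sample_acf(y, 8)
assert r[3] > r[0] and r[3] > r[1]  # lag 4 dominates lags 1 and 2
```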
8.1.2. ARMA(P,Q)s models.
• Some simple examples.
• AR(P)s.
• MA(Q)s.
• Mixed, ARMA(P,Q)s.
Some simple examples.
AR(1)4 with parameter equal to 0.7.
AR(1)4 with parameter equal to -0.7.
AR(1)4 with parameter equal to 1.
AR(1)12 with parameter equal to 0.8.
AR(1)12 with parameter equal to -0.8.
AR(1)12 with parameter equal to 1.
Some simple examples.
MA(1)4 with parameter equal to 0.5.
MA(1)4 with parameter equal to -0.5.
MA(1)12 with parameter equal to -0.6.
MA(1)12 with parameter equal to 0.6.
ARMA(1,1)4 with parameters 0.8 and 0.5, respectively.
ARMA(1,1)12 with parameters 0.7 and 0.6, respectively.
AR(P)s processes.
• Stationarity.
• Autocorrelation function.
• Partial autocorrelation function.
• Examples.
MA(Q)s processes.
• Invertibility.
• Autocorrelation function.
• Partial autocorrelation function.
• Examples.
ARMA(P,Q)s processes.
• Stationarity and invertibility.
• Autocorrelation function.
• Partial autocorrelation function.
• Examples.
AR(P)s: Stationarity.
• Absence of seasonal unit roots.
• The seasonal polynomial has multiple roots.
• In particular:
s = 4: 1 − L^4 = 0 implies:
L = 1; L = −1; L = i; L = −i.
s = 12: 1 − L^12 = 0 implies:
L = 1; L = −1; and the roots of the polynomials
1 + L^2; 1 − L + L^2; 1 + L + L^2; 1 − √3 L + L^2; 1 + √3 L + L^2.
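These seasonal unit roots can be verified numerically: they are the s-th roots of unity, all on the unit circle.

```python
# Numerical check of the seasonal unit roots (illustrative sketch).
import cmath

# s = 4: the four roots of 1 - L^4 = 0 are 1, -1, i, -i.
for root in [1, -1, 1j, -1j]:
    assert abs(1 - root**4) < 1e-12

# s = 12: the twelve roots of 1 - L^12 = 0 lie on the unit circle
# at angles 2*pi*k/12, k = 0..11.
roots12 = [cmath.exp(2j * cmath.pi * k / 12) for k in range(12)]
for root in roots12:
    assert abs(1 - root**12) < 1e-9

# Each quadratic factor's roots are among the twelve roots above,
# e.g. 1 + L^2 vanishes at L = i and L = -i.
assert abs(1 + 1j**2) < 1e-12
```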
AR(P)s: Autocorrelation Function.
• The sample ACF has spikes at lag s and its multiples: 2s, 3s, 4s, ...
• Considering only those seasonal lags, the ACF tails off.
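For the pure seasonal AR(1)_s case, the theoretical ACF behind this "tails off at seasonal lags" pattern is simple enough to write down directly (sketch with an illustrative parameter value):

```python
# Theoretical ACF of a pure seasonal AR(1)_s, Y_t = phi*Y_{t-s} + eps_t:
# rho(k*s) = phi**k at seasonal lags, 0 elsewhere -- spikes at s, 2s, 3s,
# ... that decay geometrically, exactly the "tails off" pattern above.
phi, s = 0.7, 4

def rho(lag):
    if lag % s == 0:
        return phi ** (lag // s)
    return 0.0

assert rho(4) == 0.7
assert abs(rho(8) - 0.49) < 1e-12
assert rho(5) == 0.0  # non-seasonal lags are uncorrelated
```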
AR(P)s: Partial Autocorrelation Function.
• The sample PACF cuts off at lag sP (it has no spikes at lags greater than sP).
• PACF values at lags that are multiples of s and less than or equal to sP depend on the parameter values of the process.
MA(Q)s: Autocorrelation Function.
• The sample ACF cuts off at lag sQ (it has no spikes at lags greater than sQ).
• ACF values at lags that are multiples of s and less than or equal to sQ depend on the parameter values of the process.
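The cutoff is again easy to see in the pure seasonal MA(1)_s case, Y_t = ε_t + θε_{t-s}, whose theoretical ACF has a single seasonal spike (sketch with an illustrative parameter value):

```python
# Theoretical ACF of a pure seasonal MA(1)_s, Y_t = eps_t + theta*eps_{t-s}:
# a single spike rho(s) = theta / (1 + theta**2), zero at every other lag --
# the "cutoff at lag s*Q" described above, with Q = 1.
theta, s = 0.5, 4

def rho(lag):
    if lag == s:
        return theta / (1 + theta**2)
    return 0.0

assert abs(rho(4) - 0.4) < 1e-12  # 0.5 / 1.25
assert rho(8) == 0.0              # cuts off after lag s
```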
MA(Q)s: Partial Autocorrelation Function.
• The sample PACF has spikes at lag s and its multiples: 2s, 3s, 4s, ...
• Considering only those seasonal lags, the PACF tails off (exponential or tapering sinusoidal/oscillating decay).
ARMA(P,Q)s: Autocorrelation Function.
• The ACF has no definite pattern up to lag sQ. For lags greater than sQ it behaves like that of an AR(P)s process.
ARMA(P,Q)s: Partial Autocorrelation Function.
• The PACF has no definite pattern up to lag sP. For lags greater than sP it behaves like that of an MA(Q)s process.