10. Basic Regressions with Time Series Data


Transcript of 10. Basic Regressions with Time Series Data

Page 1: 10. Basic Regressions with Time Series Data

10. Basic Regressions with Time Series Data

10.1 The Nature of Time Series Data
10.2 Examples of Time Series Regression Models
10.3 Finite Sample Properties of OLS Under Classical Assumptions
10.4 Functional Form, Dummy Variables, and Index Numbers
10.5 Trends and Seasonality

Page 2: 10. Basic Regressions with Time Series Data

10.1 The Nature of Time Series Data

-Time series data is any data that follows one observation (a location, person, etc.) over time.
-Temporal ordering is very important for time series data (higher-indexed observations correspond to more recent data).
-This is due to the fact that the past can affect the future but not the other way around.
-Recall that for cross-sectional data, ordering was of little importance.
-A sequence of random variables indexed by time is called a STOCHASTIC (random) PROCESS or TIME SERIES PROCESS.

Page 3: 10. Basic Regressions with Time Series Data

10.1 Random Time Series

How is time series data considered to be random?
1) We don’t know the future.
2) There are a variety of variables that impact the future.
3) Future outcomes are thus random variables.
-Each data point is one possible outcome, or realization.
-If certain conditions were different, the realization could have been different.
-But we don’t have a time machine to go back in time and obtain this realization.

Page 4: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-The simplest time series model, closest to cross-sectional models, is a STATIC MODEL relating two variables y and z:

$$y_t = \beta_0 + \beta_1 z_t + u_t, \qquad t = 1, 2, \ldots, n \qquad (10.1)$$

-This equation models a contemporaneous relationship between y and z.
-Here a change in z has an IMMEDIATE effect on y.
-For example, if eating chocolate each day made one (un)happy:

$$U_t = \beta_0 + \beta_1 \textit{chocolate}_t + u_t$$
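A static model like this is estimated by ordinary OLS on the levels of the two series. Here is a minimal sketch in Python; the file happiness.csv and its column names are hypothetical placeholders, not from the slides:

```python
# Sketch: OLS on the static model U_t = beta_0 + beta_1*chocolate_t + u_t.
# "happiness.csv" and its columns are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("happiness.csv")        # one row per time period t
X = sm.add_constant(df["chocolate"])     # add_constant supplies the intercept
static = sm.OLS(df["happiness"], X).fit()
print(static.params)                     # estimates of beta_0 and beta_1
```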

Page 5: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-If one or more variables affect our y variable in time periods after the current period, we have a FINITE DISTRIBUTED LAG (FDL) MODEL:

$$y_t = \alpha_0 + \delta_0 z_t + \delta_1 z_{t-1} + \delta_2 z_{t-2} + \cdots + u_t$$

-In this case the variable z has an impact on y now and in as many future time periods as are included in the model.
-For example, if chocolate consumption affected (un)happiness today AND tomorrow:

$$U_t = \alpha_0 + \delta_0 \textit{chocolate}_t + \delta_1 \textit{chocolate}_{t-1} + u_t$$
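A sketch of fitting this one-lag FDL, reusing the hypothetical happiness.csv from the previous example; the lagged regressor is built with pandas' shift:

```python
# Sketch: FDL of order one,
# U_t = alpha_0 + delta_0*chocolate_t + delta_1*chocolate_{t-1} + u_t.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("happiness.csv")                # hypothetical data
df["chocolate_lag1"] = df["chocolate"].shift(1)  # chocolate_{t-1}
df = df.dropna()                                 # the first row has no lag

X = sm.add_constant(df[["chocolate", "chocolate_lag1"]])
fdl = sm.OLS(df["happiness"], X).fit()
print(fdl.params)   # alpha_0, delta_0 (today's effect), delta_1 (tomorrow's)
```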

Page 6: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-If our model includes lags of z up to two periods back, it is an FDL of order two:

$$y_t = \alpha_0 + \delta_0 z_t + \delta_1 z_{t-1} + \delta_2 z_{t-2} + u_t$$

-To interpret our delta coefficients, assume a one-time, one-unit increase in z today:

$$\ldots,\quad z_{t-1} = c,\quad z_t = c + 1,\quad z_{t+1} = c,\quad z_{t+2} = c,\quad \ldots$$

-and z equals c in all preceding and succeeding time periods.

Page 7: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-Assuming zero error, we have a situation of:

$$y_{t-1} = \alpha_0 + \delta_0 c + \delta_1 c + \delta_2 c$$
$$y_t = \alpha_0 + \delta_0 (c + 1) + \delta_1 c + \delta_2 c$$
$$y_{t+1} = \alpha_0 + \delta_0 c + \delta_1 (c + 1) + \delta_2 c$$
$$y_{t+2} = \alpha_0 + \delta_0 c + \delta_1 c + \delta_2 (c + 1)$$
$$y_{t+3} = \alpha_0 + \delta_0 c + \delta_1 c + \delta_2 c$$

-where this one-time increase affects 3 time periods.

Page 8: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-We can then calculate that:

$$y_t - y_{t-1} = [\alpha_0 + \delta_0 (c + 1) + \delta_1 c + \delta_2 c] - [\alpha_0 + \delta_0 c + \delta_1 c + \delta_2 c] = \delta_0$$

-Therefore $\delta_0$ is the immediate change in y due to a one-unit change in z.
-$\delta_0$ is often called the IMPACT PROPENSITY or IMPACT MULTIPLIER.
-Likewise, $\delta_1$ is the change in y one period after z’s change, and $\delta_2$ is the change in y two periods after z’s change.
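The arithmetic above can be checked numerically. The following toy sketch traces y through the one-time shock with zero error; all parameter values are made up for illustration:

```python
# Toy check of the impact propensity for an FDL of order two:
# y_s = alpha_0 + d0*z_s + d1*z_{s-1} + d2*z_{s-2}, with zero error.
alpha0, d0, d1, d2 = 1.0, 0.5, 0.3, 0.1   # hypothetical coefficients
c = 2.0                                   # baseline level of z

z = [c, c, c + 1, c, c, c]                # one-unit bump at index 2 (time t)
y = [alpha0 + d0 * z[s] + d1 * z[s - 1] + d2 * z[s - 2]
     for s in range(2, len(z))]

base = alpha0 + (d0 + d1 + d2) * c        # pre-shock level of y
print([round(v - base, 10) for v in y])   # [0.5, 0.3, 0.1, 0.0]: d0, d1, d2, back to 0
```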

Page 9: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-We can also analyze the effect on y due to a PERMANENT one-unit increase in z:

$$y_{t-1} = \alpha_0 + \delta_0 c + \delta_1 c + \delta_2 c$$
$$y_t = \alpha_0 + \delta_0 (c + 1) + \delta_1 c + \delta_2 c$$
$$y_{t+1} = \alpha_0 + \delta_0 (c + 1) + \delta_1 (c + 1) + \delta_2 c$$
$$y_{t+2} = \alpha_0 + \delta_0 (c + 1) + \delta_1 (c + 1) + \delta_2 (c + 1)$$

-Immediately, y increases by $\delta_0$.

-After 1 period, y has increased by $\delta_0 + \delta_1$.

-After 2 periods, y has increased by $\delta_0 + \delta_1 + \delta_2$ …

Page 10: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-After 3 periods, y has increased by $\delta_0 + \delta_1 + \delta_2 + \delta_3$ (for a model that includes a third lag), and so on.

-This long-run change in y given a permanent increase in z is called the LONG-RUN PROPENSITY (LRP) or LONG-RUN MULTIPLIER.
-A finite distributed lag model of order q and the corresponding LRP would be:

$$y_t = \alpha_0 + \delta_0 z_t + \delta_1 z_{t-1} + \cdots + \delta_q z_{t-q} + u_t$$

$$LRP = \delta_0 + \delta_1 + \cdots + \delta_q$$

Page 11: 10. Basic Regressions with Time Series Data

10.2 Time Series Regressions

-Note that in a distributed lag regression, z and its lags are often highly correlated, which causes high multicollinearity.
-Therefore it is often not possible to obtain precise estimates of each individual delta; we can, however, often obtain a good estimate of the LRP.

-Note that different sources use either t=0 or t=1 as the base year; our text considers t=1 the base year.
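One common way to estimate the LRP directly, together with its own standard error, is to reparameterize the FDL so that the LRP appears as a single coefficient. This technique goes beyond what the slides show; the sketch below assumes a hypothetical data.csv with columns y and z:

```python
# Sketch: direct LRP estimation for an order-two FDL.
# Substituting delta_0 = theta - delta_1 - delta_2 into the FDL gives
#   y_t = alpha_0 + theta*z_t + delta_1*(z_{t-1} - z_t)
#                 + delta_2*(z_{t-2} - z_t) + u_t,
# so theta, the coefficient on z_t, equals the LRP.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("data.csv")              # hypothetical file with columns y, z
df["z_lag1"] = df["z"].shift(1)
df["z_lag2"] = df["z"].shift(2)
df = df.dropna()

df["d1_term"] = df["z_lag1"] - df["z"]
df["d2_term"] = df["z_lag2"] - df["z"]
X = sm.add_constant(df[["z", "d1_term", "d2_term"]])
res = sm.OLS(df["y"], X).fit()
print(res.params["z"], res.bse["z"])      # LRP estimate and its standard error
```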

Page 12: 10. Basic Regressions with Time Series Data

10.3 Finite Sample Properties of OLS under Classical Assumptions

-In this section we will see how the 6 classical linear model (CLM) assumptions are modified to their time series form in order to derive the finite (small) sample properties of OLS in time series regressions.

-Note that $x_{tj}$ refers to the t-th time period, where j labels the x variable.
-$X_t$ will refer to all x observations at time t.
-$X$ will refer to a matrix including all x observations over all times t.

Page 13: 10. Basic Regressions with Time Series Data

Assumption TS.1 (Linear in Parameters)

The stochastic process $\{(x_{t1}, x_{t2}, \ldots, x_{tk}, y_t): t = 1, 2, \ldots, n\}$ follows the linear model

$$y_t = \beta_0 + \beta_1 x_{t1} + \beta_2 x_{t2} + \cdots + \beta_k x_{tk} + u_t \qquad (10.8)$$

where $\{u_t: t = 1, 2, \ldots, n\}$ is the sequence of error disturbances. Here, n is the number of observations (time periods).

(Note: TS stands for time series)

Page 14: 10. Basic Regressions with Time Series Data

Assumption TS.2 (No Perfect Collinearity)

In the sample (and therefore in the underlying time series process), no independent variable is constant or a perfect linear combination of the others.

Page 15: 10. Basic Regressions with Time Series Data

10.3 Assumption Notes

-Our first two assumptions are almost identical to their cross-sectional counterparts.
-Note that TS.2 allows for correlation between variables; it only disallows PERFECT correlation.
-The final assumption for time series OLS unbiasedness replaces MLR.4 and obviates the need for a random sampling assumption:

Page 16: 10. Basic Regressions with Time Series Data

Assumption TS.3 (Zero Conditional Mean)

For each t, the expected value of the error $u_t$, given the explanatory variables for all time periods, is zero. Mathematically,

$$E(u_t \mid X) = 0, \qquad t = 1, 2, \ldots, n \qquad (10.9)$$

Page 17: 10. Basic Regressions with Time Series Data

10.3 Assumption TS.3 Notes

-TS.3 assumes that our error term (unaccounted-for variables) is uncorrelated with our included variables IN EVERY TIME PERIOD.
-This requires us to correctly specify the functional form (static or lag) between y and z.
-If $u_t$ is independent of X and $E(u_t) = 0$, TS.3 automatically holds.
-Such a strong assumption was not needed for cross-sectional data because each observation was random; in time series, each observation is sequential.

Page 18: 10. Basic Regressions with Time Series Data

10.3 Assumption TS.3 Notes

-If $u_t$ is uncorrelated with all independent variables of time t:

$$E(u_t \mid X_t) = 0$$

-we say that the $x_{tj}$ are CONTEMPORANEOUSLY EXOGENOUS.
-Therefore $u_t$ and $X_t$ are contemporaneously uncorrelated: $\mathrm{Corr}(x_{tj}, u_t) = 0$ for all j.
-TS.3 requires more than contemporaneous exogeneity, however: it requires STRICT EXOGENEITY across time periods.

Page 19: 10. Basic Regressions with Time Series Data

10.3 Assumption TS.3 Notes

-Note that TS.3 puts no restrictions on correlation between independent variables across time.
-Note that TS.3 puts no restrictions on correlation between error terms across time.

TS.3 can fail due to:
1) Omitted variables
2) Measurement error
3) Misspecified model
4) Other

Page 20: 10. Basic Regressions with Time Series Data

10.3 TS.3 Failure

-If a variable z has a LAGGED effect on y, its lag must be included in the model or TS.3 is violated.
-Never use a static model if a lag model is more appropriate.
-e.g.: overeating (z) last month (say, at Christmas) causes more exercise this month (y).

-TS.3 also fails if $u_t$ affects future z (since only current and past z are controlled for); a toy simulation of this feedback appears below.
-e.g.: cold weather last month (u) will cause depression and thus undereating next month (z).
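Here is that second failure mode as a toy simulation, not from the slides: an error that feeds into next period's z produces a clear correlation between z and the previous period's u, so strict exogeneity fails even though contemporaneous exogeneity holds. The coefficient 0.8 and sample size are made up:

```python
# Toy simulation of feedback from u_t into z_{t+1} (strict exogeneity fails).
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
u = rng.normal(size=n)                    # the error process
z = np.empty(n)
z[0] = rng.normal()
for t in range(1, n):
    z[t] = 0.8 * u[t - 1] + rng.normal()  # cold weather (u) drives next month's z

print(np.corrcoef(z[1:], u[:-1])[0, 1])   # far from 0: strict exogeneity fails
print(np.corrcoef(z, u)[0, 1])            # near 0: contemporaneous exogeneity holds
```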

Page 21: 10. Basic Regressions with Time Series Data

Theorem 10.1 (Unbiasedness of OLS)

Under assumptions TS.1 through TS.3, the OLS estimators are unbiased conditional on X, and therefore unconditionally as well:

$$E(\hat{\beta}_j) = \beta_j, \qquad j = 0, 1, \ldots, k$$

Note: The proof is very similar to the cross-sectional case.
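As a complement to the theorem, here is a small Monte Carlo sketch of what unbiasedness means in practice: averaging the OLS slope over many simulated samples, with errors satisfying TS.3, recovers the true coefficient. All parameter values are made up, and a single regressor is used for simplicity:

```python
# Monte Carlo sketch of Theorem 10.1: E(beta_1_hat) = beta_1 under TS.1-TS.3.
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, n, reps = 1.0, 2.0, 50, 5_000

x = rng.normal(size=n)                  # regressors held fixed across replications
estimates = []
for _ in range(reps):
    u = rng.normal(size=n)              # fresh errors each sample, E(u|X) = 0
    y = beta0 + beta1 * x + u
    b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)   # OLS slope formula
    estimates.append(b1)

print(np.mean(estimates))               # close to beta1 = 2.0
```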