Big data for economic analysis


Transcript of Big data for economic analysis

Page 1: Big data for economic analysis

Big data for economic analysis

Lucrezia Reichlin

London Business School and CEPR

London Business School, March 7, 2016

Page 2: Big data for economic analysis

A tsunami of data

More data, from standard and non-standard sources, easily available, easily collected and stored

Page 3: Big data for economic analysis

Quantifying the data deluge: the petabyte era

• bytes: 1 byte ∼ 1 letter (ASCII symbol)

• kilobytes [1 kB = 1,000 bytes]: 1 page, 1 article in PDF (50-500 kB), 1 small image

• megabytes [1 MB = 10^6 bytes]: 1 book

• gigabytes [1 GB = 10^9 bytes]: 1 audio CD (700 MB), 1 DVD (5 GB), a private library

• terabytes [1 TB = 10^12 bytes]: a public library; the LOC (20 TB), the digital content of the U.S. Library of Congress

• petabytes [1 PB = 10^15 bytes]: the amount of data handled by Google's servers in one hour (1 PB)

Page 4: Big data for economic analysis

Growth

• 90% of all recorded data has been collected during the last two years!

• Most data are now digital (1% in 1986, 25% in 2000, 94% in 2007)

• In 2007, ∼300 exabytes of data were stored (61 CD-ROMs per person, i.e. a stack that would reach beyond the moon!)

• and ∼2 zettabytes of data were exchanged! [1 zettabyte = 10^21 bytes]

(M. Hilbert, P. López, Science 2011)

Page 5: Big data for economic analysis

Mathematics Awareness Month April 2012

Need intelligent design to exploit them!

Page 6: Big data for economic analysis

Finding a needle in a haystack

Can we extract a meaningful signal?

Page 7: Big data for economic analysis

Where is Wally? (Martin Handford)

Model for the automatic detection of people in complex scenes (computer vision)

Page 8: Big data for economic analysis

Big challenges ...

for

• mathematicians
• statisticians
• computer scientists, engineers, etc.

in order to develop automatic procedures for extracting useful information from huge amounts of data.

→ rapid development of (new) research fields:

• Computer vision
• Data Mining
• Statistical Learning (“Machine Learning”)
• Bioinformatics, etc.

Page 9: Big data for economic analysis

... in all scientific areas

• Physics
• Astronomy
• Geophysics
• Biology (chiefly genomics, proteomics)
• ... and also economics, finance and the social sciences

Page 10: Big data for economic analysis

What about economics?

• New statistical methods for estimating models with large data sets

• But also the use of new data: social media, Google queries, geo-locational data, ...

Many recent and less recent examples: tick-by-tick data, portfolio optimization, forecasting and stress tests, health data, ...

Most of these applications involve large data rather than big data, but they exploit new statistical models to deal with the curse-of-dimensionality problem: dimension reduction, sparsity, compression, new approaches to algorithms, ...

Page 11: Big data for economic analysis

Is this a revolution for the field?

My answer in a nutshell: not clear yet!

• Possibly a change in methodological philosophy: more emphasis on prediction rather than on causal relations

• Emphasis on real-time analysis
• Democratization of statistics

⇒ Not the end of theory, but a useful breath of fresh air

Page 12: Big data for economic analysis

Discussion on selected issues

1. The curse of dimensionality and machine learning: what is this all about?

2. What works with macro data? The curse and blessing of collinearity

3. Google data and real-time now-casting: not very useful

4. Prediction and causality

Page 13: Big data for economic analysis

The curse of dimensionality

• In large models there is a proliferation of parameters that is likely to lead to high estimation uncertainty

• As we increase complexity, the number of parameters to estimate increases and so does the variance (estimation uncertainty)

• Predictions based on traditional methods are poor or infeasible if the number of predictors n is large relative to the sample size T

Why?
⇒ The sample variance is inversely proportional to the degrees of freedom (sample size minus number of parameters)
⇒ When the number of parameters becomes large, the degrees of freedom go to zero or become negative and the precision of the estimates deteriorates

This is the curse of dimensionality

Page 14: Big data for economic analysis

Solutions which have been used in econometrics

• Factor analysis:
limit the complexity due to the proliferation of parameters by focusing on a few sources of variation (common factors). Reasonable if the data are characterized by strong collinearity (e.g. business cycles); many applications in macro, with theory and empirics for large dynamic models

• Principal components:
extract the first PCs; if a few factors drive the dynamics of the data, this works well

• Penalized regression:
limit estimation uncertainty via shrinkage [machine learning]

These methods are related: they either aggregate variables, select them, or do both (a minimal principal-components sketch follows).
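As a rough illustration of the aggregation idea, here is a minimal Python sketch (not from the presentation): it forecasts a target series using the first principal components of a large, collinear panel. The simulated data and the choice of two factors are assumptions made only for the example.

```python
# Minimal sketch (not the author's code): forecast a target with the first
# principal components of a large predictor panel.
import numpy as np

def pc_forecast(X, y, h=1, r=2):
    """Forecast y_{T+h} from the first r principal components of X (T x n)."""
    T, n = X.shape
    Z = (X - X.mean(0)) / X.std(0)                 # standardise the panel
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    F = U[:, :r] * s[:r]                           # estimated common factors
    # regress y_{t+h} on a constant and F_t over t = 1, ..., T-h
    W = np.column_stack([np.ones(T - h), F[:T - h]])
    beta = np.linalg.lstsq(W, y[h:], rcond=None)[0]
    return np.concatenate(([1.0], F[-1])) @ beta   # forecast made at time T

# Example with simulated collinear data: two factors drive 100 series.
rng = np.random.default_rng(0)
T, n = 200, 100
factors = rng.standard_normal((T, 2))
X = factors @ rng.standard_normal((2, n)) + 0.5 * rng.standard_normal((T, n))
y = factors @ np.array([1.0, -0.5]) + 0.3 * rng.standard_normal(T)
print(pc_forecast(X, y, h=1, r=2))
```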

Page 15: Big data for economic analysis

Problems with traditional approach

A simple illustration:

Forecast $y_t$ using a large information set:

$$y_{T+h|T} = \beta' X_T$$

Estimate $\beta$ via OLS, i.e. maximize the in-sample fit of the model:

$$\hat{\beta} = \arg\min_{\beta} \sum_{t=1}^{T-h} \left(y_{t+h} - \beta' X_t\right)^2 \;\Longrightarrow\; \hat{\beta} = (X'X)^{-1}X'y \;\Longrightarrow\; \hat{y}^{\,OLS}_{T+h|T} = \hat{\beta}' X_T$$

Problem! If the size of the information set (n) is too large relative to the sample size (T), then OLS forecasts are poor or infeasible: the curse of dimensionality.
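A small simulation can illustrate the point. The sketch below (an assumed setup, not from the slides) shows the out-of-sample squared error of the OLS forecast deteriorating as the number of predictors n approaches the sample size T, even though every predictor is genuinely informative.

```python
# Illustrative sketch: out-of-sample error of the OLS forecast beta_hat' X_T
# as the number of predictors n grows towards the sample size T.
import numpy as np

rng = np.random.default_rng(1)
T = 80
for n in [5, 20, 40, 70]:
    errors = []
    for _ in range(200):
        X = rng.standard_normal((T + 1, n))
        beta_true = rng.standard_normal(n) / np.sqrt(n)
        y = X @ beta_true + rng.standard_normal(T + 1)
        # estimate beta on the first T observations, forecast the last one
        beta_hat = np.linalg.lstsq(X[:T], y[:T], rcond=None)[0]
        errors.append((y[T] - X[T] @ beta_hat) ** 2)
    print(f"n = {n:3d}  mean squared forecast error = {np.mean(errors):.2f}")
```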

Page 16: Big data for economic analysis

A cure for the illness: Penalized regression

To stabilize the solution (estimator), use extra constraints on the solution or, alternatively, add a penalty term to the least-squares loss:

$$\min\left[\,\mathrm{RSS}(\text{model}) + \nu\,(\text{model complexity})\,\right]$$

Example: ridge, penalized regression ($L_2$ norm)

$$\hat{\beta}_{\text{ridge}} = \arg\min_{\beta} \sum_{t=1}^{T-h} \left(y_{t+h} - \beta' X_t\right)^2 + \nu \sum_{i=1}^{n(p+1)} \beta_i^2$$

$$\hat{\beta}_{\text{ridge}} = (X'X + \nu I)^{-1} X'y$$

• Ridge is a form of linear ‘shrinkage’: the components of the OLS estimate of β are shrunk uniformly towards zero
• It is a kind of “regularization” which provides the necessary dimension reduction and increases the bias in order to decrease the variance (see the sketch below)
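For concreteness, a minimal sketch of the closed-form ridge estimator above; the data are simulated and the penalty values ν are arbitrary. Note that the estimator remains well defined even when n exceeds T, where OLS breaks down.

```python
# Minimal sketch of the ridge estimator from the slide:
# beta_ridge = (X'X + nu*I)^{-1} X'y
import numpy as np

def ridge(X, y, nu):
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + nu * np.eye(n), X.T @ y)

# Shrinkage in action: a larger nu pulls the coefficients towards zero.
rng = np.random.default_rng(2)
X = rng.standard_normal((50, 120))              # n > T: OLS is infeasible here
y = X[:, 0] - X[:, 1] + rng.standard_normal(50)
for nu in [0.1, 1.0, 10.0]:
    b = ridge(X, y, nu)
    print(f"nu = {nu:5.1f}  ||beta|| = {np.linalg.norm(b):.3f}")
```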

Page 17: Big data for economic analysis

Bayesian language

• Penalized regression can be reinterpreted as a Bayesian regression

limit the length of β and estimate the coefficients as the posterior mode to compute the forecast

or ... shrink regression coefficients to zero via priors

DATA (complex/rich) + PRIOR (naive/parsimonious)

In the case of the example: an i.i.d. prior on β: $\Phi_0 = \sigma^2_\beta I$
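To make the correspondence explicit, a standard derivation (not spelled out on the slide), assuming Gaussian errors with variance σ²: the posterior mode under the i.i.d. normal prior is exactly the ridge estimator with ν = σ²/σ²_β.

```latex
% Ridge as a Bayesian posterior mode, assuming
% y_{t+h} | X_t, \beta ~ N(\beta' X_t, \sigma^2) and the prior \beta ~ N(0, \sigma_\beta^2 I):
\hat{\beta}_{\text{post}}
  = \arg\max_{\beta}\, p(\beta \mid y, X)
  = \left( X'X + \frac{\sigma^2}{\sigma^2_\beta}\, I \right)^{-1} X'y
  = \hat{\beta}_{\text{ridge}},
  \qquad \nu = \frac{\sigma^2}{\sigma^2_\beta}.
```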

Page 18: Big data for economic analysis

Several ways of doing it

Two extreme choices:

• Normal prior: gives weight to all regressors (e.g. ridge); like PCs, it gives more weight to the large sources of variation

• Double-exponential prior: allows for variable selection by enforcing sparsity, i.e. the presence of zeroes in the vector β of regression coefficients (also known as ‘Lasso regression’); see the sketch below
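A minimal sketch of the contrast (assuming scikit-learn is available; the simulated data and penalty levels are arbitrary): ridge keeps every coefficient non-zero, while the Lasso sets most of them exactly to zero.

```python
# Minimal sketch contrasting the two priors: ridge (normal prior) keeps all
# coefficients non-zero, lasso (double-exponential prior) zeroes most of them.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 50))
y = 2 * X[:, 0] - X[:, 1] + rng.standard_normal(100)   # only 2 relevant predictors

ridge_coef = Ridge(alpha=1.0).fit(X, y).coef_
lasso_coef = Lasso(alpha=0.1).fit(X, y).coef_
print("non-zero ridge coefficients:", np.sum(np.abs(ridge_coef) > 1e-8))
print("non-zero lasso coefficients:", np.sum(np.abs(lasso_coef) > 1e-8))
```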

Page 19: Big data for economic analysis

Sparsity?

“Entia non sunt multiplicanda sine necessitate” (entities should not be multiplied without necessity)

William of Ockham (∼ 1288 - 1348)

Page 20: Big data for economic analysis

Macro problems: does the way we do it matter?

• Macro and financial data are highly collinear: a few factors (shocks) explain the bulk of the dynamics.

• As a consequence, variable selection and aggregation methods deliver the same results [empirics and theory]

⋆ Collinearity is a curse: variable selection gives unstable results

⋆ But it is also a blessing: all methods capture the large sources of variation

Page 21: Big data for economic analysis

A classic result: two shocks drive the business cycle

Correlation in macro data: an old insight from the 1970s

(from about 10 series to about 100)

Page 22: Big data for economic analysis

Forecasting industrial production: PCs, ridge and Lasso (200 variables)

Page 23: Big data for economic analysis

The end of Theory?

Page 24: Big data for economic analysis

The macro approach: a compromise

• Extract shocks from large data and their lags: combinations of prediction errors

• Identify structural shocks using minimal theory

• Do not identify all coefficients, but compute impulse response functions to structural shocks

• Many useful applications: the effect of unexpected policy changes, stress tests and conditional forecasting for economic analysis

Page 25: Big data for economic analysis

Large Bayesian VAR

Bayesian regression in a dynamic system of simultaneous equations has been applied in macro to small models since the 1980s

• By shrinking in relation to the sample size, one can estimate models with hundreds of variables

• This avoids over-fitting

• Many useful applications developed for small models can now be used for large models

Blessing of dimensionality: in large models, if correlations are stable, density forecasts work well [narrow bands]. True for both unconditional and conditional forecasts (an illustrative per-equation shrinkage sketch follows).
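As a rough illustration only (this is not the actual large-BVAR estimator, which uses Minnesota-type priors), the sketch below estimates every equation of a VAR(p) with ridge-type shrinkage whose tightness grows with the size of the model, so that hundreds of coefficients can be estimated without over-fitting.

```python
# Illustrative sketch: per-equation ridge-type shrinkage in a VAR(p), with the
# prior tightness nu growing with the number of variables (not the exact
# large-BVAR methodology, just the shrinkage idea).
import numpy as np

def shrunk_var(Y, p=2, nu=None):
    """Y: T x n data matrix. Returns the (1 + n*p) x n coefficient matrix."""
    T, n = Y.shape
    nu = nu if nu is not None else 0.5 * n * p      # tighter prior in larger models
    # build lagged regressors Z_t = [1, Y_{t-1}, ..., Y_{t-p}] for t = p, ..., T-1
    Z = np.column_stack([np.ones(T - p)] +
                        [Y[p - k - 1:T - k - 1] for k in range(p)])
    target = Y[p:]
    K = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + nu * np.eye(K), Z.T @ target)

rng = np.random.default_rng(4)
Y = 0.1 * rng.standard_normal((200, 20)).cumsum(0) + rng.standard_normal((200, 20))
B = shrunk_var(Y, p=2)
print(B.shape)   # (1 + 20*2, 20) coefficients estimated despite many parameters
```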

Page 26: Big data for economic analysis

Large Bayesian VAR: the effect of the monetary policy shock

Page 27: Big data for economic analysis

Large Bayesian VAR: stress tests

• Macro risk is characterized by correlations: the banking system is overly exposed to risk in the upturn and too risk-averse in the downturn; we need to look at aggregate risk

• Combine macro variables and balance-sheet variables
• Construct stress scenarios by conditioning on specific assumptions

Page 28: Big data for economic analysis

Other applications in macro: now-casting

Real time monitoring of the rich data flow

Basic idea of now-casting:
• follow the calendar of data publication
• update the now-cast almost in continuous time
• corresponding to each release there is a model-based surprise that moves the now-cast of all variables and the synthetic signal on the state of the economy

THIS IS WHAT THE MARKET INFORMALLY DOES!
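A deliberately stripped-down sketch of that mechanism (the real platform uses a dynamic factor model and the Kalman filter; the releases, expectations and weights below are hypothetical): each release contributes "news", the surprise relative to the model's expectation, which moves the now-cast.

```python
# Highly simplified sketch of the news-based now-cast update (illustrative only).
def update_nowcast(nowcast, releases):
    """releases: list of (name, actual, expected, weight) in calendar order."""
    for name, actual, expected, weight in releases:
        news = actual - expected            # the model-based surprise
        nowcast += weight * news            # each release moves the now-cast
        print(f"{name:>18}: news = {news:+.2f}, now-cast -> {nowcast:.2f}")
    return nowcast

# Hypothetical calendar of releases for one quarter (numbers are made up).
calendar = [
    ("PMI survey",        52.0, 51.0, 0.10),
    ("industrial prod.",  -0.3,  0.2, 0.40),
    ("retail sales",       0.5,  0.4, 0.15),
]
update_nowcast(nowcast=0.4, releases=calendar)
```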

Page 29: Big data for economic analysis

Following the calendar

Page 30: Big data for economic analysis

Following the calendar

Page 31: Big data for economic analysis

[Diagram: the Now-Casting platform. Labels: real-time data vintages, data release, model run, model forecasts, news, factor update, GDP now-cast, other series now-casts, NCI™.]

Page 32: Big data for economic analysis

What have we learned in years of experience running an automatic procedure with no judgement?

• Timeliness matters

• Many data are relevant for obtaining early signals on economic activity and are increasingly also used by statistical agencies; in particular, surveys and weekly conjunctural data

• Robust models are relatively simple

• An automatic, mechanical model does as well as judgement, but it is as timely as you want and does not get influenced by moods

Page 33: Big data for economic analysis

Data help

Do timely data help? Evolution of the MSFE in relation to the data flow

[Figure: evolution of the MSFE over the data flow (horizons F, N, B) for the Euro Area, Germany, Italy and France; models compared: ar, dfm, BB, FF, EC]

Page 34: Big data for economic analysis

Data help

Do timely data help? Evolution of the MSFE in relation to the data flow

[Figure: evolution of the MSFE over the data flow (horizons F, N, B) for the UK, US, Japan and Brazil; models compared: ar, dfm, BB, FF and institutional benchmarks (NIESR, EC, SPF, JCER, BCB)]

Page 35: Big data for economic analysis

Data are available but often unexploited: the US government shutdown

US example: the model can keep running even during a government shutdown (Jim Stock presentation)

Page 36: Big data for economic analysis

Do non-standard timely data help? Unemployment and Google “jobs” queries are correlated

Smoothed monthly data constructed from weekly data: unemployment rate (FRED) and “jobs” queries (Google)

[Figure: Unemployment rate (FRED) and Jobs (Google), MA(3)-smoothed monthly series, 2004-2015]
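A sketch of how such a comparison can be produced, assuming pandas is available and the two series have been saved to hypothetical CSV files with 'date' and 'value' columns; the MA(3) smoothing mirrors the figure.

```python
# Sketch (hypothetical input files): smooth the Google "jobs" index and the
# unemployment rate with a short moving average and measure their correlation.
import pandas as pd

unemp = pd.read_csv("unemployment_fred.csv", parse_dates=["date"], index_col="date")
jobs = pd.read_csv("google_jobs_index.csv", parse_dates=["date"], index_col="date")

monthly = pd.DataFrame({
    "unemployment": unemp["value"].resample("MS").mean(),
    "jobs_queries": jobs["value"].resample("MS").mean(),
}).dropna()

smoothed = monthly.rolling(3).mean().dropna()      # MA(3), as in the figure
print(smoothed.corr())
```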

Page 37: Big data for economic analysis

but also correlated with initial claims, which are available from standard sources

Smoothed weekly data: initial claims (FRED) and “jobs” queries (Google)

[Figure: Weekly Initial Claims (FRED) and Jobs (Google), MA(4)-smoothed series, 2004-2015]

Page 38: Big data for economic analysis

and sampling errors and changes in the algorithm lead to instability and lack of robustness

[Figure: Google “jobs” index at different vintages, Jan 2004 to Jan 2016; the Choi & Varian (2012) vintage was available on 28 May 2015]

Page 39: Big data for economic analysis

Some examples beyond macro

• Health economics: predicting whether replacement surgery will be beneficial or not for a given patient with osteoarthritis, based on more than 3,000 variables recorded for about 100,000 patients.

• Economics and law: machine-learning algorithms can be more efficient than a judge in deciding who should be released and who should stay in jail while awaiting trial, because of the danger of committing a crime in the meantime

• Microeconometrics: controlling for many covariates in order to better identify treatment effects, many instruments, ...

• All fields: combining models for robustness (empirical growth literature: “I run 1 million regressions!”)

Page 40: Big data for economic analysis

Sherlock Holmes

“Data! Data! Data!” he cried impatiently. “I can’t make bricks without clay.”
(Arthur Conan Doyle, The Adventure of the Copper Beeches, 1892)

“It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”
(Arthur Conan Doyle, A Scandal in Bohemia, 1892)