Treatment Effects - Stanford University


Treatment Effects

An average treatment effect (ATE) is a special case of an average partial effect: specifically, it is the average partial effect for a binary explanatory variable.

The approach was pioneered by Rubin (1974), who introduced the concept in a counterfactual framework.

Generally, most estimators of the ATE fall into one of two categories: (strong) ignorability or instrumental variables (IV).

Treatment effects begin with a counterfactual, where each agent has an outcome with and without treatment.

To model this, let a random draw from the population be denoted by the triple (yi1, yi0, di), where yi1 denotes the outcome with treatment, yi0 the outcome without treatment, and di the treatment indicator.

The problem is that we only observe

yi = di yi1 + (1 − di) yi0

We are interested in different measures of the effect of treatment:

1. ATE = E[yi1 − yi0]

2. ATOT = E[yi1 − yi0 | di = 1]

3. CATE = E[yi1 − yi0 | xi]

4. CATOT = E[yi1 − yi0 | di = 1, xi]

How do we estimate these parameters? That depends on the assumptions of the model:

1. Randomized treatment (rare in the social sciences, where there is usually self-selection).

Here we assume di ⊥ (yi1, yi0).

Note that in this case ATE = ATOT, because E[yi | di = 1] = E[yi1 | di = 1] = E[yi1], and E[yi | di = 0] = E[yi0]. A simple difference in means between treated and untreated units therefore estimates both, as in the sketch below.
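A minimal simulation sketch (not from the original notes) of the difference-in-means estimator under randomized treatment; the data-generating process and variable names are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Potential outcomes (hypothetical DGP): true ATE = 2.0
y0 = rng.normal(loc=1.0, scale=1.0, size=n)
y1 = y0 + 2.0 + rng.normal(scale=0.5, size=n)

# Randomized treatment: d is independent of (y1, y0)
d = rng.binomial(1, 0.5, size=n)

# Observed outcome: y = d*y1 + (1 - d)*y0
y = d * y1 + (1 - d) * y0

# Under randomization, ATE = ATOT = E[y | d = 1] - E[y | d = 0]
ate_hat = y[d == 1].mean() - y[d == 0].mean()
print(f"difference-in-means estimate of ATE: {ate_hat:.3f}")  # close to 2.0
```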

General relationship between ATE and ATOT:

yi0 = µ0 + εi0    (1)
yi1 = µ1 + εi1    (2)

Difference the two equations and condition on di = 1 to get

ATOT = ATE + E[εi1 − εi0 | di = 1]    (3)

2. (Strong) Ignorability (selection on observables)

Here we assume di ⊥ (yi1, yi0) conditional on regressors xi. In this case CATE = CATOT:

E[yi | di = 1, xi] = E[yi1 | xi]
E[yi | di = 0, xi] = E[yi0 | xi]

Matching Methods Based on Propensity Scores

We define the propensity score as

p(xi) = P(di = 1 | xi)

One can show that under the ignorability condition,

ATE = E[(di − p(xi)) yi / (p(xi)(1 − p(xi)))]    (4)

To see why, note that the numerator can be expanded as

di yi1 (1 − p(xi)) − p(xi)(1 − di) yi0

Now condition on (xi, di) and take expectations, writing m1(xi) = E[yi1 | xi] and m0(xi) = E[yi0 | xi]:

di m1(xi) − p(xi) di m1(xi) − p(xi)(1 − di) m0(xi)

Now take the expectation conditional on xi alone:

p(xi)(1 − p(xi)) (m1(xi) − m0(xi))

Dividing by p(xi)(1 − p(xi)) and averaging over xi then recovers the ATE; a sample-analogue sketch follows below.
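A minimal sample-analogue sketch of the weighting identity in (4), assuming the propensity score is estimated with a logistic regression; the data-generating process, column names, and use of scikit-learn are assumptions made for the illustration, not part of the notes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical DGP with selection on an observable x (ignorability holds given x)
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-x))          # true propensity score
d = rng.binomial(1, p_true)
y0 = x + rng.normal(size=n)
y1 = y0 + 1.5                          # true ATE = 1.5
y = d * y1 + (1 - d) * y0

# Estimate p(x), then apply ATE = E[(d - p(x)) y / (p(x)(1 - p(x)))]
p_hat = LogisticRegression().fit(x.reshape(-1, 1), d).predict_proba(x.reshape(-1, 1))[:, 1]
ate_hat = np.mean((d - p_hat) * y / (p_hat * (1 - p_hat)))
print(f"IPW estimate of ATE: {ate_hat:.3f}")   # close to 1.5
```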

Instrumental Variables

Since what we have here is a dummy endogenous variable, why not just do IV? The usual condition for IV to work here is that εi1 = εi0, which is unrealistic.

We can instead interpret IV in the LATE framework, relaxing the (unrealistic) strong ignorability condition by introducing a binary instrument.

Here we assume the presence of a binary instrumental variable zi, from which we can define the potential treatment variables di0 and di1, which denote treatment status if the instrument is 0 or 1, respectively:

di = (1 − zi) di0 + zi di1

Therefore,

yi = yi0 + di0 (yi1 − yi0) + zi (di1 − di0)(yi1 − yi0)

Make the following assumption (independence):

zi ⊥ (yi0, yi1, di0, di1)

Then we can say

E[yi | zi = 1] − E[yi | zi = 0] = E[(di1 − di0)(yi1 − yi0)]

which is equal to

E[yi1 − yi0 | di1 − di0 = 1] P(di1 − di0 = 1) − E[yi1 − yi0 | di1 − di0 = −1] P(di1 − di0 = −1)

Now make the following additional assumption (monotonicity):

di1 ≥ di0

In that case

E[yi | zi = 1] − E[yi | zi = 0] = E[yi1 − yi0 | di1 − di0 = 1] P(di1 − di0 = 1)

We define the subgroup of the population satisfying

di1 − di0 = 1

as the compliers, and we define the LATE parameter as

E[yi1 − yi0 | di1 − di0 = 1]

i.e., the ATE for compliers.

Note that under the monotonicity assumption,

P(di1 − di0 = 1) = E[di1 − di0]    (5)
                 = E[di1] − E[di0]    (6)
                 = E[di | zi = 1] − E[di | zi = 0]    (7)
                 = P(di = 1 | zi = 1) − P(di = 1 | zi = 0)    (8)

Therefore, we have the following result:

LATE = (E[yi | zi = 1] − E[yi | zi = 0]) / (E[di | zi = 1] − E[di | zi = 0])

Note that the right-hand side corresponds to the IV (Wald) estimator for the regression of yi on di with zi as instrument; a sample-analogue sketch follows below.
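A minimal sketch of the sample analogue of this Wald ratio under an invented complier design; the variable names and data-generating process are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical DGP: compliers have treatment effect 2.0; always-takers never switch
z = rng.binomial(1, 0.5, size=n)
u = rng.uniform(size=n)
always = u < 0.2                       # d = 1 regardless of z
complier = (u >= 0.2) & (u < 0.7)      # d = z (monotonicity holds: di1 >= di0)
d = np.where(always, 1, np.where(complier, z, 0))

y0 = rng.normal(size=n)
y1 = y0 + np.where(complier, 2.0, 0.5)   # LATE (effect for compliers) = 2.0
y = d * y1 + (1 - d) * y0

# Wald / IV estimator: reduced form over first stage
late_hat = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())
print(f"Wald estimate of LATE: {late_hat:.3f}")   # close to 2.0
```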

On the Role of the Propensity Score in Efficient Semiparametric Estimation of Average Treatment Effects

Jinyong Hahn, Econometrica, March 1998, Vol. 66, No. 2, pp. 315-331

Presented in the Applied Microeconometrics Lunch Group by Alvin Murphy

September 26, 2006

Overview

This paper examines the role of the propensity score in estimating treatment effects.

It is primarily concerned with Average Treatment Effects (ATE) and Average Treatment Effects on the Treated (ATT).

Hahn develops semiparametric variance lower bounds and then provides estimators that achieve these bounds.

The contribution of the paper lies in the collection of central results regarding the role of the propensity score in achieving these bounds.

Introduction 1

As we have seen over the past few weeks, the core difficulty in estimating treatment effects is a missing counterfactual.

For each individual, we only observe one of the two potential outcomes.

Some notation ...

- Let Di denote a dummy variable, where Di = 1 if individual i received the treatment.
- Let Y0i and Y1i denote the potential outcomes when Di = 0 and Di = 1.
- Yi ≡ Di Y1i + (1 − Di) Y0i
- Let Xi denote other covariates.

We only observe (Y, D, X).

Introduction 2

The two primary parameters of interest are average treatment effects

β ≡ E[Y1i − Y0i]

and average treatment effects on the treated

γ ≡ E[Y1i − Y0i | Di = 1]

It is well known that estimating β or γ without controlling in some way for the selection problem will lead to biased estimates.

Given certain assumptions, conditioning on the propensity score eliminates this bias.

Introduction: Rosenbaum, Rubin, and the Propensity Score

Define the propensity score p(x) ≡ P[Di = 1 | Xi = x]

Assume:

- Xi is such that Di is ignorable given Xi, i.e. Di ⊥ (Y0i, Y1i) | Xi
- 0 < P[Di = 1 | Xi] < 1 for all Xi

Then Di ⊥ (Y0i, Y1i) | p(Xi) (Rosenbaum & Rubin 1983, 1984).

Introduction: Nonparametric Estimators of β

The implication of the ignorability assumption is therefore

E[Yji | p(Xi)] = E[Yi | Di = j, p(Xi)] for j = 0, 1

Hence,

β = E{E[Yi | Di = 1, p(Xi)] − E[Yi | Di = 0, p(Xi)]}
β = E{E[Yi | Di = 1, Xi] − E[Yi | Di = 0, Xi]}

This suggests an estimator of β may be constructed as a sample average of

E[Yi | Di = 1, p(Xi)] − E[Yi | Di = 0, p(Xi)]

or

E[Yi | Di = 1, Xi] − E[Yi | Di = 0, Xi]

Outline

Examine the efficient estimation of β and γ under the ignorability assumption.

Examine the role of the propensity score from an efficiency point of view.

Hahn calculates semiparametric efficiency bounds, and estimators that achieve these bounds are constructed.

He shows the propensity score is unnecessary for the estimation of β, but knowledge of the propensity score does decrease the asymptotic variance bound for γ.

Even in this case, projection on the propensity score is not necessary to achieve the lower bound.

In some cases, conditioning on the propensity score could even result in a loss of efficiency.

Efficiency Bounds 1

The dataset consists of (Di, Yi, Xi) for i = 1, ..., n.

Theorem 1: Assume (Y0i, Y1i) ⊥ Di | Xi. The asymptotic variance bounds for β and γ are respectively

E[ σ1²(Xi)/p(Xi) + σ0²(Xi)/(1 − p(Xi)) + (β(Xi) − β)² ]

and

E[ p(Xi)σ1²(Xi)/p² + p(Xi)²σ0²(Xi)/(p²(1 − p(Xi))) + (β(Xi) − γ)²p(Xi)/p² ]    (1)

where

- βj(Xi) = E[Yji | Xi] for j = 0, 1 and β(Xi) = β1(Xi) − β0(Xi)
- σj²(Xi) = var(Yji | Xi)
- p = E[p(Xi)]

Efficiency Bounds 2

Theorem 2: Assume (Y0i, Y1i) ⊥ Di | Xi. Furthermore, assume that the propensity score p(Xi) is known. The asymptotic variance bounds for β and γ are respectively

E[ σ1²(Xi)/p(Xi) + σ0²(Xi)/(1 − p(Xi)) + (β(Xi) − β)² ]

and

E[ p(Xi)σ1²(Xi)/p² + p(Xi)²σ0²(Xi)/(p²(1 − p(Xi))) + (β(Xi) − γ)²p(Xi)²/p² ]

Efficiency Bounds 3

Result 1: The propensity score does not play any role in the estimation of β: knowledge of the propensity score does not decrease the variance bound.

Result 2: Knowledge of the propensity score reduces the asymptotic variance bound for γ by

E[ (β(Xi) − γ)²p(Xi)(1 − p(Xi))/p² ]    (2)

(2) can be interpreted as the marginal value of the propensity score.
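To see where (2) comes from, note that the γ bounds in Theorems 1 and 2 differ only in their final terms; the short derivation below (not in the original slides, written in LaTeX) simply takes their difference.

```latex
% Reduction in the gamma bound from knowing the propensity score
% (Theorem 1 final term minus Theorem 2 final term):
E\!\left[\frac{(\beta(X_i)-\gamma)^2\, p(X_i)}{p^2}\right]
- E\!\left[\frac{(\beta(X_i)-\gamma)^2\, p(X_i)^2}{p^2}\right]
= E\!\left[\frac{(\beta(X_i)-\gamma)^2\, p(X_i)\bigl(1-p(X_i)\bigr)}{p^2}\right]
```

which is exactly expression (2), since p(Xi) − p(Xi)² = p(Xi)(1 − p(Xi)).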

Efficiency Bounds 4

Theorem 3: Assume (Y0i, Y1i) ⊥ Di | Xi. Furthermore, assume that the propensity score p(Xi) is equal to some unknown constant p. The asymptotic variance bound for β = γ is

E[ σ1²(Xi)/p + σ0²(Xi)/(1 − p) + (β(Xi) − β)² ]

Now, consider the variance bounds in Theorem 1 for the case where p(Xi) = p.

The bound for β equals

E[ σ1²(Xi)/p + σ0²(Xi)/(1 − p) + (β(Xi) − β)² ]    (3)

and the bound for γ equals

E[ σ1²(Xi)/p + σ0²(Xi)/(1 − p) + (β(Xi) − γ)²/p ]    (4)

Efficiency Bounds 5

The bound for β is, unsurprisingly, the same.

The bound for γ is lower.

The marginal value of knowing that assignment to treatment is random is given by

E[ ((1 − p)/p) (β(Xi) − β)² ]

This marginal value equals the marginal value calculated above (the benefit of knowing p(Xi)) when p(Xi) = p.

Hahn suggests that this implies that the marginal value of knowledge of the propensity score consists entirely of the marginal value of dimension reduction.

Efficient Estimation 1

Given knowledge of the variance bounds under different assumptions, which estimators will achieve these bounds?

The estimators constructed are based on the relevant sample averages from an augmented data set.

A dataset can be augmented to fill in the missing values of Y1i and Y0i by a nonparametric imputation method based on the projection on Xi.

Even when the propensity score is known, it is shown that projecting on the propensity score is not necessary for the estimator of γ to achieve the lower bound.

Conditioning on the propensity score may reduce efficiency if the treatment is randomly assigned.

Efficient Estimation 2

If both Y1i and Y0i were always observed, then a consistent estimator of β could easily be formed by the sample average of the difference Y1i − Y0i.

A consistent estimator of γ would be the sample average of the difference Y1i − Y0i where Di = 1.

The first step, therefore, is to nonparametrically impute the missing values of Y1i and Y0i using their conditional expectation given Xi.

These conditional expectations are only identified under the assumption of the ignorability of Di given Xi, as can be seen by

E[DiYi | Xi] = E[DiY1i | Xi] = E[Di | Xi] E[Y1i | Xi] = E[Di | Xi] β1(Xi)

so β1(Xi) = E[DiYi | Xi]/E[Di | Xi].

Efficient Estimation 3 - Completing the Data

Given nonparametric estimators Ê[(1 − Di)Yi | Xi], Ê[DiYi | Xi], and Ê[Di | Xi], we can fill in the missing values of Y1i and Y0i as

β̂1(Xi) ≡ Ê[DiYi | Xi] / Ê[Di | Xi]  and  β̂0(Xi) ≡ Ê[(1 − Di)Yi | Xi] / (1 − Ê[Di | Xi])

We can then form the "complete" data set (Ŷ1i, Ŷ0i, Di, Xi), where

Ŷ1i ≡ DiYi + (1 − Di) β̂1(Xi)

and

Ŷ0i ≡ (1 − Di)Yi + Di β̂0(Xi)

Efficient Estimation 4 - Estimating β and γ

Two alternative estimators of β and γ are now easily constructed:

β̂ = (1/n) Σi (Ŷ1i − Ŷ0i)  and  γ̂ = [(1/n) Σi Di(Ŷ1i − Ŷ0i)] / [(1/n) Σi Di]

Recall that βj(Xi) = E[Yji | Xi] and β(Xi) = β1(Xi) − β0(Xi).

We have also already constructed β̂1(Xi) and β̂0(Xi); therefore, we can alternatively estimate β and γ as

β̃ = (1/n) Σi (β̂1(Xi) − β̂0(Xi))  and  γ̃ = [(1/n) Σi Di(β̂1(Xi) − β̂0(Xi))] / [(1/n) Σi Di]
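A minimal numerical sketch of the imputation estimators above, assuming a discrete covariate so that the first-stage conditional expectations can be computed as cell means (the finite-support case of Theorem 5 below); the data-generating process is invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Hypothetical DGP: discrete covariate X in {0, 1, 2}, selection on X
x = rng.integers(0, 3, size=n)
p_x = np.array([0.2, 0.5, 0.8])[x]           # true propensity score by cell
d = rng.binomial(1, p_x)
y0 = x + rng.normal(size=n)
y1 = y0 + 1.0 + x                            # beta(X) = 1 + X, so ATE = 2; ATT is larger
y = d * y1 + (1 - d) * y0

# First stage: cell means of D*Y, (1-D)*Y, and D within each value of X
beta1_hat = np.zeros(n)
beta0_hat = np.zeros(n)
for v in np.unique(x):
    m = x == v
    e_dy, e_d = (d[m] * y[m]).mean(), d[m].mean()
    e_1dy = ((1 - d[m]) * y[m]).mean()
    beta1_hat[m] = e_dy / e_d                # estimate of E[Y1 | X = v]
    beta0_hat[m] = e_1dy / (1 - e_d)         # estimate of E[Y0 | X = v]

# "Complete" the data and average
y1_hat = d * y + (1 - d) * beta1_hat
y0_hat = (1 - d) * y + d * beta0_hat
beta_hat = (y1_hat - y0_hat).mean()                      # estimator of beta (ATE)
gamma_hat = (d * (y1_hat - y0_hat)).sum() / d.sum()      # estimator of gamma (ATT)
print(f"beta_hat (ATE) = {beta_hat:.3f}, gamma_hat (ATT) = {gamma_hat:.3f}")
```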

Efficient Estimation 5 - Asymptotic Variances

Using results from Newey (1994), it can be shown that the asymptotic variances of √n(β̂ − β) and √n(β̃ − β) are equal to each other and equal to the asymptotic variance bound derived in Theorem 1.

Similarly, it can be shown that the asymptotic variances of √n(γ̂ − γ) and √n(γ̃ − γ) are equal to each other and equal to the asymptotic variance bound derived in Theorem 1.

Proposition 4 formally states the above two results but does not provide any information regarding the first-stage nonparametric regression estimation.

Theorems 5 and 6 (and their respective discussions) provide some guidance and necessary conditions on the first-stage estimation for the overall estimators to be efficient.

Efficient Estimation 6 - First Stage Estimation

Theorem 5: Assume (Y0i, Y1i) ⊥ Di | Xi. Furthermore, assume that Xi has known finite support. Then β̂ and β̃ are efficient semiparametric estimators for β, and γ̂ and γ̃ are efficient semiparametric estimators for γ.

The first-stage estimators are simply constructed as:

Ê[DiYi | Xi = x] = Σi DiYi · 1(Xi = x) / Σi 1(Xi = x)

Ê[(1 − Di)Yi | Xi = x] = Σi (1 − Di)Yi · 1(Xi = x) / Σi 1(Xi = x)

Ê[Di | Xi = x] = Σi Di · 1(Xi = x) / Σi 1(Xi = x)

Efficient Estimation 7 - First Stage Estimation

Theorem 6: Assume (Y0i, Y1i) ⊥ Di | Xi. Furthermore, assume that multiple other conditions hold (see paper). Then β̂ and β̃ are efficient semiparametric estimators for β, and γ̂ and γ̃ are efficient semiparametric estimators for γ.

The first-stage estimators are formed by series estimators.

For example, an estimator of E[Yi | Xi = x] would be

Ê[Yi | Xi = x] = pK(x)′π̂,  π̂ = (pK′pK)⁻¹ pK′y

where pK(x) = (p1K(x), ..., pKK(x))′ is a vector of approximating functions, y = (Y1, ..., Yn)′, and pK = [pK(X1), ..., pK(Xn)]′.

Efficient Estimation 8 - Is Imputation Necessary?

Imputation is (seemingly) unavoidable, even for experimental data.

Consider Robinson's partially linear semiparametric regression model (Robinson 1988): regress Yi − E[Yi | Xi] on Di − E[Di | Xi].

The plim of the resulting estimator, β̂sl, is given by

E[(Yi − E[Yi | Xi])(Di − E[Di | Xi])] / E[(Di − E[Di | Xi])²] = β

The asymptotic variance of β̂sl can be calculated using the "machinery" of Newey (1994) and is shown to be larger than that of Hahn's β̂.

Efficient Estimation 9 - γ and the Propensity Score

It was previously shown that the propensity score is unnecessary for the estimation of β.

This does not hold for γ, as knowledge of the propensity score reduces the asymptotic variance bound.

It is probably unrealistic to assume the propensity score is known.

However, many papers nonparametrically estimate the propensity score to exploit the dimension-reduction feature.

Hahn argues that even if the propensity score is known, it is not necessary to project onto the propensity score.

Proposition 7: Assume (Y0i, Y1i) ⊥ Di | Xi. Furthermore, assume that the propensity score p(·) is known. Then the following estimator is an efficient estimator of γ:

(1/n) Σi p(Xi) · ( Ê[DiYi | Xi]/Ê[Di | Xi] − Ê[(1 − Di)Yi | Xi]/(1 − Ê[Di | Xi]) ) / ( (1/n) Σi p(Xi) )

Efficient Estimation 10 - Dangers of the Propensity Score

Could projection on the propensity score even be harmful for the estimation of β = γ (i.e., the experimental data case)?

β̂, which is an efficient estimator of β with or without knowledge of the propensity score, is still efficient for β.

Note that the estimator for γ developed above in Proposition 7 reduces to β̂ when the propensity score is constant.

We don't want to use γ̂, as it is only efficient when the propensity score is unknown.

If we condition on the propensity score when it is constant, we just get the marginal expectation.

Therefore we consider the difference in sample averages as an estimator. Call this estimator β̂ols.

However, var(β̂ols) − var(β̂) ≥ 0.

Conclusions and Summary

Interesting development of asymptotic variance lower bounds.

The bound is unaffected for β (ATE) when the propensity score is known.

The bound is reduced for γ (ATT) when the propensity score is known.

Hahn uses imputation of missing values to construct estimators that attain the lower bounds.

Is imputation really the only way to achieve the lower bounds? Find out next week!

Overall the paper provides little encouragement for fans of propensity score estimators.

It is worth noting that projecting on the propensity score can reduce efficiency in the experimental data case.

Efficient Estimation of Average Treatment Effects Using the Estimated Propensity Score

Hirano, Imbens, and Ridder (2003)

Presented by Jason Blevins
Applied Microeconometrics Reading Group
Duke University

October 5, 2006

Context

Previous Work:

- As shown by Rosenbaum and Rubin (1983, 1985), the unconfoundedness assumption implies that adjusting for p(x) removes all bias associated with differences in x.
- Hahn (1998) shows that while this removes all bias, it is not necessarily as efficient as conditioning on the covariates.
- Rosenbaum (1987), Rubin and Thomas (1996), and Robins, Rotnitzky, and Zhao (1995): There can be efficiency gains by using parametric estimates of the propensity score, rather than the true propensity score.

Main Finding:

- Estimators for τ, τwate, and τtreated are presented which weight observations by the inverse of nonparametric estimates of p(x). If the estimator for p(x) is sufficiently flexible, this leads to a fully efficient estimator.

Outline

- Model, Objectives, and Assumptions
- Other Approaches: Matching
- Estimation Using the Propensity Score
- Previous Results: Hahn (1998)
- Missing Data Example
- Three Efficient Estimators
  - Population Average Treatment Effect
  - Weighted Average Treatment Effect
  - Average Treatment Effect for the Treated

Model

Population: (T, X, Y(0), Y(1))
Missing data: Y ≡ T · Y(1) + (1 − T) · Y(0)
Random sample: {(Ti, Xi, Yi)}, i = 1, ..., N
Treatment indicator: Ti ∈ {0, 1}
Vector of covariates: Xi
Outcomes: Yi(0), Yi(1)

Assumption 1. Unconfoundedness: T ⊥ (Y(0), Y(1)) | X

Quantities of Interest

Population ATE: τ = E[Y(1) − Y(0)]
Weighted ATE: τwate = ∫ E[Y(1) − Y(0) | X = x] g(x) dF(x) / ∫ g(x) dF(x)
ATE on the Treated: τtreated = E[Y(1) − Y(0) | T = 1]
Propensity Score: p(x) = P(T = 1 | X = x)

- The ATE on the Treated arises when the weight is g(x) = p(x).
- Problem: we only observe either Yi(0) or Yi(1), never both.
- Straightforward nonparametric estimators are all infeasible!

Estimation by Matching

The unconfoundedness assumption implies that

τ(x) ≡ E[Y(1) − Y(0) | X = x]
     = E[Y(1) | X = x] − E[Y(0) | X = x]
     = E[Y(1) | T = 1, X = x] − E[Y(0) | T = 0, X = x]
     = E[Y | T = 1, X = x] − E[Y | T = 0, X = x]

since

E[Y | T = 1, X = x] = E[T · Y(1) + (1 − T) · Y(0) | T = 1, X = x] = E[Y(1) | T = 1, X = x]

τ = ∫ τ(x) dF(x)

Estimation Using the Propensity Score

Rosenbaum and Rubin (1983, 1985) show that the unconfoundedness assumption T ⊥ (Y(0), Y(1)) | X implies T ⊥ (Y(0), Y(1)) | p(X).

Unconfoundedness gives:

E[TY | X = x] = E[TY(1) | X = x] = E[T | X = x] E[Y(1) | X = x]

E[Y(1) | X = x] = E[TY | X = x] / E[T | X = x] = E[TY | X = x] / p(x)

This suggests using a sample average to nonparametrically estimate

τ = E[τ(X)] = E[E[Y(1) − Y(0) | X = x]].

Previous Results: Hahn (1998)

- Semiparametric efficiency bounds and estimators for τ and τtreated.
- Knowing p(x) does not affect the bound for τ.
- Knowing p(x) decreases the bound for τtreated.
- In general, conditioning only on p(x) and not the covariates does not lead to an efficient estimator (experimental data case).
- Efficient estimator for τ, regardless of whether p(x) is known:
  - Nonparametrically estimate E[YT | X = x], E[Y(1 − T) | X = x], and p(x).
  - Impute values for Yi(1) and Yi(0) using

    Ŷi(1) = Ê[YT | Xi] / p̂(Xi)  and  Ŷi(0) = Ê[Y(1 − T) | Xi] / (1 − p̂(Xi))

Example: Missing Data with Binary Covariates

- We want to estimate β0 ≡ E[Y] given a random sample {(Ti, Xi, TiYi)}, i = 1, ..., N.
- Ti and Xi are observed for everyone; Yi is observed only if Ti = 1.
- "Unconfoundedness": T ⊥ Y | X
- "Propensity score": p(x) = E[T | X = x] = P[T = 1 | X = x]
- Suppose p(x) = 1/2.
- Binary covariates: x ∈ {0, 1}, Ntx = #{i | Ti = t, Xi = x}

Example: True Weights Estimator

Normalized variance bound for β0:

Vbound = 2E[V(Y | X)] + V(E[Y | X])

True weights estimator:

β̂tw = (1/N) Σi Yi Ti / p(Xi) = (1/N) Σi Yi Ti / (1/2)

Vtw = 2E[V(Y | X)] + V(E[Y | X]) + E[E[Y | X]²]

Inefficient unless E[Y | X] = 0.

Example: Estimated Weights Estimator

Estimated "propensity score":

p̂(x) = N10/(N00 + N10) if x = 0,  N11/(N01 + N11) if x = 1

Ntx = #{i | Ti = t, Xi = x}

Estimated weights estimator:

β̂ew = (1/N) Σi Yi Ti / p̂(Xi)

Vew = 2E[V(Y | X)] + V(E[Y | X]) = Vbound

Fully efficient!
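A minimal simulation sketch of this missing-data example, comparing the true-weights and estimated-weights estimators over repeated samples; the data-generating process is an invented one that satisfies the assumptions above (binary X, p(x) = 1/2, Y observed only when T = 1).

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 2_000, 2_000
true_mean = 0.5 * 1.0 + 0.5 * 3.0      # E[Y] when E[Y|X=0] = 1 and E[Y|X=1] = 3

est_tw, est_ew = [], []
for _ in range(reps):
    x = rng.binomial(1, 0.5, size=n)
    t = rng.binomial(1, 0.5, size=n)            # true "propensity score" p(x) = 1/2
    y = np.where(x == 1, 3.0, 1.0) + rng.normal(size=n)
    yt = y * t                                  # only Y*T enters, so missing Y's never matter

    # True weights estimator: divide by the known p(x) = 1/2
    est_tw.append(np.mean(yt / 0.5))

    # Estimated weights estimator: divide by the cell frequencies N_1x / (N_0x + N_1x)
    p_hat = np.where(x == 1, t[x == 1].mean(), t[x == 0].mean())
    est_ew.append(np.mean(yt / p_hat))

print(f"true E[Y] = {true_mean:.2f}")
print(f"true weights:      mean {np.mean(est_tw):.3f}, N*variance {n * np.var(est_tw):.2f}")
print(f"estimated weights: mean {np.mean(est_ew):.3f}, N*variance {n * np.var(est_ew):.2f}")
# The estimated-weights estimator shows the smaller normalized variance,
# matching V_ew = V_bound < V_tw whenever E[Y|X] is not identically 0.
```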

GMM Interpretation: True Weights Estimator

The true weights estimator β̂tw is a GMM estimator with moment

ψ1(y, t, x, β) = yt/p(x) − β = yt/(1/2) − β

corresponding to

E{ E[YT | X] − E[Y | X] E[T | X] } = 0

It ignores information about T, so it is not necessarily efficient.

GMM Interpretation: Estimated Weights Estimator

The propensity score provides additional information:

E{ E[T | X] − p(X) } = E[T − 1/2] = 0

With a binary covariate, we have

ψ2(y, t, x, β) = ( x(t − 1/2), (1 − x)(t − 1/2) )′

GMM with moment conditions ψ1 and ψ2 is fully efficient and corresponds to the estimated weights estimator.

An Estimator for τ

Quantities of interest:

τ* ≡ E[Y(1) − Y(0)]    p*(x) ≡ P[T = 1 | X = x]

Conditional moments:

βt(x) ≡ E[Y(t) | X = x]    σt²(x) ≡ V(Y(t) | X = x)

τ* satisfies E[ψ(Y, T, X, τ*, p*(X))] = 0, where

ψ(y, t, x, τ, p(x)) = yt/p(x) − y(1 − t)/(1 − p(x)) − τ

Given an estimate p̂ of p,

τ̂ = (1/N) Σi ( Yi Ti / p̂(Xi) − Yi(1 − Ti) / (1 − p̂(Xi)) )

Series Logit Estimator

Vector of functions: RK(x) = (r1K(x), r2K(x), ..., rKK(x))′

Multi-index: λ = (λ1, ..., λr)′, λj ∈ ℕ, r = dim(x)

Norm: |λ| ≡ Σj λj

Sequence of distinct multi-indices: {λ(k)}k with |λ(k)| ≤ |λ(k + 1)|

Power series elements: x^λ = Πj xj^λj = x1^λ1 x2^λ2 ··· xr^λr

Take the sequence {rkK(x)}k where rkK(x) = x^λ(k)

Series Logit Estimator

Example (r = 3):

λ(1) = (0, 0, 0), λ(2) = (1, 0, 0), λ(3) = (0, 1, 0), λ(4) = (0, 0, 1), λ(5) = (2, 0, 0), ...

R1 = (1),  R2 = (1, x1)′,  R3 = (1, x1, x2)′,  R4 = (1, x1, x2, x3)′,  R5 = (1, x1, x2, x3, x1²)′,  ...

Series Logit Estimator

Logistic CDF: L(a) = e^a / (1 + e^a)

The Series Logit Estimator of p*(x) is p̂(x) = L(RK(x)′π̂K) with

π̂K = argmax_π Σi [ Ti ln L(RK(Xi)′π) + (1 − Ti) ln(1 − L(RK(Xi)′π)) ]
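A minimal sketch pairing a series logit first stage (a logistic regression on power-series terms in x, built here with scikit-learn's PolynomialFeatures) with the inverse-probability-weighted sample average τ̂ defined above; the data-generating process, the fixed choice of K, and the use of scikit-learn are assumptions made for this illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(5)
n = 20_000

# Hypothetical DGP with two covariates; unconfoundedness holds given x
x = rng.normal(size=(n, 2))
p_star = 1 / (1 + np.exp(-(0.5 * x[:, 0] - x[:, 1])))   # true propensity score
t = rng.binomial(1, p_star)
y0 = x[:, 0] + x[:, 1] + rng.normal(size=n)
y1 = y0 + 1.0                                           # true tau* = 1.0
y = t * y1 + (1 - t) * y0

# Series logit first stage: a logit on power-series terms in x.
# The number of terms K is fixed here; in the paper it grows with N (Assumption 5).
R = PolynomialFeatures(degree=3, include_bias=False).fit_transform(x)
p_hat = LogisticRegression(C=1e6, max_iter=2000).fit(R, t).predict_proba(R)[:, 1]

# tau_hat = (1/N) * sum( Y T / p_hat  -  Y (1 - T) / (1 - p_hat) )
tau_hat = np.mean(y * t / p_hat - y * (1 - t) / (1 - p_hat))
print(f"tau_hat = {tau_hat:.3f}")   # close to 1.0
```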

More Assumptions

Assumption 2. Distribution of X:
 i. The support of X is a compact subset of R^r.
 ii. The density of X is bounded and bounded away from 0.

Assumption 3. Distribution of (Y(0), Y(1)):
 i. E[Y(0)²] < ∞ and E[Y(1)²] < ∞.
 ii. β0(x) and β1(x) are continuously differentiable.

Assumption 4. Selection probability:
 i. p*(x) is continuously differentiable of order s with s ≥ 7r.
 ii. 0 < p*(x) < 1.

Assumption 5. The Series Logit Estimator of p*(x) uses a power series with K = N^ν for some 1/(4(s/r − 1)) < ν < 1/9.

Asymptotic Properties of τ̂

Theorem 1. Suppose assumptions 1-5 hold. Then:

 i. τ̂ →p τ*.
 ii. √N(τ̂ − τ*) →d N(0, V) with

    V = E[ (τ(X) − τ*)² + σ1²(X)/p*(X) + σ0²(X)/(1 − p*(X)) ].

 iii. τ̂ reaches the semiparametric efficiency bound.

Asymptotic Properties of τ̂

τ̂ is asymptotically linear:

τ̂ = τ* + (1/N) Σi [ ψ(Yi, Ti, Xi, τ*, p*(Xi)) + α(Ti, Xi) ] + op(1/√N)

where

α(t, x) = −( β1(x)/p*(x) + β0(x)/(1 − p*(x)) ) (t − p*(x))

and so V = E[(ψ + α)²].

The known-weights estimator is asymptotically linear with influence function ψ.

A consistent estimator for V is found using another Series Logit Estimator.

An Efficient Estimator for τwate

τwate = ∫ E[Y(1) − Y(0) | X = x] g(x) dF(x) / ∫ g(x) dF(x)

By choosing a weighting function g appropriately, we can obtain average treatment effects for a subpopulation defined by X.

Note that g = p* yields τtreated.

ψ(y, t, x, τwate, p(x)) = g(x) ( yt/p(x) − y(1 − t)/(1 − p(x)) − τwate )

τ̂wate = Σi g(Xi) [ Yi Ti / p̂(Xi) − Yi(1 − Ti) / (1 − p̂(Xi)) ] / Σi g(Xi)

Asymptotic Properties of τ̂wate

Theorem 3. Suppose assumptions 1-5 hold, |g(x)| is bounded, and E[g(X)] > 0. Then:

 i. τ̂wate →p τ*wate.
 ii. √N(τ̂wate − τ*wate) →d N(0, V) with

    V = (1/E[g(X)]²) E[ g(X)²(τ(X) − τ*wate)² + (g(X)²/p*(X)) σ1²(X) + (g(X)²/(1 − p*(X))) σ0²(X) ].

 iii. V̂ is consistent for V.

Theorem 4. τ̂wate reaches the semiparametric efficiency bound for τwate.

An Estimator for τtreated with p* Known

Take g(x) = p*(x) and apply the estimator for τwate:

ψ(y, t, x, τwate, p(x)) = p*(x) ( yt/p(x) − y(1 − t)/(1 − p(x)) − τwate )

The estimator τ̂treated is the solution to

0 = Σi p*(Xi) ( Yi Ti / p̂(Xi) − Yi(1 − Ti) / (1 − p̂(Xi)) − τ̂treated )

Notice that p* is used as the weighting function while p̂ weights the observations.

Hahn (1998) showed that knowing p* reduces the variance bound for τtreated.

From Theorems 3 and 4, this estimator is √N-consistent, asymptotically normal, and efficient.

An Estimator for τtreated with p* Unknown

If p* is unknown, the efficiency bound for τtreated is higher.

We need a new estimator, since τ̂treated used p*.

Let τ̂te be the solution to

0 = Σi p̂(Xi) ( Yi Ti / p̂(Xi) − Yi(1 − Ti) / (1 − p̂(Xi)) − τ̂te ).

Asymptotic Properties of τ̂te

Theorem 5. Suppose that assumptions 1-5 hold. Then:

 i. τ̂te →p τ*treated.
 ii. √N(τ̂te − τ*treated) →d N(0, V) with

    V = (1/E[p*(X)]²) E[ p*(X)²(τ(X) − τ*treated)² + p*(X) σ1²(X) + (p*(X)²/(1 − p*(X))) σ0²(X) ].

 iii. τ̂te reaches the semiparametric efficiency bound for estimation of τtreated when the propensity score is not known.

Conclusion

Results:

- Hahn (1998) showed that conditioning on the true propensity score does not, in general, yield an efficient estimator.
- Weighting by the true propensity score does not yield efficient estimators; however, using the estimated propensity score does.
- The proposed estimators require nonparametric estimation of fewer functions than other estimators.

Open Questions:

- Finite sample properties
- Computational properties

The Mystery of Propensity Score Matching

October 28, 2008

• Rosenbaum and Rubin, the "Central Role" of the propensity score.

• D = 0, 1. A balancing score is a function b(X) such that

D ⊥ X | b(X)

• b(X) = X is obviously a balancing score.

• p(X) = P(D = 1 | X) is also a balancing score.

• p(X) is the coarsest balancing score. In other words, for any balancing score b(X), there must be some function g(·) such that

p(X) = g(b(X))

• Proof: By assumption,

f(D = 1 | X, b(X)) = f(D = 1 | b(X)).

The LHS is p(X). The RHS is some function of b(X).

• Conditional independence (CI) assumption:

Yi1, Yi0 ⊥ Di | Xi

• Also called unconfoundedness, strong ignorability, etc.

• Under (CI), you can match on any balancing score b(Xi):

Yi1, Yi0 ⊥ Di | b(Xi)

because

E(Y1 − Y0 | b(X))
= E(Y1 | D = 1, b(X)) − E(Y0 | D = 0, b(X))
= E(Y | D = 1, b(X)) − E(Y | D = 0, b(X)).

• But why do you want to match on b(X), or p(X)?

• Does the balancing score really help with estimating ATE and ATT?

• Consider ATE. Method 1, no p(X ):

E (Y1 − Y0) =EX (E (Y1 − Y0|X ))

=EX (E (Y |D = 1,X ) − E (Y |D = 0,X ))

This can be estimated by

1

n

n∑i=1

[E (Y |D = 1,Xi ) − E (Y |D = 0,Xi )

]

• Method 2, match on p(X ),

E (Y1 − Y0) =Ep(X ) (E (Y1 − Y0|p(X )))

=Ep(X ) (E (Y |D = 1, p(X )) − E (Y |D = 0, p(X ))) .

• First estimate p(X ), then,

1

n

n∑i=1

[E (Y |D = 1, p(Xi )) − E (Y |D = 0, p(Xi ))

]• Does this improve efficiency because the conditional mean is

only estimated on one dimension?

• No. Estimating p(X ) is still a multi-dimensional problem.

• Same story for ATT.

• Similarly for ATT, Method 1:

E(Y1 − Y0 | D = 1) = EX(E(Y1 − Y0 | X, D = 1))
                   = EX|D=1(E(Y | D = 1, X) − E(Y | D = 0, X))

This is estimated by

Σi Di [ Ê(Y | D = 1, Xi) − Ê(Y | D = 0, Xi) ] / Σi Di.

• Method 2:

E(Y1 − Y0 | D = 1) = Ep(X)(E(Y1 − Y0 | p(X), D = 1))
                   = Ep(X)|D=1(E(Y | D = 1, p(X)) − E(Y | D = 0, p(X)))

This is estimated by

Σi Di [ Ê(Y | D = 1, p̂(Xi)) − Ê(Y | D = 0, p̂(Xi)) ] / Σi Di

(a matching-style sketch of this follows below).

• Inverse propensity weighting is different from propensity matching.

• For the ATE, one can show that

E(Y1 − Y0) = E( Y1 · p / p(X) | D = 1 ) − E( Y0 · (1 − p) / (1 − p(X)) | D = 0 ).

This can be estimated by

(1/n1) Σ_{i=1}^{n1} Y1i · p / p(Xi) − (1/n0) Σ_{i=1}^{n0} Y0i · (1 − p) / (1 − p(Xi)),

where n1 and n0 are the numbers of treated and control observations and p = P(D = 1).

• Inverse propensity (probability) weighting for the ATT:

E(Y1 − Y0 | D = 1) = E(Y1 | D = 1) − E( Y0 · p(X)(1 − p) / (p(1 − p(X))) | D = 0 ).

This can be estimated by

(1/n1) Σ_{i=1}^{n1} Y1i − (1/n0) Σ_{i=1}^{n0} Y0i · p(Xi)(1 − p) / (p(1 − p(Xi))),

as in the sketch below.
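A minimal sketch of this ATT weighting formula, reusing a logit-estimated propensity score; the data-generating process and variable names are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 20_000

# Hypothetical DGP: the effect is larger where treatment is more likely, so ATT > ATE
x = rng.normal(size=n)
p_x = 1 / (1 + np.exp(-x))
d = rng.binomial(1, p_x)
y0 = x + rng.normal(size=n)
y1 = y0 + 1.0 + x                    # ATE = 1.0; ATT = 1 + E[x | d = 1] > 1
y = d * y1 + (1 - d) * y0

p_hat = LogisticRegression().fit(x.reshape(-1, 1), d).predict_proba(x.reshape(-1, 1))[:, 1]
p_bar = d.mean()                     # p = P(D = 1)

# ATT = mean of Y over treated  -  weighted mean of Y over controls,
# with control weights p_hat(X) (1 - p_bar) / (p_bar (1 - p_hat(X)))
treated, controls = d == 1, d == 0
w = p_hat[controls] * (1 - p_bar) / (p_bar * (1 - p_hat[controls]))
att_hat = y[treated].mean() - np.mean(w * y[controls])
print(f"IPW estimate of ATT: {att_hat:.3f}")

# Benchmark from the simulated potential outcomes (infeasible with real data)
print(f"true ATT in this sample: {(y1 - y0)[treated].mean():.3f}")
```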

• Mystery of inverse probability weighting for the ATT: even if you know p(X), using a combination of p(X) and the estimated p̂(X) can improve efficiency.

• But how do you guess the most efficient combination?

• Ask Hirano, Imbens and Ridder (2003).

• Mystery unresolved:

• Why do we ever need p(X) when we don't know p(X)?

• Why should we ever use inverse probability weighting?

Quantile Treatment Effects

Alberto Abadie, Joshua Angrist, and Guido Imbens

[The remaining slides, covering instrumental-variables estimates of quantile treatment effects (QTE) for compliers in the Job Training Partnership Act (JTPA) evaluation, are not legible in the source transcript.]
� @ O%3{J��"J�Lj��[�� �KJ�Lad R H J.J R�LjVKT�V ] �0["R V MhT9H J T Y.ZKT ]4Z � b6V��L J R1Y�T9V1]J R�LjVKT�V ]cT�VeMxZKT Y ��["R V1M�T�H J YZ1O)[ H S�V � M IKJ.O��Z\T ]4Z,�"LjT OKLjT M � �