
Page 1

Uniform Post Selection Inference for LAD Regression and Other Z-estimation problems.

ArXiv: 1304.0282

Victor Chernozhukov, MIT, Economics + Center for Statistics

Co-authors: Alexandre Belloni (Duke) + Kengo Kato (Tokyo)

August 12, 2015

Chernozhukov Inference for Z-problems

Page 2

The presentation is based on:

"Uniform Post Selection Inference for LAD Regression and Other Z-estimation problems"
Oberwolfach, 2012; ArXiv, 2013; published in Biometrika, 2014

1. Develop uniformly valid confidence regions for a target regression coefficient in a high-dimensional sparse median regression model (extends our earlier work in ArXiv 2010, 2011).

2. New methods are based on Z-estimation using scores that are Neyman-orthogonalized with respect to perturbations of nuisance parameters.

3. The estimator of a target regression coefficient is root-n consistent and asymptotically normal, uniformly with respect to the underlying sparse model, and is semi-parametrically efficient.

4. Extend methods and results to general Z-estimation problems with orthogonal scores and many target parameters p1 ≫ n, and construct joint confidence rectangles on all target coefficients and control the family-wise error rate.


Page 6

Outline

1. Z-problems like mean, median, and logistic regressions and the associated scores

2. Problems with naive plug-in inference (where we plug in regularized or post-selection estimators)

3. Problems can be fixed by using Neyman-orthogonal scores, which differ from the original scores in most problems

4. Generalization to many target coefficients

5. Literature: orthogonal scores vs. debiasing

6. Conclusion

Page 7

1. Z-problems

I Consider examples with yi response, di the target regressor, and xi covariates, with p = dim(xi) ≫ n

I Least squares projection:

IE[(yi − diα0 − x′iβ0)(di, x′i)′] = 0

I LAD regression:

IE[{1(yi ≤ diα0 + x′iβ0) − 1/2}(di, x′i)′] = 0

I Logistic Regression:

IE[{yi − Λ(diα0 + x′iβ0)}wi(di, x′i)′] = 0,

where Λ(t) = exp(t)/{1 + exp(t)}, wi = 1/{Λi(1 − Λi)}, and Λi = Λ(diα0 + x′iβ0).
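As a sanity check on these moment conditions, the following sketch (with illustrative low-dimensional sizes and coefficient values, not the slides' design) simulates each model at the true parameters and verifies that the sample analogues of the three scores are near zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100_000, 3
alpha0 = 0.5
beta0 = np.array([1.0, -0.5, 0.25])
gamma0 = np.array([0.3, 0.3, 0.3])

x = rng.standard_normal((n, p))
d = x @ gamma0 + rng.standard_normal(n)
index = d * alpha0 + x @ beta0          # d_i alpha0 + x_i' beta0
Z = np.column_stack([d, x])             # the instrument vector (d_i, x_i')'

# Least squares: IE[(y - d a0 - x'b0)(d, x')'] = 0
y_ls = index + rng.standard_normal(n)
score_ls = (y_ls - index)[:, None] * Z

# LAD: IE[{1(y <= d a0 + x'b0) - 1/2}(d, x')'] = 0 (errors have median 0)
y_lad = index + rng.standard_normal(n)
score_lad = ((y_lad <= index) - 0.5)[:, None] * Z

# Logistic: IE[{y - Lambda(index)} w (d, x')'] = 0 with w = 1/{Lam(1 - Lam)}
lam = 1.0 / (1.0 + np.exp(-index))
y_logit = (rng.uniform(size=n) < lam).astype(float)
score_logit = ((y_logit - lam) / (lam * (1.0 - lam)))[:, None] * Z

for name, s in [("LS", score_ls), ("LAD", score_lad), ("logit", score_logit)]:
    print(name, np.abs(s.mean(axis=0)).max())   # each close to 0
```

Each printed value is of order 1/√n, as the moment conditions predict at the truth.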

Page 8

1. Z-problems

I In all cases have the Z-problem (focusing on a subset of equations that identify α0 given β0):

IE[ϕ(W, α0, β0)] = 0,
where W = data, α0 = target parameter, β0 = high-dimensional nuisance parameter,

with non-orthogonal scores (check!):

∂β IE[ϕ(W, α0, β)]|β=β0 ≠ 0.

I Can we use plug-in estimators β̂, based on regularization via penalization or selection, to form Z-estimators of α0?

IEn[ϕ(W, α̂, β̂)] = 0

I The answer is NO!


Page 10

2. Problems with naive plug-in inference: MC Example

I In this simulation we used: p = 200, n = 100, α0 = .5

yi = diα0 + x′iβ0 + ζi, ζi ∼ N(0, 1)

di = x′iγ0 + vi, vi ∼ N(0, 1)

I approximately sparse model:

|β0j| ∝ 1/j², |γ0j| ∝ 1/j²

→ so can use L1-penalization

I R² = .5 in each equation

I regressors are correlated Gaussians:

x ∼ N(0, Σ), Σkj = 0.5^{|j−k|}.
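The design above can be sketched as follows. The proportionality constants and the exact R² normalization are assumptions (the slides fix R² = .5 per equation but not the scaling), so treat the rescaling step as illustrative:

```python
import numpy as np

def make_design(n=100, p=200, alpha0=0.5, rho=0.5, seed=0):
    rng = np.random.default_rng(seed)
    j = np.arange(1, p + 1)
    Sigma = rho ** np.abs(np.subtract.outer(j, j))   # Sigma_kj = 0.5^{|j-k|}
    beta0 = 1.0 / j**2                               # |beta_0j| proportional to 1/j^2
    gamma0 = 1.0 / j**2                              # |gamma_0j| proportional to 1/j^2
    # rescale so each linear signal has unit variance; with N(0,1) errors this
    # gives R^2 = .5 in the d-equation (and approximately in the y-equation)
    beta0 = beta0 / np.sqrt(beta0 @ Sigma @ beta0)
    gamma0 = gamma0 / np.sqrt(gamma0 @ Sigma @ gamma0)
    x = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    d = x @ gamma0 + rng.standard_normal(n)          # d_i = x_i' g0 + v_i
    y = d * alpha0 + x @ beta0 + rng.standard_normal(n)
    return y, d, x, beta0, gamma0

y, d, x, beta0, gamma0 = make_design()
```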

Page 11

2.a. Distribution of The Naive Plug-in Z-Estimator

p = 200 and n = 100

(the picture is roughly the same for median and mean problems)

[Figure: finite-sample distribution of the naive plug-in Z-estimator]

=⇒ badly biased, misleading confidence intervals; predicted by "impossibility theorems" in Leeb and Pötscher (2009)

Page 12

2.b. Regularization Bias of The Naive Plug-in Z-Estimator

I β̂ is a plug-in for β0; bias in the estimating equations:

√n IEϕ(W, α0, β)|β=β̂ =

√n IEϕ(W, α0, β0) [= 0]

+ ∂β IEϕ(W, α0, β)|β=β0 · √n(β̂ − β0) [=: I → ∞]

+ O(√n ‖β̂ − β0‖²) [=: II → 0]

I II → 0 under the sparsity condition

‖β0‖0 ≤ s = o(√n / log p)

or approximate sparsity (more generally), since

√n‖β̂ − β0‖² ≲ √n (s/n) log p = o(1).

I I → ∞ generally, since √n(β̂ − β0) ∼ √(s log p) → ∞,

I due to the non-regularity of β̂, arising from regularization via penalization or selection.

Page 13

3. Solution: Solve Z-problems with Orthogonal Scores

I In all cases, it is possible to construct Z-problems

IE[ψ(W, α0, η0)] = 0,
where W = data, α0 = target parameter, η0 = high-dimensional nuisance parameter,

with Neyman-orthogonal (or "immunized") scores ψ:

∂η IE[ψ(W, α0, η)]|η=η0 = 0.

I Then we can simply use plug-in estimators η̂, based on regularization via penalization or selection, to form Z-estimators of α0:

IEn[ψ(W, α̂, η̂)] = 0.

I Note that ϕ ≠ ψ + extra nuisance parameters!
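A finite-difference check makes the contrast concrete in the least-squares case: perturbing the nuisance parameter moves the naive moment IE[ϕ] at first order, but moves the partialled-out (orthogonal) moment IE[ψ] only at second order. All sizes and coefficient values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 400_000, 3
alpha0 = 0.5
beta0 = np.array([1.0, -0.5, 0.25])
gamma0 = np.array([0.4, 0.2, -0.3])

x = rng.standard_normal((n, p))
v = rng.standard_normal(n)
eps = rng.standard_normal(n)
d = x @ gamma0 + v
y = d * alpha0 + x @ beta0 + eps

# true nuisance values: projection coefficients of y and of d on x
eta10 = beta0 + alpha0 * gamma0
eta20 = gamma0

def m_naive(b):
    # naive moment: En[phi(W, alpha0, b)] with phi = (y - d a0 - x'b) d
    return np.mean((y - d * alpha0 - x @ b) * d)

def m_orth(e1, e2):
    # orthogonal moment: En[psi] with psi = {(y - x'e1) - a0 (d - x'e2)} (d - x'e2)
    dtil = d - x @ e2
    return np.mean((y - x @ e1 - alpha0 * dtil) * dtil)

u, t = np.ones(p), 1e-3       # perturbation direction and step size
d_naive = (m_naive(beta0 + t * u) - m_naive(beta0 - t * u)) / (2 * t)
d_orth1 = (m_orth(eta10 + t * u, eta20) - m_orth(eta10 - t * u, eta20)) / (2 * t)
d_orth2 = (m_orth(eta10, eta20 + t * u) - m_orth(eta10, eta20 - t * u)) / (2 * t)
print(d_naive, d_orth1, d_orth2)   # first bounded away from 0, others ~ 0
```

The naive derivative approaches −IE[d x′u] ≠ 0, while both nuisance directions of the orthogonal moment give derivatives of order 1/√n only.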


Page 15

3.a. Distribution of the Z-Estimator with Orthogonal Scores

p = 200, n = 100

[Figure: finite-sample distribution of the Z-estimator with orthogonal scores]

=⇒ low bias, accurate confidence intervals

obtained in a series of our papers, ArXiv, 2010, 2011, ...

Page 16

3.b. Regularization Bias of the Orthogonal Plug-in Z-Estimator

I Expand the bias in estimating equations:

√n IEψ(W, α0, η)|η=η̂ =

√n IEψ(W, α0, η0) [= 0]

+ ∂η IEψ(W, α0, η)|η=η0 · √n(η̂ − η0) [=: I = 0]

+ O(√n ‖η̂ − η0‖²) [=: II → 0]

I II → 0 under the sparsity condition

‖η0‖0 ≤ s = o(√n / log p)

or approximate sparsity (more generally), since

√n‖η̂ − η0‖² ≲ √n (s/n) log p = o(1).

I I = 0 by Neyman orthogonality.

Page 17

3.c. Theoretical result I

Approximate sparsity: after sorting, the absolute values of the components of η0 decay fast enough:

|η0|(j) ≤ A j^{−a}, a > 1.

Theorem (BCK, Informal Statement)

Uniformly within a class of approximately sparse models with restricted isometry conditions,

σn^{−1} √n (α̂ − α0) ⇝ N(0, 1),

where σn² is the conventional variance formula for Z-estimators assuming η0 is known. If the orthogonal score is the efficient score, then α̂ is semi-parametrically efficient.

Page 18

3.d. Neyman-Orthogonal Scores

B In low-dimensional parametric settings, it was used by Neyman (56, 79) to deal with crudely estimated nuisance parameters. Frisch-Waugh-Lovell partialling out goes back to the 30s.

B Newey (1990, 1994), Van der Vaart (1990), Andrews (1994), Robins and Rotnitzky (1995), and Linton (1996) used orthogonality in semiparametric problems.

B For p ≫ n settings, Belloni, Chernozhukov, and Hansen (ArXiv 2010a,b) first used Neyman-orthogonality in the context of IV models. The η0 was the parameter of the optimal instrument function, estimated by Lasso and OLS-post-Lasso methods.

Page 19

3.f. Examples of Orthogonal Scores: Least Squares

I Least squares:

ψ(Wi, α, η0) = {ỹi − d̃iα} d̃i,

yi = x′iη10 + ỹi, IE[ỹi xi] = 0,

di = x′iη20 + d̃i, IE[d̃i xi] = 0.

Thus the orthogonal score is constructed by Frisch-Waugh partialling out from yi and di. Here

η0 := (η′10, η′20)′

can be estimated by sparsity-based methods, e.g. OLS-post-Lasso. Semi-parametrically efficient under homoscedasticity.

I Reference: Belloni, Chernozhukov, Hansen (ArXiv, 2011a,b).
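In the low-dimensional case (p < n), the partialling-out construction reduces to the classical Frisch-Waugh result: the coefficient from regressing the residualized ỹ on the residualized d̃ equals the coefficient on d in the full regression. A minimal numerical illustration with made-up data, using plain OLS rather than the sparse (Lasso-based) estimators of the slides:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 500, 5
x = rng.standard_normal((n, p))
d = (x @ rng.standard_normal(p)) * 0.3 + rng.standard_normal(n)
y = 0.5 * d + x @ rng.standard_normal(p) + rng.standard_normal(n)

def resid(a, B):
    """Residual from OLS regression of a on the columns of B (plus intercept)."""
    B1 = np.column_stack([np.ones(len(a)), B])
    return a - B1 @ np.linalg.lstsq(B1, a, rcond=None)[0]

ytil, dtil = resid(y, x), resid(d, x)         # partial x out of y and of d
alpha_fwl = (dtil @ ytil) / (dtil @ dtil)     # solves En[(ytil - a dtil) dtil] = 0

X_full = np.column_stack([np.ones(n), d, x])
alpha_full = np.linalg.lstsq(X_full, y, rcond=None)[0][1]
print(alpha_fwl, alpha_full)                  # equal up to numerical error
```

The sparse case replaces the OLS partialling-out steps with Lasso or OLS-post-Lasso fits, which is what makes the score usable when p ≫ n.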

Page 20

3.f. Examples of Orthogonal Scores: LAD regression

I LAD regression:

ψ(Wi, α, η0) = {1(yi ≤ diα + x′iβ0) − 1/2} d̃i,

where

fi di = fi x′iγ0 + d̃i, IE[d̃i fi xi] = 0,

fi := f_{yi|di,xi}(diα0 + x′iβ0 | di, xi).

Here

η0 := (f_{yi|di,xi}(·), α0, β′0, γ′0)′

can be estimated by sparsity-based methods, by L1-penalized LAD and by OLS-post-Lasso. Semi-parametrically efficient.

I Reference: Belloni, Chernozhukov, Kato (ArXiv, 2013a,b).

Page 21

3.f. Examples of Orthogonal Scores: Logistic regression

I Logistic regression:

ψ(Wi, α, η0) = {yi − Λ(diα + x′iβ0)} d̃i/√wi,

√wi di = √wi x′iγ0 + d̃i, IE[√wi d̃i xi] = 0,

wi = Λ(diα0 + x′iβ0){1 − Λ(diα0 + x′iβ0)}.

Here

η0 := (α0, β′0, γ′0)′

can be estimated by sparsity-based methods, by L1-penalized logistic regression and by OLS-post-Lasso. Semi-parametrically efficient.

I Reference: Belloni, Chernozhukov, Wei (ArXiv, 2013).

Page 22

4. Generalization: Many Target Parameters

I Consider many Z-problems:

IE[ψj(Wj, αj0, ηj0)] = 0,
where Wj = data, αj0 = target parameter, ηj0 = high-dimensional nuisance parameter,

with Neyman-orthogonal (or "immunized") scores:

∂ηj IE[ψj(W, αj0, ηj)]|ηj=ηj0 = 0, j = 1, ..., p1 ≫ n.

I Then we can simply use plug-in estimators η̂j, based on regularization via penalization or selection, to form Z-estimators of αj0:

IEn[ψj(W, α̂j, η̂j)] = 0, j = 1, ..., p1.

Page 23

4. Generalization: Many Target Parameters

Theorem (BCK, Informal Statement)
Uniformly within a class of approximately sparse models with restricted isometry conditions holding uniformly in j = 1, ..., p1, and with (log p1)⁷ = o(n),

sup_{R ∈ rectangles in ℝ^{p1}} |P({σjn^{−1}√n(α̂j − αj0)}_{j=1}^{p1} ∈ R) − P(N ∈ R)| → 0,

where σjn² is the conventional variance formula for Z-estimators assuming ηj0 is known, and N is the normal random p1-vector that has mean zero and matches the large-sample covariance function of {σjn^{−1}√n(α̂j − αj0)}_{j=1}^{p1}.

Moreover, we can estimate P(N ∈ R) by the Multiplier Bootstrap.

I These results allow construction of simultaneous confidence rectangles on all target coefficients, as well as control of the family-wise error rate (FWER) in hypothesis testing.

I Rely on Gaussian Approximation Results and Multiplier Bootstrap proposed in Chernozhukov, Chetverikov, Kato (ArXiv 2012, 2013).
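The multiplier bootstrap step can be sketched as follows. Given a matrix of estimated influence values for the p1 targets, Gaussian multipliers give a critical value for the sup-t statistic; the function name, normalizations, and the data below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def sup_t_crit(xi, level=0.95, B=2000, seed=0):
    """Gaussian-multiplier bootstrap critical value for the sup-t statistic."""
    rng = np.random.default_rng(seed)
    n, p1 = xi.shape
    sd = xi.std(axis=0)                  # sigma_hat_j, one per target
    xi_c = xi - xi.mean(axis=0)          # center the influence values
    g = rng.standard_normal((B, n))      # multipliers g_i ~ N(0, 1)
    # bootstrap draws of max_j |n^{-1/2} sum_i g_i xi_ij / sigma_hat_j|
    T = np.abs((g @ xi_c) / np.sqrt(n) / sd).max(axis=1)
    return np.quantile(T, level)

# simultaneous rectangle: alpha_hat_j +/- c * sigma_hat_j / sqrt(n), j = 1..p1
xi = np.random.default_rng(3).standard_normal((200, 50))
c = sup_t_crit(xi)
print(c)   # sup-t critical value; larger than the pointwise 1.96
```

The returned c inflates the pointwise 1.96 just enough to cover all p1 coordinates simultaneously, which is what delivers the confidence rectangles and FWER control above.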

Page 24

5. Literature: Neyman-Orthogonal Scores vs. Debiasing

I ArXiv 2010-2011 – use of orthogonal scores in linear models

a. Belloni, Chernozhukov, Hansen (ArXiv, 2010a, 2010b, 2011a, 2011b): use OLS-post-Lasso methods to estimate nuisance parameters in instrumental and mean regression;

b. Zhang and Zhang (ArXiv, 2011): introduces debiasing + use of Lasso methods to estimate nuisance parameters in mean regression;

I ArXiv 2013-2014 – non-linear models

c. Belloni, Chernozhukov, Kato (ArXiv, 2013), Belloni, Chernozhukov, Wang (ArXiv, 2013);

d. Javanmard and Montanari (ArXiv, 2013a,b); van de Geer and co-authors (ArXiv, 2013);

e. Han Liu and co-authors (ArXiv 2014)

I [b,d] introduce de-biasing of an initial estimator α̂. We can interpret "debiased" estimators as Bickel's "one-step" correction of an initial estimator in Z-problems with Neyman-orthogonal scores. They are first-order equivalent to our estimators.

Page 25

Conclusion

[Figures: finite-sample distributions of the Z-estimator, Without Orthogonalization (left) vs. With Orthogonalization (right)]

Page 26

Bibliography

I Sparse Models and Methods for Optimal Instruments with an Application to Eminent Domain, Alexandre Belloni, Daniel Chen, Victor Chernozhukov, Christian Hansen (arXiv 2010, Econometrica)

I LASSO Methods for Gaussian Instrumental Variables Models, Alexandre Belloni, Victor Chernozhukov, Christian Hansen (arXiv 2010)

I Inference for High-Dimensional Sparse Econometric Models, Alexandre Belloni, Victor Chernozhukov, Christian Hansen (arXiv 2011, World Congress of Econometric Society, 2010)

I Confidence Intervals for Low-Dimensional Parameters in High-Dimensional Linear Models, Cun-Hui Zhang, Stephanie S. Zhang (arXiv 2011, JRSS(b))

I Inference on Treatment Effects After Selection Amongst High-Dimensional Controls, Alexandre Belloni, Victor Chernozhukov, Christian Hansen (arXiv 2011, Review of Economic Studies)

Page 27

Bibliography

I Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors, Victor Chernozhukov, Denis Chetverikov, Kengo Kato (arXiv 2012, Annals of Statistics)

I Comparison and anti-concentration bounds for maxima of Gaussian random vectors, Victor Chernozhukov, Denis Chetverikov, Kengo Kato (arXiv 2013, Probability Theory and Related Fields)

I Uniform Post Selection Inference for LAD Regression and Other Z-estimation problems, Alexandre Belloni, Victor Chernozhukov, Kengo Kato (arXiv 2013, Oberwolfach 2012, Biometrika)

I On asymptotically optimal confidence regions and tests for high-dimensional models, Sara van de Geer, Peter Bühlmann, Ya'acov Ritov, Ruben Dezeure (arXiv 2013, Annals of Statistics)

Page 28

Bibliography

I Honest Confidence Regions for a Regression Parameter in Logistic Regression with a Large Number of Controls, Alexandre Belloni, Victor Chernozhukov, Ying Wei (ArXiv 2013)

I Valid Post-Selection Inference in High-Dimensional Approximately Sparse Quantile Regression Models, Alexandre Belloni, Victor Chernozhukov, Kengo Kato (ArXiv 2013)

I Confidence Intervals and Hypothesis Testing for High-Dimensional Regression, Adel Javanmard and Andrea Montanari (arXiv 2013, J. Mach. Learn. Res.)

I Pivotal estimation via square-root Lasso in nonparametric regression, Alexandre Belloni, Victor Chernozhukov, Lie Wang (arXiv 2013, Annals of Statistics)

Page 29

Bibliography

I Program Evaluation with High-Dimensional Data, Alexandre Belloni, Victor Chernozhukov, Ivan Fernandez-Val, Chris Hansen (arXiv 2013)

I A General Framework for Robust Testing and Confidence Regions in High-Dimensional Quantile Regression, Tianqi Zhao, Mladen Kolar and Han Liu (arXiv 2014)

I A General Theory of Hypothesis Tests and Confidence Regions for Sparse High Dimensional Models, Yang Ning and Han Liu (arXiv 2014)

I Valid Post-Selection and Post-Regularization Inference: An Elementary, General Approach, Victor Chernozhukov, Christian Hansen, Martin Spindler (Annual Review of Economics 2015)
