Linear and conic programming estimators in high dimensional errors-in-variables models

Published

Journal Article

© 2016 Royal Statistical Society. We consider the linear regression model with observation error in the design. In this setting, we allow the number of covariates to be much larger than the sample size. Several new estimation methods have recently been introduced for this model. Indeed, the standard lasso estimator or Dantzig selector turns out to be unreliable when only noisy regressors are available, which is quite common in practice. In this work, we propose and analyse a new estimator for the errors-in-variables model. Under suitable sparsity assumptions, we show that this estimator attains the minimax efficiency bound. Importantly, this estimator can be written as a second-order cone programming minimization problem, which can be solved numerically in polynomial time. Finally, we show that the procedure introduced by Rosenbaum and Tsybakov, which is almost optimal in a minimax sense, can be efficiently computed as a single linear programming problem despite non-convexities.
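
To make the computational claim concrete, the following is a minimal sketch, assuming Python with numpy and cvxpy, of how a Dantzig-selector-type estimator for a noisy design can be posed as a second-order cone program. It illustrates the general approach only, not the exact estimator analysed in the paper; the tuning constants mu and tau and the simulated data are hypothetical placeholders.

    # Illustrative sketch only: a Dantzig-selector-type estimator for a noisy
    # design, posed as a second-order cone program (SOCP) and solved with cvxpy.
    # This is NOT the exact estimator analysed by Belloni, Rosenbaum and
    # Tsybakov; mu, tau and the simulated data are hypothetical placeholders.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, p, s = 100, 500, 5                       # sample size, dimension, sparsity
    theta_true = np.zeros(p)
    theta_true[:s] = 1.0
    X = rng.standard_normal((n, p))             # true (unobserved) design
    Z = X + 0.1 * rng.standard_normal((n, p))   # observed design with measurement error
    y = X @ theta_true + 0.5 * rng.standard_normal(n)

    mu, tau = 0.05, 0.1                         # tuning constants (placeholder values)

    theta = cp.Variable(p)
    t = cp.Variable(nonneg=True)                # bound on the Euclidean norm of theta

    score = Z.T @ (y - Z @ theta) / n           # correlation of residuals with regressors
    constraints = [
        cp.norm(score, "inf") <= mu * t + tau,  # Dantzig-type constraint relaxed for design noise
        cp.norm(theta, 2) <= t,                 # second-order cone constraint
    ]
    problem = cp.Problem(cp.Minimize(cp.norm(theta, 1) + mu * t), constraints)
    problem.solve()

    print("estimated support:", np.flatnonzero(np.abs(theta.value) > 1e-3))

Because every constraint is either a linear inequality or a second-order cone constraint, off-the-shelf conic solvers handle the problem in polynomial time, which is the point of the second-order cone formulation highlighted in the abstract.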

Cited Authors

  • Belloni, A; Rosenbaum, M; Tsybakov, AB

Published Date

  • June 1, 2017

Published In

  • Journal of the Royal Statistical Society: Series B (Statistical Methodology)

Volume / Issue

  • 79 / 3

Start / End Page

  • 939 - 956

Electronic International Standard Serial Number (EISSN)

  • 1467-9868

International Standard Serial Number (ISSN)

  • 1369-7412

Digital Object Identifier (DOI)

  • 10.1111/rssb.12196

Citation Source

  • Scopus