IDEAS/RePEc search
- Kock, Anders Bredahl & Callot, Laurent (2015): Oracle inequalities for high dimensional vector autoregressions
This paper establishes non-asymptotic oracle inequalities for the prediction error and estimation accuracy of the LASSO in stationary vector autoregressive models. These inequalities are used to establish consistency of the LASSO even when the number of parameters is of a much larger order of magnitude than the sample size.
RePEc:eee:econom:v:186:y:2015:i:2:p:325-344
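As a concrete illustration of the estimator these inequalities concern, here is a minimal sketch of equation-by-equation Lasso estimation of a VAR(1). The simulated data, lag length, and penalty level are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: equation-by-equation Lasso estimation of a VAR(1).
# The transition matrix, noise level, and penalty are illustrative only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
T, k = 200, 10                       # sample size and number of series
A = np.diag(np.full(k, 0.5))         # sparse, stable true transition matrix
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + rng.normal(scale=0.1, size=k)

Y_lag, Y_lead = Y[:-1], Y[1:]        # lagged regressors and targets
A_hat = np.vstack([
    Lasso(alpha=0.01, fit_intercept=False).fit(Y_lag, Y_lead[:, i]).coef_
    for i in range(k)                # row i estimates equation i
])
print("nonzeros per equation:", (np.abs(A_hat) > 1e-8).sum(axis=1))
```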
- van der Vaart, Aad W. & Dudoit, Sandrine & van der Laan, Mark J. (2006): Oracle inequalities for multi-fold cross validation
We derive bounds that show that the risk of the resulting procedure is (up to a constant) smaller than the risk of an oracle plus an error which typically grows logarithmically with the number of estimators in the class.
RePEc:bpj:strimo:v:24:y:2006:i:3:p:21:n:3
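Schematically, an oracle inequality of this kind bounds the risk of the cross-validated selector by the best risk in the class plus a logarithmic remainder. The constants and the exact form of the remainder below are illustrative placeholders, not the paper's statement:

\[
\mathbb{E}\, R(\hat{k}) \;\le\; C \min_{1 \le k \le N} \mathbb{E}\, R(k) \;+\; C' \, \frac{\log N}{n},
\]

where \(N\) is the number of candidate estimators in the class and \(n\) is the sample size.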
- Kock, Anders Bredahl & Callot, Laurent A.F. (2012): Oracle Inequalities for High Dimensional Vector Autoregressions
This paper establishes non-asymptotic oracle inequalities for the prediction error and estimation accuracy of the LASSO in stationary vector autoregressive models. These inequalities are used to establish consistency of the LASSO even when the number of parameters is of a much larger order of magnitude than the sample size. ... Some maximal inequalities for vector autoregressions which might be of independent interest are contained in the appendix.
RePEc:aah:create:2012-16
- Luu, Tung Duy & Fadili, Jalal & Chesneau, Christophe (2020): Sharp oracle inequalities for low-complexity priors
More precisely, we show that these two estimators satisfy sharp oracle inequalities for prediction ensuring their good theoretical performances. ... When the noise is random, we provide oracle inequalities in probability using concentration inequalities.
RePEc:spr:aistmt:v:72:y:2020:i:2:d:10.1007_s10463-018-0693-6
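"Sharp" refers to the leading constant in front of the oracle risk being exactly one. In illustrative notation (design matrix \(X\), true coefficient vector \(\beta_0\), penalty level \(\lambda\), remainder term \(\Delta\), none of which are taken from the paper), a sharp oracle inequality for prediction has the schematic form

\[
\frac{1}{n}\,\|X\hat{\beta} - X\beta_0\|_2^2 \;\le\; \inf_{\beta} \left\{ \frac{1}{n}\,\|X\beta - X\beta_0\|_2^2 \;+\; \Delta(\beta, \lambda) \right\},
\]

whereas a non-sharp inequality multiplies the infimum by a constant \(C > 1\).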
- Koike, Yuta & Tanoue, Yuta (2019): Oracle inequalities for sign constrained generalized linear models
Recent studies on this subject have shown that, in the case of linear regression, sign constraints alone could be as efficient as the oracle method if the design matrix enjoys a suitable assumption in addition to a traditional compatibility condition.
RePEc:eee:ecosta:v:11:y:2019:i:c:p:145-157
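The linear-regression case the abstract alludes to is sign-constrained least squares. A minimal sketch using SciPy's non-negative least squares solver follows; the design, sparsity pattern, and noise level are illustrative assumptions.

```python
# Sketch of sign-constrained least squares: coefficients are restricted
# to be nonnegative, with no explicit penalty term. Data are simulated.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [1.0, 0.5, 2.0]                  # sparse, nonnegative truth
y = X @ beta + rng.normal(scale=0.1, size=n)

beta_hat, residual_norm = nnls(X, y)        # solves min ||Xb - y|| s.t. b >= 0
print("estimated support:", np.nonzero(beta_hat > 1e-8)[0])
```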
- Pchelintsev, E. A. & Pergamenshchikov, S. M. (2018): Oracle inequalities for the stochastic differential equations
We present a general model selection method and sharp oracle inequalities that provide robust efficient estimation in the adaptive setting.
RePEc:spr:sistpr:v:21:y:2018:i:2:d:10.1007_s11203-018-9180-1
- Caner, Mehmet & Kock, Anders Bredahl (2013): Oracle Inequalities for Convex Loss Functions with Non-Linear Targets
Using the elastic net penalty, we establish a finite sample oracle inequality which bounds the loss of our estimator from above with high probability. If the unknown target is linear, this inequality also provides an upper bound of the estimation error of the estimated parameter vector. ... Next, we use the non-asymptotic results to show that the excess loss of our estimator is asymptotically of the same order as that of the oracle.
RePEc:aah:create:2013-51
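A minimal sketch of elastic-net estimation with scikit-learn. The paper treats general convex loss functions; the squared-error loss used by ElasticNet below is only the special (linear-target) case, and all data are simulated for illustration.

```python
# Sketch of elastic-net estimation: an l1 + l2 penalized least squares fit.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(2)
n, p = 100, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 1.0                                # sparse linear target
y = X @ beta + rng.normal(scale=0.5, size=n)

# l1_ratio interpolates between ridge (0) and the Lasso (1).
model = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X, y)
print("nonzero coefficients:", np.count_nonzero(model.coef_))
```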
- Kock, Anders Bredahl (2013): Oracle inequalities for high-dimensional panel data models
This paper is concerned with high-dimensional panel data models where the number of regressors can be much larger than the sample size. Under the assumption that the true parameter vector is sparse we establish finite sample upper bounds on the estimation error of the Lasso under two different sets of conditions on the covariates as well as the error terms. Upper bounds on the estimation error of the unobserved heterogeneity are also provided under the assumption of sparsity. Next, we show that our upper bounds are essentially optimal in the sense that they can only be improved by multiplicative constants. These results are then used to show that the Lasso can be consistent in even very large models where the number of regressors increases at an exponential rate in the sample size.
RePEc:aah:create:2013-20
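A hedged sketch of Lasso estimation in a fixed-effects panel. The paper derives bounds for a Lasso that also handles the unobserved heterogeneity directly; the sketch below instead removes the fixed effects with the standard within transformation before penalizing, purely for illustration, and all quantities are simulated.

```python
# Sketch: within transformation to remove fixed effects, then the Lasso.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
N, T, p = 30, 10, 40                       # units, periods, regressors
X = rng.normal(size=(N, T, p))
alpha_i = rng.normal(size=(N, 1))          # unobserved heterogeneity
beta = np.zeros(p)
beta[:4] = 1.0                             # sparse coefficient vector
y = X @ beta + alpha_i + rng.normal(scale=0.1, size=(N, T))

# Demeaning within each unit removes the fixed effects before penalization.
Xd = (X - X.mean(axis=1, keepdims=True)).reshape(N * T, p)
yd = (y - y.mean(axis=1, keepdims=True)).reshape(N * T)

fit = Lasso(alpha=0.05, fit_intercept=False).fit(Xd, yd)
print("selected regressors:", np.nonzero(fit.coef_)[0])
```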
- Efromovich, Sam (2004): Oracle Inequalities for Efromovich–Pinsker Blockwise Estimates
The oracle inequality is a relatively new statistical tool for the analysis of nonparametric adaptive estimates. An oracle is a good pseudo-estimate based on both the data and the underlying estimated curve. An oracle inequality shows how well an adaptive estimator mimics the oracle for a particular underlying curve. The most advanced oracle inequalities have recently been obtained by Cavalier and Tsybakov (2001) for Stein-type blockwise estimates used in filtering a signal from a stationary white Gaussian process.
RePEc:spr:metcap:v:6:y:2004:i:3:d:10.1023_b:mcap.0000026562.80429.48
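A minimal sketch of a Stein-type blockwise shrinkage rule of the kind the abstract describes: empirical coefficients are grouped into blocks, and each block is shrunk toward zero by an estimated signal-to-noise factor. The block sizes, noise level, and shrinkage rule are illustrative assumptions, not the Efromovich–Pinsker construction itself.

```python
# Sketch of Stein-type blockwise shrinkage of noisy coefficients.
import numpy as np

rng = np.random.default_rng(4)
n, sigma = 256, 0.5
theta = 5.0 / (1 + np.arange(n))           # decaying "true" coefficients
y = theta + sigma * rng.normal(size=n)     # noisy empirical coefficients

blocks = np.array_split(np.arange(n), 16)  # 16 equal-length blocks
theta_hat = np.zeros(n)
for idx in blocks:
    energy = np.sum(y[idx] ** 2)
    noise = sigma ** 2 * len(idx)
    shrink = max(0.0, 1.0 - noise / energy)  # Stein-type factor, clipped at 0
    theta_hat[idx] = shrink * y[idx]

print("risk of blockwise estimate:", np.mean((theta_hat - theta) ** 2))
print("risk of raw coefficients:  ", np.mean((y - theta) ** 2))
```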
- Caner, Mehmet & Kock, Anders Bredahl (2016): Oracle Inequalities for Convex Loss Functions with Nonlinear Targets
Using the elastic net penalty, of which the Least Absolute Shrinkage and Selection Operator (Lasso) is a special case, we establish a finite sample oracle inequality which bounds the loss of our estimator from above with high probability. If the unknown target is linear, this inequality also provides an upper bound of the estimation error of the estimated parameter vector. Next, we use the non-asymptotic results to show that the excess loss of our estimator is asymptotically of the same order as that of the oracle.
RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1377-1411