Error-dependent smoothing rules in local linear regression

Date

2002

Authors

Cheng, M.-Y.
Hall, Peter

Publisher

Academia Sinica

Abstract

We suggest an adaptive, error-dependent smoothing method for reducing the variance of local-linear curve estimators. It involves weighting the bandwidth used at the ith datum in proportion to a power of the absolute value of the ith residual. We show that the optimal power is 2/3. Arguing in this way, we prove that asymptotic variance can be reduced by 24% in the case of Normal errors, and by 35% for double-exponential errors. These results might appear to violate Jianqing Fan's bounds on performance of local-linear methods, but note that our approach to smoothing produces nonlinear estimators. In the case of Normal errors, our estimator has slightly better mean squared error performance than that suggested by Fan's minimax bound, calculated by him over all estimators, not just linear ones. However, these improvements are available only for single functions, not uniformly over Fan's function class. Even greater improvements in performance are achievable for error distributions with heavier tails. For symmetric error distributions the method has no first-order effect on bias, and existing bias-reduction techniques may be used in conjunction with error-dependent smoothing. In the case of asymmetric error distributions an overall reduction in mean squared error is achievable, involving a trade-off between bias and variance contributions. However, in this setting, the technique is relatively complex and probably not practically feasible.
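The adaptive rule described in the abstract can be sketched as a two-pass procedure: a pilot local linear fit with a fixed bandwidth supplies residuals, and the bandwidth at the i-th datum is then taken proportional to |residual_i|^(2/3). This is a minimal illustrative sketch only; the function names, Gaussian kernel, pilot bandwidth, normalisation, and the clipping guard are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate at x0 with a Gaussian kernel; h may be a
    scalar or a per-datum array of bandwidths."""
    h = np.broadcast_to(np.asarray(h, dtype=float), x.shape)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2) / h      # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear design
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]                                  # intercept = fit at x0

def error_dependent_fit(x, y, h0, grid):
    # Pass 1: pilot fit with a constant bandwidth to obtain residuals.
    pilot = np.array([local_linear(xi, x, y, h0) for xi in x])
    resid = y - pilot
    # Pass 2: per-datum bandwidths proportional to |residual|^(2/3),
    # normalised so the average bandwidth stays near h0. The clipping is
    # a practical guard against degenerate bandwidths, not part of the
    # paper's theory.
    hi = np.abs(resid) ** (2.0 / 3.0)
    hi = h0 * hi / np.mean(hi)
    hi = np.clip(hi, 0.25 * h0, 4.0 * h0)
    return np.array([local_linear(g, x, y, hi) for g in grid])
```

The pilot fit can use any consistent smoother; only the second pass uses the error-dependent bandwidths, which makes the final estimator nonlinear in the data, consistent with the abstract's remark about Fan's bounds.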

Keywords

Bandwidth; Kernel method; Nonparametric regression; Tail weight; Variance reduction

Source

Statistica Sinica

Type

Journal article
