Abstract:

Outlier detection algorithms are intimately connected with robust statistics that down‐weight some observations to zero. We define a number of outlier detection algorithms related to the Huber‐skip and least trimmed squares estimators, including the one‐step Huber‐skip estimator and the forward search. Next, we review a recently developed asymptotic theory of these algorithms. Finally, we analyse the gauge, the fraction of wrongly detected outliers, for a number of outlier detection algorithms and establish asymptotic normal and Poisson theories for the gauge.
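
To fix ideas, the following is a minimal sketch (not taken from the paper) of a one-step Huber‐skip update together with an empirical gauge, written in Python/NumPy. The function name `one_step_huber_skip`, the cutoff `c`, and the use of full-sample OLS as the initial estimator are illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

def one_step_huber_skip(y, X, beta_init, sigma_init, c=2.576):
    """One-step Huber-skip update: drop observations whose absolute residual
    under an initial fit exceeds c * sigma, then re-estimate by OLS on the
    retained observations. Returns the updated estimate and the observed
    fraction of flagged observations (an empirical analogue of the gauge)."""
    resid = y - X @ beta_init
    keep = np.abs(resid) <= c * sigma_init          # retained observations
    beta_new, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    flagged_fraction = 1.0 - keep.mean()            # share of observations cut
    return beta_new, flagged_fraction

# Illustration on clean (outlier-free) data: with normal errors the flagged
# fraction should be close to the nominal level 2 * (1 - Phi(c)).
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

beta0, *_ = np.linalg.lstsq(X, y, rcond=None)       # initial full-sample OLS fit
sigma0 = np.std(y - X @ beta0, ddof=X.shape[1])
beta1, gauge_hat = one_step_huber_skip(y, X, beta0, sigma0)
print(beta1, gauge_hat)
```

Iterating this update, or running it along an expanding subsample, gives the flavour of the forward search; the paper's asymptotic results concern the distribution of the flagged fraction in such schemes when the data contain no outliers.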

Citation:

Johansen, S. and Nielsen, B. (2015). 'Asymptotic theory of outlier detection algorithms for linear time series regression models'. Scandinavian Journal of Statistics, 43, 321–348.
