Robust Hierarchical-Optimization RLS Against Sparse Outliers.

Authors :
Slavakis, Konstantinos
Banerjee, Sinjini
Source :
IEEE Signal Processing Letters; Jan 2020, Vol. 27, p171-175, 5p
Publication Year :
2020

Abstract

This letter fortifies the recently introduced hierarchical-optimization recursive least squares (HO-RLS) against outliers that infrequently contaminate linear-regression models. Outliers are modeled as nuisance variables and are estimated together with the linear filter/system variables via a sparsity-inducing (non-)convexly regularized least-squares task. The proposed outlier-robust HO-RLS builds on steepest-descent directions with a constant step size (learning rate), requires no matrix inversion (or matrix-inversion lemma), accommodates colored nominal noise of known correlation matrix, exhibits a small computational footprint, and offers theoretical guarantees, in a probabilistic sense, for the convergence of the system estimates to the solutions of a hierarchical-optimization problem: minimize a convex loss, which models a priori knowledge about the unknown system, over the set of minimizers of the classical ensemble-LS loss. Extensive numerical tests on synthetically generated data in both stationary and non-stationary scenarios showcase notable improvements of the proposed scheme over state-of-the-art techniques. [ABSTRACT FROM AUTHOR]
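
To make the sparsity-regularized least-squares idea in the abstract concrete, the following minimal sketch estimates the outliers as nuisance variables alongside the filter, alternating a constant-step gradient update on the filter with a soft-thresholding (l1-proximal) update on the outlier estimate. This is an illustrative batch formulation, not the paper's HO-RLS recursion: the names (robust_ls_sketch, mu, lam) and the choice of an l1 regularizer with proximal-gradient steps are assumptions made here for demonstration only.

import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1: shrinks entries toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def robust_ls_sketch(X, y, mu=0.002, lam=0.5, n_iters=3000):
    # Illustrative sketch (not the paper's method): fit y ~ X @ h + o + v,
    # with o sparse (the outliers) and v nominal noise, by minimizing
    # 0.5 * ||X @ h + o - y||^2 + lam * ||o||_1 via proximal-gradient
    # steps at a constant step size mu.
    n, p = X.shape
    h = np.zeros(p)   # filter/system estimate
    o = np.zeros(n)   # sparse outlier estimate (nuisance variables)
    for _ in range(n_iters):
        r = X @ h + o - y                         # current residual
        h = h - mu * (X.T @ r)                    # gradient step on h; no matrix inversion
        o = soft_threshold(o - mu * r, mu * lam)  # prox-gradient step on the l1 term
    return h, o

# Usage: recover a 5-tap system from data hit by a few large outliers.
rng = np.random.default_rng(0)
n, p = 200, 5
h_true = rng.standard_normal(p)
X = rng.standard_normal((n, p))
o_true = np.zeros(n)
o_true[rng.choice(n, size=10, replace=False)] = 20.0  # sparse, large outliers
y = X @ h_true + o_true + 0.1 * rng.standard_normal(n)
h_hat, o_hat = robust_ls_sketch(X, y)
print("filter error:", np.linalg.norm(h_hat - h_true))

The step size mu is kept below the reciprocal of the quadratic term's Lipschitz constant so the constant-step iteration converges; the soft-thresholding step drives non-outlier entries of o to exactly zero, which is what makes the outlier estimate sparse.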

Details

Language :
English
ISSN :
1070-9908
Volume :
27
Database :
Complementary Index
Journal :
IEEE Signal Processing Letters
Publication Type :
Academic Journal
Accession number :
141802115
Full Text :
https://doi.org/10.1109/LSP.2019.2963188