
Robust artificial neural networks and outlier detection. Technical report

Authors :
Beliakov, Gleb
Kelarev, Andrei
Yearwood, John
Publication Year :
2011

Abstract

Large outliers break down linear and nonlinear regression models. Robust regression methods allow one to filter out the outliers when building a model. By replacing the traditional least squares criterion with the least trimmed squares criterion, in which half of the data is treated as potential outliers, one can fit accurate regression models to strongly contaminated data. High-breakdown methods are well established in linear regression but have only recently been applied to nonlinear regression. In this work, we examine the problem of fitting artificial neural networks (ANNs) to contaminated data using the least trimmed squares criterion. We introduce a penalized least trimmed squares criterion that prevents unnecessary removal of valid data. Training ANNs under this criterion leads to a challenging non-smooth global optimization problem. We compare the efficiency of several derivative-free optimization methods in solving it and show that our approach identifies the outliers correctly when ANNs are used for nonlinear regression.
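The least trimmed squares idea summarized in the abstract can be illustrated with a minimal sketch. The one-hidden-layer tanh network, the function name lts_loss, the toy contaminated data, and the choice of scipy's Nelder-Mead solver are all illustrative assumptions, not details taken from the paper; the penalized variant of the criterion mentioned above adds a penalty term that the abstract does not specify, so it is not reproduced here.

```python
# Sketch: fit a small neural network by minimizing the least trimmed squares
# (LTS) criterion with a derivative-free method. Architecture, data, and the
# Nelder-Mead solver are assumptions for illustration only.
import numpy as np
from scipy.optimize import minimize

def lts_loss(params, X, y, h, hidden=5):
    """Sum of the h smallest squared residuals of a one-hidden-layer network."""
    n_in = X.shape[1]
    k = n_in * hidden
    W1 = params[:k].reshape(n_in, hidden)        # input -> hidden weights
    b1 = params[k:k + hidden]                    # hidden biases
    w2 = params[k + hidden:k + 2 * hidden]       # hidden -> output weights
    b2 = params[-1]                              # output bias
    pred = np.tanh(X @ W1 + b1) @ w2 + b2
    sq_res = np.sort((y - pred) ** 2)
    return float(np.sum(sq_res[:h]))             # trim the largest residuals

# Toy data: a noisy sine curve with 20% gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
y[:20] += 10.0                                   # contaminate with outliers

hidden = 5
n_params = X.shape[1] * hidden + hidden + hidden + 1
h = len(y) // 2                                  # keep only the best half of the data
res = minimize(lts_loss, rng.normal(scale=0.5, size=n_params),
               args=(X, y, h, hidden), method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-8})
print("trimmed loss:", res.fun)
```

Because only the h smallest squared residuals enter the objective, the sorting step makes the loss non-smooth, which is why a derivative-free optimizer is used here rather than gradient-based training.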

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1110.0169
Document Type :
Working Paper
Full Text :
https://doi.org/10.1080/02331934.2012.674946