
Controlling false discoveries in high-dimensional situations: boosting with stability selection.

Authors :
Hofner, Benjamin
Boccuto, Luigi
Göker, Markus
Source :
BMC Bioinformatics. Jul 2015, Vol. 16, Issue 1, p1-17. 17p. 15 Graphs.
Publication Year :
2015

Abstract

Background: Modern biotechnologies often result in high-dimensional data sets with many more variables than observations (n ≪ p). These data sets pose new challenges to statistical analysis: variable selection becomes one of the most important tasks in this setting. Similar challenges arise in modern data sets from observational studies, e.g., in ecology, where flexible, non-linear models are fitted to high-dimensional data. We assess the recently proposed flexible framework for variable selection called stability selection. By using resampling procedures, stability selection adds finite-sample error control to high-dimensional variable selection procedures such as the Lasso or boosting. We consider the combination of boosting and stability selection and present results from a detailed simulation study that provide insights into the usefulness of this combination. The interpretation of the error bounds used is elaborated, and insights for practical data analysis are given.

Results: Stability selection with boosting was able to detect influential predictors in high-dimensional settings while controlling the given error bound in various simulation scenarios. The dependence on parameters such as the sample size, the number of truly influential variables, and the tuning parameters of the algorithm was investigated. The results were applied to investigate phenotype measurements in patients with autism spectrum disorders using a log-linear interaction model fitted by boosting. Stability selection identified five differentially expressed amino acid pathways.

Conclusion: Stability selection is implemented in the freely available R package stabs (http://CRAN.R-project.org/package=stabs). It proved to work well in high-dimensional settings with more predictors than observations, for both linear and additive models. The original version of stability selection, which controls the per-family error rate, is quite conservative; this is much less the case for its improvement, complementary pairs stability selection. Nevertheless, care should be taken to specify the error bound appropriately. [ABSTRACT FROM AUTHOR]
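The combination described in the abstract, boosting plus stability selection via the stabs and mboost packages, can be sketched in a few lines of R. The listing below is an illustrative sketch, not code from the paper: the data are simulated and all tuning values (mstop, q, PFER) are assumptions chosen for demonstration. stabsel() requires two of cutoff, q, and PFER; for the original per-family error rate version the third follows from the bound E(V) ≤ q^2 / ((2 * cutoff - 1) * p) on the expected number of falsely selected variables.

    ## Illustrative sketch only: simulated data, assumed tuning values.
    library(stabs)    # stabsel()
    library(mboost)   # glmboost() and the stabsel() method for boosting models

    set.seed(1234)
    n <- 100; p <- 500                       # high-dimensional setting: p >> n
    x <- matrix(rnorm(n * p), nrow = n,
                dimnames = list(NULL, paste0("x", 1:p)))
    beta <- c(rep(2, 5), rep(0, p - 5))      # five truly influential predictors
    y <- drop(x %*% beta) + rnorm(n)

    ## Componentwise linear model boosting (matrix interface of glmboost)
    mod <- glmboost(x, y, control = boost_control(mstop = 200))

    ## Stability selection on the boosting model: bound the per-family error
    ## rate (expected number of false positives) at 1 and allow at most
    ## q = 20 selected base-learners per subsample; the default sampling
    ## scheme is complementary pairs subsampling.
    sel <- stabsel(mod, q = 20, PFER = 1)

    print(sel)   # selected variables with their selection frequencies
    plot(sel)    # selection frequencies relative to the cutoff

Because the error bound ties cutoff, q, and PFER together, a practical workflow is to fix PFER at an acceptable number of false positives and a moderate q, then check whether the implied cutoff is sensible; this mirrors the abstract's advice to specify the error bound with care.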

Details

Language :
English
ISSN :
14712105
Volume :
16
Issue :
1
Database :
Academic Search Index
Journal :
BMC Bioinformatics
Publication Type :
Academic Journal
Accession number :
108646009
Full Text :
https://doi.org/10.1186/s12859-015-0575-3