
Kernel Knockoffs Selection for Nonparametric Additive Models

Authors :
Dai, Xiaowu
Lyu, Xiang
Li, Lexin
Publication Year :
2021

Abstract

Thanks to its fine balance between model flexibility and interpretability, the nonparametric additive model has been widely used, and variable selection for this type of model has been frequently studied. However, none of the existing solutions can control the false discovery rate (FDR) unless the sample size tends to infinity. The knockoff framework is a recent proposal that can address this issue, but few knockoff solutions are directly applicable to nonparametric models. In this article, we propose a novel kernel knockoffs selection procedure for the nonparametric additive model. We integrate three key components: the knockoffs, subsampling for stability, and random feature mapping for nonparametric function approximation. We show that the proposed method is guaranteed to control the FDR for any sample size, and achieves a power that approaches one as the sample size tends to infinity. We demonstrate the efficacy of our method through extensive simulations and comparisons with alternative solutions. Our proposal thus makes useful contributions to the methodology of nonparametric variable selection, FDR-based inference, and knockoffs.
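The random feature mapping mentioned above can be illustrated with random Fourier features (Rahimi and Rechtʼs construction), which approximate a shift-invariant kernel by an explicit finite-dimensional feature map so that each additive component can be fit with ordinary linear methods. This is a generic hedged sketch of that idea, not the authors' implementation; the feature dimension `D` and bandwidth `sigma` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(x, D=5000, sigma=1.0, rng=rng):
    """Map scalar inputs x (shape (n,)) to D random Fourier features
    whose inner products approximate the Gaussian kernel
    k(x, x') = exp(-(x - x')^2 / (2 * sigma^2))."""
    w = rng.normal(0.0, 1.0 / sigma, size=D)   # random frequencies
    b = rng.uniform(0.0, 2 * np.pi, size=D)    # random phases
    return np.sqrt(2.0 / D) * np.cos(np.outer(x, w) + b)

# Check the approximation against the exact Gaussian kernel matrix.
x = rng.uniform(-1.0, 1.0, size=50)
Z = random_fourier_features(x)                 # (50, D) feature matrix
K_approx = Z @ Z.T
K_exact = np.exp(-(x[:, None] - x[None, :]) ** 2 / 2.0)
max_err = np.abs(K_approx - K_exact).max()
```

In a procedure of this kind, each covariate (and its knockoff copy) would be expanded through such a feature map, reducing the nonparametric additive fit to a group-structured linear regression; the approximation error shrinks at rate O(1/sqrt(D)).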

Subjects :
Statistics - Methodology

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2105.11659
Document Type :
Working Paper