
Multi-Task Learning for Sparsity Pattern Heterogeneity: Statistical and Computational Perspectives

Authors:
Behdin, Kayhan
Loewinger, Gabriel
Kishida, Kenneth T.
Parmigiani, Giovanni
Mazumder, Rahul
Publication Year:
2022

Abstract

We consider a problem in Multi-Task Learning (MTL) where multiple linear models are jointly trained on a collection of datasets ("tasks"). A key novelty of our framework is that it allows the sparsity pattern of regression coefficients and the values of non-zero coefficients to differ across tasks while still leveraging partially shared structure. Our methods encourage models to share information across tasks by separately encouraging (1) coefficient supports and/or (2) nonzero coefficient values to be similar. This allows models to borrow strength during variable selection even when non-zero coefficient values differ across tasks. We propose a novel mixed-integer programming formulation for our estimator. We develop custom scalable algorithms based on block coordinate descent and combinatorial local search to obtain high-quality approximate solutions, and we propose a novel exact optimization algorithm to obtain globally optimal solutions. We investigate the theoretical properties of our estimators and formally show how they leverage shared support information across tasks to achieve better variable selection performance. We evaluate our methods in simulations and two biomedical applications; the proposed approaches appear to outperform other sparse MTL methods in variable selection and prediction accuracy. We provide the sMTL package on CRAN.
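To make the support-sharing idea concrete, here is a minimal, hypothetical sketch in the spirit of the block coordinate descent described above. It is not the paper's estimator or the sMTL package API: each task runs iterative hard thresholding, and the heuristic score for keeping a coordinate is boosted (by a made-up weight `alpha`) when other tasks currently include that coordinate in their support, so tasks borrow strength during variable selection.

```python
import numpy as np


def shared_support_iht(Xs, ys, s, alpha=0.5, n_iter=100):
    """Illustrative block coordinate descent for sparse MTL.

    Hypothetical sketch (not the paper's exact algorithm): cycle over
    tasks; for each task take a gradient step on its least-squares loss,
    then hard-threshold to the `s` coordinates with the largest score,
    where the score adds `alpha` times the fraction of other tasks whose
    current support contains that coordinate.
    """
    K = len(Xs)                       # number of tasks
    p = Xs[0].shape[1]                # shared feature dimension
    B = np.zeros((K, p))              # one coefficient vector per task
    for _ in range(n_iter):
        # How often each feature is active across current task supports.
        support_freq = (np.abs(B) > 0).mean(axis=0)
        for k in range(K):
            X, y = Xs[k], ys[k]
            step = 1.0 / np.linalg.norm(X, 2) ** 2   # safe step size
            b = B[k] - step * (X.T @ (X @ B[k] - y))  # gradient step
            # Magnitude plus a bonus for coordinates other tasks use.
            score = np.abs(b) + alpha * support_freq
            keep = np.argsort(score)[-s:]             # top-s coordinates
            new_b = np.zeros(p)
            new_b[keep] = b[keep]                     # hard threshold
            B[k] = new_b
    return B
```

On noise-free synthetic tasks with overlapping true supports, this heuristic typically recovers the shared support; the actual estimator in the paper is defined through a mixed-integer program and refined by combinatorial local search, which this sketch does not attempt.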

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2212.08697
Document Type:
Working Paper