
A Unifying Framework for Typical Multi-Task Multiple Kernel Learning Problems

Authors:
Li, Cong
Georgiopoulos, Michael
Anagnostopoulos, Georgios C.
Publication Year:
2014

Abstract

Over the past few years, Multi-Kernel Learning (MKL) has received significant attention as a data-driven feature selection technique in the context of kernel-based learning. MKL formulations have been devised and solved for a broad spectrum of machine learning problems, including Multi-Task Learning (MTL). Solving different MKL formulations usually involves designing algorithms that are tailored to the problem at hand, which is typically a non-trivial task. In this paper, we present a general Multi-Task Multi-Kernel Learning (Multi-Task MKL) framework that subsumes well-known Multi-Task MKL formulations, as well as several important MKL approaches to single-task problems. We then derive a simple algorithm that can solve the unifying framework. To demonstrate the flexibility of the proposed framework, we formulate a new learning problem, namely Partially-Shared Common Space (PSCS) Multi-Task MKL, and demonstrate its merits through experimentation.

Comment: 17 pages, 1 figure. Accepted by IEEE Transactions on Neural Networks and Learning Systems; currently published as an Early Access Article.
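For readers unfamiliar with the setup, the standard single-task MKL formulation (generic background, not the paper's specific framework; the symbols below are illustrative, not taken from the paper) learns a conic combination of M base kernels jointly with the predictor:

k_{\theta}(x, x') = \sum_{m=1}^{M} \theta_m \, k_m(x, x'), \qquad \theta_m \ge 0, \quad \|\theta\|_1 \le 1,

\min_{\theta} \; \min_{f \in \mathcal{H}_{k_\theta}} \; \frac{1}{2} \|f\|_{\mathcal{H}_{k_\theta}}^{2} + C \sum_{i=1}^{N} \ell\big(y_i, f(x_i)\big).

A Multi-Task MKL problem couples several such per-task problems, for instance by sharing some or all of the kernel weights \theta across tasks; the abstract states that the proposed framework subsumes well-known formulations of this kind.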

Subjects:
Computer Science - Learning

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1401.5136
Document Type:
Working Paper