
Generalized Convolution Spectral Mixture for Multitask Gaussian Processes

Authors :
Jinsong Chen
Elena Marchiori
Twan van Laarhoven
Perry Groot
Kai Chen
Source :
IEEE Transactions on Neural Networks and Learning Systems, 31, 12, pp. 5613-5623
Publication Year :
2020

Abstract

Multitask Gaussian processes (MTGPs) are a powerful approach for modeling dependencies between multiple related tasks or functions for joint regression. Current kernels for MTGPs cannot fully model nonlinear task correlations and other types of dependencies. In this article, we address this limitation. We focus on spectral mixture (SM) kernels and propose an enhancement of this type of kernel, called the multitask generalized convolution SM (MT-GCSM) kernel. The MT-GCSM kernel can model nonlinear task correlations and dependence between components, including time and phase delay dependence. Each task in MT-GCSM has its own GCSM kernel with its own number of convolution structures, and dependencies between all components from different tasks are considered. Another constraint of current kernels for MTGPs is that components from different tasks must be aligned. Here, we lift this constraint by using inner and outer full cross convolution between a base component and the reversed complex conjugate of another base component. Extensive experiments on two synthetic and three real-life data sets illustrate the difference between MT-GCSM and previous SM kernels as well as the practical effectiveness of MT-GCSM.
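For context, the single-task spectral mixture (SM) kernel that MT-GCSM generalizes is commonly written as k(tau) = sum_q w_q exp(-2 pi^2 tau^2 v_q) cos(2 pi mu_q tau). The Python sketch below implements only this standard one-dimensional SM form as an illustrative assumption; it does not reproduce the MT-GCSM cross-convolution construction, and all parameter values are hypothetical.

import numpy as np

def sm_kernel(x1, x2, weights, means, variances):
    # Standard 1-D spectral mixture kernel:
    # k(tau) = sum_q w_q * exp(-2 * pi^2 * tau^2 * v_q) * cos(2 * pi * mu_q * tau)
    tau = x1[:, None] - x2[None, :]  # pairwise lags, shape (len(x1), len(x2))
    k = np.zeros_like(tau, dtype=float)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * mu * tau)
    return k

# Example: a two-component mixture evaluated on a small grid (hypothetical parameters).
x = np.linspace(0.0, 1.0, 5)
K = sm_kernel(x, x, weights=[1.0, 0.5], means=[0.5, 2.0], variances=[0.1, 0.05])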

Details

ISSN :
2162-237X
Volume :
31
Database :
OpenAIRE
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Accession number :
edsair.doi.dedup.....8fa00f49194cc8790a4bc0338c5eef66