
A sparse expansion for deep Gaussian processes.

Authors :
Ding, Liang
Tuo, Rui
Shahrampour, Shahin
Source :
IISE Transactions. May 2024, Vol. 56 Issue 5, p559-572. 14p.
Publication Year :
2024

Abstract

In this work, we use Deep Gaussian Processes (DGPs) as statistical surrogates for stochastic processes with complex distributions. Conventional inferential methods for DGP models can suffer from high computational complexity, as they require large-scale operations with kernel matrices for training and inference. We propose a scheme for accurate inference and efficient training based on a class of Gaussian Processes called Tensor Markov Gaussian Processes (TMGPs). We construct an induced approximation of a TMGP, referred to as the hierarchical expansion. Next, we develop a deep TMGP (DTMGP) model as the composition of multiple hierarchical expansions of TMGPs. The proposed DTMGP model has the following properties: (i) the outputs of each activation function are deterministic, while the weights are chosen independently from the standard Gaussian distribution; (ii) in training or prediction, only O(polylog(M)) (out of M) activation functions have non-zero outputs, which significantly boosts the computational efficiency. Our numerical experiments on synthetic models and real datasets show the superior computational efficiency of DTMGP over existing DGP models. [ABSTRACT FROM AUTHOR]
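To make the sparsity property (ii) concrete, below is a minimal illustrative sketch, not the authors' implementation: it uses a 1-D hierarchical hat-function basis as a stand-in for the hierarchical expansion of a TMGP. The basis has M = 2^L - 1 deterministic activation functions arranged in L levels, and for any input at most one function per level is non-zero, so only O(log M) of the M activations fire; the weights are drawn i.i.d. from a standard Gaussian, mirroring property (i). The function names `hierarchical_hat_features` and `sparse_layer` are hypothetical and chosen for the example only.

```python
# Illustrative sketch only (assumed basis, not the paper's construction):
# a single "sparse expansion" layer with deterministic activations and
# i.i.d. standard-Gaussian weights, where only O(log M) of M activations
# are non-zero for any given input.
import numpy as np

def hierarchical_hat_features(x, levels):
    """Deterministic hat-function activations; at most `levels` of the
    M = 2**levels - 1 outputs are non-zero for any x in [0, 1]."""
    feats = []
    for l in range(1, levels + 1):
        n = 2 ** (l - 1)                  # number of hats on this level
        h = 1.0 / n                       # spacing of hat centers
        centers = (np.arange(n) + 0.5) * h
        # hat function: 1 at its center, 0 outside [center-h/2, center+h/2]
        vals = np.maximum(0.0, 1.0 - np.abs(x - centers) / (h / 2))
        feats.append(vals)
    return np.concatenate(feats)          # length M = 2**levels - 1

def sparse_layer(x, levels, rng):
    """One layer: deterministic sparse activations times N(0,1) weights."""
    phi = hierarchical_hat_features(x, levels)
    w = rng.standard_normal(phi.size)     # weights ~ N(0, 1), independent
    return float(phi @ w)                 # only O(log M) terms contribute

rng = np.random.default_rng(0)
levels = 10                               # M = 2**10 - 1 = 1023 activations
x = 0.37
phi = hierarchical_hat_features(x, levels)
print("non-zero activations:", np.count_nonzero(phi), "out of", phi.size)
print("layer output:", sparse_layer(x, levels, rng))
```

Running the sketch shows roughly 10 non-zero activations out of 1023, which is the polylogarithmic sparsity the abstract credits for the computational savings; a DTMGP-style model would compose several such layers.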

Details

Language :
English
ISSN :
24725854
Volume :
56
Issue :
5
Database :
Academic Search Index
Journal :
IISE Transactions
Publication Type :
Academic Journal
Accession number :
175670395
Full Text :
https://doi.org/10.1080/24725854.2023.2210629