A primer on Variational Laplace (VL).
- Author
-
Zeidman, Peter, Friston, Karl, and Parr, Thomas
- Subjects
- BAYESIAN field theory; INTEGRATED software; MACHINE learning; BRAIN imaging; MATHEMATICS
- Abstract
Highlights:
- Variational Laplace (VL) is a scheme for Bayesian modelling.
- VL is widely used in neuroimaging, in particular Dynamic Causal Modelling (DCM).
- This paper provides a tutorial explanation of the mathematics and algorithms.
- New standalone code is provided to enable re-implementation.
- The supplementary materials provide worked derivations.

This article details a scheme for approximate Bayesian inference that has underpinned thousands of neuroimaging studies since its introduction 15 years ago. Variational Laplace (VL) provides a generic approach to fitting linear or non-linear models, which may be static or dynamic, returning a posterior probability density over the model parameters and an approximation of the log model evidence, which enables Bayesian model comparison. VL applies variational Bayesian inference in conjunction with quadratic or Laplace approximations of the evidence lower bound (free energy). Importantly, update equations do not need to be derived for each model under consideration, providing a general method for fitting a broad class of models. This primer is intended for experimenters and modellers who may wish to fit models to data using variational Bayesian methods, without assuming previous experience of variational Bayes or machine learning. Accompanying code demonstrates how to fit different kinds of model using the reference implementation of the VL scheme in the open-source Statistical Parametric Mapping (SPM) software package. In addition, we provide a standalone software function that does not require SPM, in order to ease translation to other fields, together with detailed pseudocode. Finally, the supplementary materials provide worked derivations of the key equations. [ABSTRACT FROM AUTHOR]
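To make the abstract's central idea concrete, the following is a minimal sketch (not the SPM implementation, and not taken from the paper) of the ingredients VL builds on: a Gaussian (Laplace) approximation to the posterior over parameters, and a free-energy-style approximation to the log model evidence that can be used for Bayesian model comparison. A linear-Gaussian model is used here because, for this conjugate case, the Laplace approximation is exact and so serves as a sanity check; the function name and priors are illustrative assumptions.

```python
import numpy as np

def laplace_fit(y, X, noise_var=1.0, prior_var=10.0):
    """Illustrative Laplace approximation for a linear-Gaussian model.

    Model: y = X @ theta + noise,  noise ~ N(0, noise_var * I),
    with prior theta ~ N(0, prior_var * I). Because the log joint is
    quadratic in theta, the Gaussian (Laplace) posterior and the
    Laplace estimate of the log evidence are exact in this case.
    """
    n, p = X.shape
    # Negative Hessian of the log joint = posterior precision
    precision = X.T @ X / noise_var + np.eye(p) / prior_var
    cov = np.linalg.inv(precision)
    # Posterior mode (equals the mean, since the log joint is quadratic)
    mu = cov @ (X.T @ y / noise_var)
    # Log joint density evaluated at the mode
    resid = y - X @ mu
    log_joint = (-0.5 * resid @ resid / noise_var
                 - 0.5 * n * np.log(2 * np.pi * noise_var)
                 - 0.5 * mu @ mu / prior_var
                 - 0.5 * p * np.log(2 * np.pi * prior_var))
    # Laplace estimate of the log evidence:
    # log joint at the mode plus the Gaussian volume term
    log_evidence = log_joint + 0.5 * np.linalg.slogdet(2 * np.pi * cov)[1]
    return mu, cov, log_evidence
```

Comparing `log_evidence` between two candidate models gives a log Bayes factor, which is the basis of the Bayesian model comparison mentioned in the abstract. The full VL scheme extends this idea to non-linear and dynamic models by iteratively re-linearising and ascending the free energy, which the paper's pseudocode details.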
- Published
- 2023