
Approximate Bayesian Inference.

Authors :
Alquier, Pierre

Abstract

Summary: Extremely popular for statistical inference, Bayesian methods are also becoming popular in machine learning and artificial intelligence problems. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis-Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many modern models in statistics are simply too complex to use such methodologies. In machine learning, the volume of the data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require exact knowledge of the posterior. This has motivated the development of a new generation of algorithms that are fast enough to handle huge datasets but that often target an approximation of the posterior. This book gathers 18 research papers written by Approximate Bayesian Inference specialists and provides an overview of recent advances in these algorithms. This includes optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). The theoretical aspects of Approximate Bayesian Inference are covered, specifically PAC-Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented.
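To illustrate the contrast drawn in the summary between algorithms that target the exact posterior and algorithms that target an approximation of it, the following minimal sketch (not taken from the book; the toy Gaussian model, function names, and tolerance value are illustrative assumptions) compares a random-walk Metropolis-Hastings sampler with an ABC rejection sampler based on a summary statistic.

import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative): infer the mean mu of a N(mu, 1) model
# from 50 observations, under a N(0, 10) prior.
data = rng.normal(1.5, 1.0, size=50)

def log_posterior(mu):
    # log prior + log likelihood, up to an additive constant
    return -mu**2 / (2 * 10.0) - 0.5 * np.sum((data - mu) ** 2)

def metropolis_hastings(n_iter=5000, step=0.5):
    # Random-walk Metropolis-Hastings: targets the exact posterior.
    mu, samples = 0.0, []
    for _ in range(n_iter):
        prop = mu + step * rng.normal()
        if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(mu):
            mu = prop
        samples.append(mu)
    return np.array(samples)

def abc_rejection(n_sim=20000, eps=0.05):
    # ABC rejection: targets an approximate posterior by keeping prior draws
    # whose simulated data have a sample mean close to the observed one.
    mus = rng.normal(0.0, np.sqrt(10.0), size=n_sim)
    sims = rng.normal(mus[:, None], 1.0, size=(n_sim, data.size))
    keep = np.abs(sims.mean(axis=1) - data.mean()) < eps
    return mus[keep]

print("MH posterior mean  :", metropolis_hastings()[1000:].mean())
print("ABC posterior mean :", abc_rejection().mean())

Shrinking the tolerance eps moves the ABC output closer to the exact posterior at the cost of accepting fewer simulations, which is the speed/accuracy trade-off the summary describes.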

Subjects :
Research & information: general
Mathematics & science
bifurcation
dynamical systems
Edwards-Sokal coupling
mean-field
Kullback-Leibler divergence
variational inference
Bayesian statistics
machine learning
variational approximations
PAC-Bayes
expectation-propagation
Markov chain Monte Carlo
Langevin Monte Carlo
sequential Monte Carlo
Laplace approximations
approximate Bayesian computation
Gibbs posterior
MCMC
stochastic gradients
neural networks
Approximate Bayesian Computation
differential evolution
Markov kernels
discrete state space
ergodicity
Markov chain
probably approximately correct
variational Bayes
Bayesian inference
Markov Chain Monte Carlo
Sequential Monte Carlo
Riemann Manifold Hamiltonian Monte Carlo
integrated nested Laplace approximation
fixed-form variational Bayes
stochastic volatility
network modeling
network variability
Stiefel manifold
MCMC-SAEM
data imputation
Bethe free energy
factor graphs
message passing
variational free energy
variational message passing
approximate Bayesian computation (ABC)
differential privacy (DP)
sparse vector technique (SVT)
Gaussian
particle flow
variable flow
Langevin dynamics
Hamiltonian Monte Carlo
non-reversible dynamics
control variates
thinning
meta-learning
hyperparameters
priors
online learning
online optimization
gradient descent
statistical learning theory
PAC-Bayes theory
deep learning
generalisation bounds
Bayesian sampling
Monte Carlo integration
no free lunch theorems
sequential learning
principal curves
data streams
regret bounds
greedy algorithm
sleeping experts
entropy
robustness
statistical mechanics
complex systems

Details

Language :
English
ISBN :
books978-3-0365-3790-0
9783036537894
9783036537900
ISBNs :
9783036537894 and 9783036537900
Database :
Jio Institute Digital Library Catalog
Journal :
Approximate Bayesian Inference
Notes :
004703, New Energy, Open Access: Unrestricted online access, Creative Commons Attribution 4.0 (https://creativecommons.org/licenses/by/4.0/), English
Publication Type :
Book
Accession number :
jlc.oai.folio.org.fs00001072.365317f9.d4d1.4180.b8bb.b94754ad0ce5
Document Type :
Book; Electronic document