1. Chance-Constrained Active Inference
- Author
- Ismail Senoz, Thijs van de Laar, Henk Wymeersch, Ayca Ozcelikkale (Bayesian Intelligent Autonomous Systems; Signal Processing Systems)
- Subjects
- FOS: Computer and information sciences, Computer Science - Machine Learning (cs.LG), Statistics - Machine Learning (stat.ML), Computer Science - Neural and Evolutionary Computing (cs.NE), Cognitive Neuroscience, Message passing, Inference, Bayes theorem, Constraint (information theory), Generative model, Graphical model, Random variable, Probability
- Abstract
Active inference (ActInf) is an emerging theory that explains perception and action in biological agents in terms of minimizing a free energy bound on Bayesian surprise. Goal-directed behavior is elicited by introducing prior beliefs on the underlying generative model. In contrast to prior beliefs, which constrain all realizations of a random variable, we propose an alternative approach through chance constraints, which allow for a (typically small) probability of constraint violation, and we demonstrate how such constraints can serve as intrinsic drivers of goal-directed behavior in ActInf. First, we illustrate how chance-constrained ActInf weights all imposed (prior) constraints on the generative model, allowing, for example, a trade-off between robust control and empirical chance-constraint violation. Second, we interpret the proposed solution within a message passing framework. Interestingly, the message passing interpretation is relevant not only in the context of ActInf but also as a general-purpose approach for handling chance constraints on graphical models. The resulting chance-constraint message updates can be readily combined with other pre-derived message update rules, without the need for custom derivations. The proposed chance-constrained message passing framework thus accelerates the search for workable models in general and can be used to complement message-passing formulations on generative neural models.
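The distinction the abstract draws between hard prior constraints and chance constraints can be illustrated with a minimal Monte Carlo sketch. This is not the paper's method; it only shows the generic idea that a chance constraint requires a "safe" event to fail with probability at most some tolerance epsilon, rather than never. All names, the Gaussian belief, and the threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chance constraint (illustrative): the latent state x should stay in the
# safe region x <= threshold, except with probability at most epsilon.
# A hard prior constraint would instead forbid every violating realization.
epsilon = 0.05      # allowed probability of constraint violation
threshold = 1.0     # boundary of the hypothetical safe region

# A Gaussian belief over the latent state, assumed only for this sketch.
mean, std = 0.0, 0.5
samples = rng.normal(mean, std, size=100_000)

# Empirical violation rate under the current belief.
violation_rate = float(np.mean(samples > threshold))

# The chance constraint holds when the violation rate is below epsilon.
satisfied = violation_rate <= epsilon
print(f"violation rate: {violation_rate:.4f}, satisfied: {satisfied}")
```

Here the threshold sits two standard deviations above the mean, so the violation rate is small but nonzero; a chance constraint accepts this belief, whereas a hard constraint on every realization would not.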
- Published
- 2021