Chance-Constrained Active Inference

Bibliographic Details
Published in: Neural Computation, Vol. 33, No. 10, pp. 2710-2735
Main Authors: van de Laar, Thijs; Şenöz, İsmail; Özçelikkale, Ayça; Wymeersch, Henk
Format: Journal Article
Language: English
Published: MIT Press, Cambridge, MA, USA, 16 September 2021

Summary: Active inference (ActInf) is an emerging theory that explains perception and action in biological agents in terms of minimizing a free energy bound on Bayesian surprise. Goal-directed behavior is elicited by introducing prior beliefs on the underlying generative model. In contrast to prior beliefs, which constrain all realizations of a random variable, we propose an alternative approach through chance constraints, which allow for a (typically small) probability of constraint violation, and demonstrate how such constraints can be used as intrinsic drivers for goal-directed behavior in ActInf. We illustrate how chance-constrained ActInf weights all imposed (prior) constraints on the generative model, allowing, for example, for a trade-off between robust control and empirical chance constraint violation. Second, we interpret the proposed solution within a message passing framework. Interestingly, the message passing interpretation is not only relevant to the context of ActInf, but also provides a general-purpose approach that can account for chance constraints on graphical models. The chance constraint message updates can then be readily combined with other prederived message update rules without the need for custom derivations. The proposed chance-constrained message passing framework thus accelerates the search for workable models in general and can be used to complement message-passing formulations on generative neural models.
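
As a rough intuition for how a chance constraint differs from a hard prior constraint, the following minimal Python sketch (not taken from the paper; all variable names and numbers are illustrative assumptions) checks whether a Gaussian belief over a scalar state satisfies a constraint of the form P(x > x_max) <= epsilon, i.e., the forbidden region may carry at most a small probability mass epsilon.

# Minimal illustrative sketch, not the paper's algorithm.
# A chance constraint P(x > x_max) <= epsilon evaluated under a
# Gaussian belief N(mean, std^2); all values below are assumptions.
from scipy.stats import norm

def violates_chance_constraint(mean, std, x_max, epsilon):
    # Probability mass the belief places in the forbidden region x > x_max.
    violation_prob = norm.sf(x_max, loc=mean, scale=std)
    return violation_prob > epsilon

# Example: belief centered at 0.8 with std 0.2, forbidden region x > 1.0,
# tolerated violation probability epsilon = 0.05.
print(violates_chance_constraint(mean=0.8, std=0.2, x_max=1.0, epsilon=0.05))
# P(x > 1.0) is roughly 0.16 > 0.05, so this belief violates the chance
# constraint; in chance-constrained ActInf such a violation would be
# handled through the message passing scheme described in the paper.

In contrast, a hard prior constraint would penalize every realization with x > 1.0, whereas the chance constraint only requires that the total probability of such realizations stays below epsilon.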
Bibliography: October 2021
ISSN: 0899-7667
EISSN: 1530-888X
DOI: 10.1162/neco_a_01427