A Generalized Iterative Scaling Algorithm for Maximum Entropy Model Computations Respecting Probabilistic Independencies

Bibliographic Details
Published in: Foundations of Information and Knowledge Systems, pp. 379–399
Main Authors: Wilhelm, Marco; Kern-Isberner, Gabriele; Finthammer, Marc; Beierle, Christoph
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Lecture Notes in Computer Science

More Information
Summary: Maximum entropy distributions serve as favorable models for commonsense reasoning based on probabilistic conditional knowledge bases. Computing these distributions requires solving high-dimensional convex optimization problems, especially if the conditionals are composed of first-order formulas. In this paper, we propose a highly optimized variant of generalized iterative scaling for computing maximum entropy distributions. As a novel feature, our improved algorithm is able to take probabilistic independencies into account that are established by the principle of maximum entropy. This allows for exploiting the logical information given by the knowledge base, represented as weighted conditional impact systems, in a very condensed way.
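
The summary refers to generalized iterative scaling (GIS) as the basis for computing maximum entropy distributions. The chapter's optimized variant, which exploits maximum-entropy-induced independencies and weighted conditional impact systems, is not reproduced here; the sketch below shows only classical GIS over a finite set of possible worlds, so readers can see the baseline the chapter improves upon. The function name, the slack-feature construction, and the toy constraint are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

def gis(features, targets, num_iters=1000, tol=1e-10):
    """Minimal sketch of classical generalized iterative scaling.

    features: (num_worlds, num_constraints) nonnegative feature matrix f_i(omega)
    targets:  desired expectations of each feature under the MaxEnt model
              (assumed strictly positive and jointly satisfiable)
    Returns the fitted distribution over the finite set of worlds.
    """
    # GIS assumes the feature values of every world sum to the same constant C;
    # a standard trick is to add a slack feature that pads each row up to C.
    C = features.sum(axis=1).max()
    slack = C - features.sum(axis=1, keepdims=True)
    F = np.hstack([features, slack])
    t = np.append(targets, C - targets.sum())

    lam = np.zeros(F.shape[1])                # Lagrange multipliers
    for _ in range(num_iters):
        # Current MaxEnt model: P(omega) proportional to exp(sum_i lam_i f_i(omega))
        logp = F @ lam
        logp -= logp.max()                    # numerical stability
        p = np.exp(logp)
        p /= p.sum()

        # Expected feature values under the current model
        expect = F.T @ p

        # Multiplicative GIS step: lam_i += (1/C) * log(target_i / expect_i)
        update = np.log(t / expect) / C
        lam += update
        if np.abs(update).max() < tol:
            break
    return p

# Hypothetical toy usage: three worlds, one constraint fixing the probability
# of the first two worlds (jointly) at 0.8; the result is [0.4, 0.4, 0.2].
p = gis(np.array([[1.0], [1.0], [0.0]]), np.array([0.8]))
```

In this naive form the distribution is represented world by world, which grows exponentially with the signature; the chapter's contribution lies in condensing this representation via the independencies established by the maximum entropy principle.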
ISBN: 9783319900490; 3319900498
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-90050-6_21