Online Parameter Estimation and Convergence Property of Dynamic Bayesian Networks

Bibliographic Details
Published in: International Journal of Fuzzy Logic and Intelligent Systems (IJFIS), Vol. 7, No. 4, pp. 285-294
Main Authors: Cho, Hyun-Cheol; Fadali, M. Sami; Lee, Kwon-Soon
Format: Journal Article
Language: Korean
Published: 2007

More Information
Summary: In this paper, we investigate a novel online estimation algorithm for dynamic Bayesian network (DBN) parameters, given as conditional probabilities. We sequentially update the parameter adjustment rule based on observation data. We apply our algorithm to two well-known representations of DBNs: a first-order Markov Chain (MC) model and a Hidden Markov Model (HMM). A sliding window allows efficient adaptive computation in real time. We also examine the stochastic convergence and stability of the learning algorithm.
Bibliography: KISTI1.1003/JNL.JAKO200706717285372
ISSN: 1598-2645, 2093-744X
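
The summary describes the method only at the level of its ingredients (conditional-probability parameters, a sequential adjustment rule, a sliding window), so the following Python sketch illustrates that general idea rather than the authors' actual algorithm: it re-estimates the transition probabilities of a first-order Markov chain after each new observation, using only the transitions that fall inside a sliding window. The window length, the add-one smoothing, and the function and variable names are assumptions made here for the example.

from collections import deque

import numpy as np


def sliding_window_mc_estimate(observations, n_states, window=50):
    """Yield an n_states x n_states transition-probability matrix after each
    observation, estimated only from transitions inside the sliding window.
    (Illustrative sketch; not the update rule from the paper.)"""
    recent = deque(maxlen=window)      # most recent (previous, current) state pairs
    prev = None
    for x in observations:
        if prev is not None:
            recent.append((prev, x))   # oldest transition drops out automatically
        prev = x
        counts = np.ones((n_states, n_states))   # add-one smoothing (an assumption)
        for i, j in recent:
            counts[i, j] += 1
        yield counts / counts.sum(axis=1, keepdims=True)   # normalize rows to probabilities


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq = rng.integers(0, 3, size=200)             # synthetic 3-state observation stream
    P = None
    for P in sliding_window_mc_estimate(seq, n_states=3, window=50):
        pass                                       # in practice, use each P as it is produced
    print(P)                                       # estimate after the final observation

Recomputing the counts from the whole window at every step keeps the sketch short; an incremental version that adds the newest transition and subtracts the one leaving the window would better reflect the real-time efficiency the summary attributes to the sliding-window scheme.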