Extensions and enhancements of decoupled extended Kalman filter training

Bibliographic Details
Published in: Proceedings of International Conference on Neural Networks (ICNN'97), Vol. 3, pp. 1879-1883
Main Authors: Puskorius, G.V.; Feldkamp, L.A.
Format: Conference Proceeding
Language: English
Published: IEEE, 1997

Summary: We describe here three useful and practical extensions and enhancements of the decoupled extended Kalman filter (DEKF) neural network weight update procedure, which has served as the backbone for much of our applications-oriented research for the last six years. First, we provide a mechanism that constrains weight values to a pre-specified range during training to allow for fixed-point deployment of trained networks. Second, we examine modifications of DEKF training for alternative cost functions; as an example, we show how to use DEKF training to minimize a measure of relative entropy, rather than mean squared error, for pattern classification problems. Third, we describe an approximation of DEKF training that allows a multiple-output training problem to be treated with single-output training complexity.
ISBN: 0780341228; 9780780341227
DOI: 10.1109/ICNN.1997.614185