Radial-Basis-Function Networks


Bibliographic Details
Published in: Intelligent Systems, pp. 7-1 – 7-12
Main Authors: Eide, Åge J.; Lindblad, Thomas; Paillet, Guy
Format: Book Chapter
Language: English
Published: United Kingdom: CRC Press, Taylor & Francis Group, 2011
Edition: 2

Summary: Neural networks are characterized by massively parallel architectures and by the use of a learning paradigm rather than explicit programming. The radial basis function (RBF) represents one such processing method. The approximation by a sum of RBFs can also be interpreted as a rather simple single-layer artificial neural network, with the RBFs taking on the role of the activation functions. The best-known learning paradigm for feed-forward neural networks is backpropagation. Generally speaking, very few implementations of neural networks have been done outside the university world, where such implementations have been quite popular. In very general terms, the approach is to map an N-dimensional space by prototypes. The RMS brightness jitter, as well as the position jitter in pixels, turned out to be as good as or better than that of most conventional star trackers of the same complexity.
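The interpretation described in the summary — a weighted sum of radial basis functions acting as a single-layer network, with the RBFs playing the role of activation functions — can be sketched briefly. The Gaussian kernel, the uniformly placed prototype centers, the width, and the target function below are illustrative assumptions, not details taken from the chapter:

```python
import numpy as np

def rbf_design_matrix(x, centers, width):
    """Column j holds the Gaussian RBF exp(-(x - c_j)^2 / (2*width^2))."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# Sample an (assumed) target function on [0, 1].
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x)

# Place prototype centers uniformly over the input space, then solve for
# the output-layer weights by linear least squares. This is the "single
# layer" view: the RBF responses are the hidden activations, and the
# approximation is their weighted sum.
centers = np.linspace(0.0, 1.0, 10)
Phi = rbf_design_matrix(x, centers, width=0.1)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

approx = Phi @ w
print(np.max(np.abs(approx - y)))  # residual on the training points
```

Because the basis functions are fixed, only the output weights are learned, and that step reduces to a linear problem — one reason RBF networks are often contrasted with backpropagation-trained feed-forward networks, where all layers are adapted iteratively.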
ISBN: 1439802831; 9781439802830; 1138071889; 9781138071889
DOI: 10.1201/9781315218427-7