Characterization of a Class of Sigmoid Functions with Applications to Neural Networks
Published in | Neural Networks, Vol. 9, No. 5, pp. 819-835 |
---|---|
Main Authors | Menon, Anil; Mehrotra, Kishan; Mohan, Chilukuri K.; Ranka, Sanjay |
Format | Journal Article |
Language | English |
Published | Oxford: Elsevier Science Ltd, 01.07.1996 |
Summary: | We study two classes of sigmoids: the simple sigmoids, defined to be odd, asymptotically bounded, completely monotone functions in one variable, and the hyperbolic sigmoids, a proper subset of simple sigmoids and a natural generalization of the hyperbolic tangent. We obtain a complete characterization for the inverses of hyperbolic sigmoids using Euler's incomplete beta functions, and describe composition rules that illustrate how such functions may be synthesized from others. These results are applied to two problems. First we show that with respect to simple sigmoids the continuous Cohen-Grossberg-Hopfield model can be reduced to the (associated) Legendre differential equations. Second, we show that the effect of using simple sigmoids as node transfer functions in a one-hidden layer feedforward network with one summing output may be interpreted as representing the output function as a Fourier series sine transform evaluated at the hidden layer node inputs, thus extending and complementing earlier results in this area. Copyright © 1996 Elsevier Science Ltd |
ISSN | 0893-6080; 1879-2782 |
DOI | 10.1016/0893-6080(95)00107-7 |
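
The summary above refers to a one-hidden-layer feedforward network whose hidden nodes use a simple sigmoid as transfer function (the hyperbolic tangent is the prototypical hyperbolic sigmoid) and whose single output node sums the hidden activations. The sketch below only fixes that architecture for concreteness; it is not the paper's construction, and the use of NumPy, the random weights, and the node count are illustrative assumptions.

```python
# Minimal sketch (not the paper's construction): a one-hidden-layer network
# y(x) = sum_k c_k * tanh(w_k * x + b_k), with tanh playing the role of a
# simple sigmoid (odd, bounded) node transfer function and a single
# summing output node. All parameter values below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_hidden = 8                       # number of hidden nodes (assumed)
w = rng.normal(size=n_hidden)      # input-to-hidden weights
b = rng.normal(size=n_hidden)      # hidden-node biases
c = rng.normal(size=n_hidden)      # hidden-to-output weights

def network(x):
    """One-hidden-layer net with tanh hidden units and a summing output."""
    x = np.atleast_1d(x)                   # shape (n_samples,)
    hidden = np.tanh(np.outer(x, w) + b)   # shape (n_samples, n_hidden)
    return hidden @ c                      # summed output, shape (n_samples,)

# Example evaluation on a few inputs.
xs = np.linspace(-2.0, 2.0, 5)
print(network(xs))
```

In the interpretation described in the summary, the quantities w_k * x + b_k arriving at the hidden nodes are the points at which a Fourier sine-series representation of the output function is evaluated; the code above merely makes concrete the architecture that statement refers to.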