TanhExp: A smooth activation function with high convergence speed for lightweight neural networks


Bibliographic Details
Published in: IET Computer Vision, Vol. 15, no. 2, pp. 136–150
Main Authors: Liu, Xinyu; Di, Xiaoguang
Format: Journal Article
Language: English
Published: Stevenage: John Wiley & Sons, Inc. (Wiley), 01.03.2021

Summary: Lightweight or mobile neural networks used for real-time computer vision tasks contain fewer parameters than standard networks, which leads to constrained performance. Herein, a novel activation function named the Tanh Exponential Activation Function (TanhExp) is proposed, which can significantly improve the performance of these networks on image classification tasks. TanhExp is defined as f(x) = x · tanh(e^x). The simplicity, efficiency, and robustness of TanhExp are demonstrated on various datasets and network models, and TanhExp outperforms its counterparts in both convergence speed and accuracy. Its behaviour also remains stable when noise is added or the dataset is altered. It is shown that, without increasing the size of the network, the capacity of lightweight neural networks can be enhanced by TanhExp with only a few training epochs and no extra parameters added.
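The definition in the summary, f(x) = x · tanh(e^x), can be sketched directly. The snippet below is a minimal illustration of that formula only; the overflow guard for large x (where tanh(e^x) saturates to 1, so f(x) ≈ x) is an implementation assumption, not part of the paper's definition, and the threshold value is arbitrary.

```python
import math

def tanhexp(x: float) -> float:
    """TanhExp activation as defined in the abstract: f(x) = x * tanh(exp(x))."""
    # Assumed guard: for large x, exp(x) overflows a float, but tanh has
    # already saturated to 1.0, so f(x) is numerically equal to x.
    if x > 20.0:
        return x
    return x * math.tanh(math.exp(x))
```

For example, tanhexp(0.0) is exactly 0, and for large positive inputs the function is close to the identity, which is consistent with the smooth, unbounded-above shape described for this family of activations.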
ISSN: 1751-9632, 1751-9640
DOI: 10.1049/cvi2.12020