Optimization method and device of WTA attention mechanism neural network model

Bibliographic Details
Main Authors: LIU DANBING, LI DI, HU XINXIN, HUANG JIAMING, WANG QIANG, ZENG XIAO, CHEN LIN, MIAO HAO, LIU JINHU, CHEN ZHAOWEI, XU XUEDAN, YI YINXIANG, WANG YADI
Format: Patent
Language: Chinese, English
Published: 27.12.2022
Summary: The invention discloses an optimization method and device for a WTA (winner-take-all) attention mechanism neural network model. The method comprises the steps of: when an image training data set is received, performing bidirectional generative adversarial training on the data set to generate corresponding target training sample data and obtain network construction parameters; constructing an initial WTA attention mechanism network model from the network construction parameters and training it to generate a target WTA attention mechanism network model; when target image data are received, extracting a corresponding input feature map, feeding it into the target WTA attention mechanism network model, and generating a WTA path graph; and performing feedback optimization of the target WTA attention mechanism network model using the WTA path graph. This addresses the technical problems of poor interpretability and of no obvious improvement in neural network performance when the neural network is optimized.
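The summary describes a winner-take-all routing step in which each query position selects a single winning key, and the resulting WTA path graph is then used for feedback optimization. The patent's exact formulation is not available in this record; the sketch below is only a minimal NumPy illustration of the WTA selection idea, in which `wta_attention` and all array shapes are illustrative assumptions, not the patented method:

```python
import numpy as np

def wta_attention(queries, keys, values):
    """Winner-take-all attention sketch: each query attends to only
    its single highest-scoring key, instead of a soft weighted sum.

    Hypothetical illustration; the patent's actual formulation is
    not disclosed in this bibliographic record.
    """
    scores = queries @ keys.T          # (n_q, n_k) similarity scores
    winners = scores.argmax(axis=1)    # WTA step: one winning key per query
    out = values[winners]              # route each winner's value through
    return out, winners                # winners could form a "WTA path"

# Toy shapes for demonstration only.
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))        # 4 query positions, dim 8
k = rng.standard_normal((6, 8))        # 6 key positions
v = rng.standard_normal((6, 8))        # values aligned with keys
out, path = wta_attention(q, k, v)
print(out.shape, path.shape)           # (4, 8) (4,)
```

Collecting the `winners` indices across layers would yield a discrete record of which units fired at each stage, which is one plausible reading of the "WTA path graph" the abstract says is used for feedback optimization and interpretability.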
Bibliography:Application Number: CN202211399604