Optimization method and device of WTA attention mechanism neural network model
Main Authors | |
---|---|
Format | Patent |
Language | Chinese; English |
Published | 27.12.2022 |
Subjects | |
Summary | The invention discloses an optimization method and device for a WTA (winner-take-all) attention mechanism neural network model. The method comprises the steps of: when an image training data set is received, performing bidirectional generative adversarial training on it to generate corresponding target training sample data and obtain network construction parameters; constructing an initial WTA attention mechanism network model from the network construction parameters and training it to produce a target WTA attention mechanism network model; when target image data are received, extracting a corresponding input feature map and feeding it into the target WTA attention mechanism network model to generate a WTA path graph; and performing feedback optimization on the target WTA attention mechanism network model using the WTA path graph. This addresses the technical problems of poor interpretability and limited performance gains when the neural network is optimized by existing methods. |
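The abstract does not disclose the network architecture or the exact form of the WTA path graph, but the winner-take-all idea it builds on can be illustrated with a minimal sketch. The function below (a hypothetical helper, not taken from the patent) applies spatial winner-take-all to a feature map: per channel, only the top-k activations survive, and the winning positions are recorded as a simple stand-in for a "WTA path".

```python
import numpy as np

def wta_attention(feature_map, k=1):
    """Winner-take-all over the spatial dimensions of a (C, H, W) feature map.

    Per channel, only the k largest activations are kept; all others are
    zeroed. The (row, col) indices of the winners are returned as a crude
    'WTA path' recording which positions carried the signal.
    """
    c, h, w = feature_map.shape
    flat = feature_map.reshape(c, h * w)
    # Indices of the top-k activations in each channel.
    winners = np.argpartition(flat, -k, axis=1)[:, -k:]
    mask = np.zeros_like(flat)
    np.put_along_axis(mask, winners, 1.0, axis=1)
    out = (flat * mask).reshape(c, h, w)
    # WTA path: winning (row, col) per channel.
    path = [tuple(divmod(int(i), w)) for ch in winners for i in ch]
    return out, path

# Example: a 1-channel 2x2 map; only the largest activation survives.
fmap = np.array([[[0.1, 0.9], [0.3, 0.2]]])
out, path = wta_attention(fmap, k=1)
# out[0] -> [[0.0, 0.9], [0.0, 0.0]], path -> [(0, 1)]
```

In the patented method the path graph is then fed back to optimize the trained model; how that feedback is applied is not specified in this record.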
Bibliography: | Application Number: CN202211399604 |