Attention and feature transfer based knowledge distillation
Existing knowledge distillation (KD) methods are mainly based on features, logits, or attention, where features and logits represent the results of reasoning at different stages of a convolutional neural network, and attention maps symbolize the reasoning process. Because of the continuity of the two...
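The abstract is truncated in this record, but the taxonomy it names (feature-, logit-, and attention-based KD) is standard. As a minimal sketch, assuming a PyTorch setting and the classic attention-transfer formulation (Zagoruyko & Komodakis) rather than this paper's exact method, the three signals can be combined as below; the temperature `T` and weight `beta` are illustrative hyperparameters, not values from the paper.

```python
# Sketch of combined logit + attention-map distillation (not this paper's
# exact method): the student matches the teacher's softened logits and its
# spatial attention maps derived from intermediate features.
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    """Collapse a feature map (N, C, H, W) to a normalized spatial
    attention map (N, H*W) by summing squared activations over channels."""
    att = feat.pow(2).sum(dim=1).flatten(1)   # (N, H*W)
    return F.normalize(att, p=2, dim=1)       # unit L2 norm per sample

def kd_loss(student_logits, teacher_logits,
            student_feats, teacher_feats,
            T: float = 4.0, beta: float = 1000.0) -> torch.Tensor:
    """KL divergence on temperature-softened logits plus MSE between
    attention maps; the feature lists pair same-resolution stages."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    att = sum(F.mse_loss(attention_map(s), attention_map(t))
              for s, t in zip(student_feats, teacher_feats))
    return soft + beta * att
```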
Published in | Scientific Reports, Vol. 13, No. 1, Article 18369 (10 pages)
---|---
Main Authors |
Format | Journal Article
Language | English
Published | London: Nature Publishing Group UK (Nature Portfolio), 26.10.2023
Subjects |