Attention and feature transfer based knowledge distillation

Existing knowledge distillation (KD) methods are mainly based on features, logits, or attention, where features and logits represent the outputs of a convolutional neural network at different stages of inference, and attention maps characterize the reasoning process itself. Because of the continuity of the two...
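For context, here is a minimal PyTorch sketch (not the authors' method; all function names, the temperature T, and the toy tensor shapes are illustrative assumptions) of the two distillation signals the abstract names: a logit-based KD term that matches temperature-softened class distributions, and an attention-transfer term that matches normalized spatial attention maps derived from intermediate CNN features.

```python
# Illustrative sketch of logit-based KD and attention transfer;
# names (logit_kd_loss, attention_map, T) are hypothetical, not from the paper.
import torch
import torch.nn.functional as F

def logit_kd_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style KD: KL divergence between temperature-softened distributions."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

def attention_map(feature):
    """Spatial attention map of a CNN feature tensor (N, C, H, W):
    channel-wise sum of squared activations, L2-normalized per sample."""
    a = feature.pow(2).sum(dim=1).flatten(1)  # (N, H*W)
    return F.normalize(a, dim=1)

def attention_transfer_loss(student_feat, teacher_feat):
    """Mean squared distance between student and teacher attention maps
    (assumes matching spatial resolution; channel counts may differ)."""
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()

# Toy usage with random tensors standing in for real network outputs.
s_logits, t_logits = torch.randn(8, 10), torch.randn(8, 10)
s_feat, t_feat = torch.randn(8, 64, 14, 14), torch.randn(8, 128, 14, 14)
loss = logit_kd_loss(s_logits, t_logits) + attention_transfer_loss(s_feat, t_feat)
```

Note that the attention map collapses the channel dimension, so teacher and student features need only agree in spatial size, which is what makes attention a convenient transfer target between networks of different widths.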


Bibliographic Details
Published in: Scientific Reports, Vol. 13, No. 1, Article 18369 (10 pages)
Main Authors: Yang, Guoliang; Yu, Shuaiying; Sheng, Yangyang; Yang, Hao
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 26 October 2023
