Relational Knowledge Distillation

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expressed as a form of training the student to mimic output activations of individual data examples represented by the teacher. We introduce a novel approach, dubbed relational knowledge distillation (RKD), that transfers mutual relations of data examples instead. For concrete realizations of RKD, we propose distance-wise and angle-wise distillation losses that penalize structural differences in relations. Experiments on different tasks show that the proposed method improves the trained student models by a significant margin. In particular, for metric learning it allows students to outperform their teachers, achieving state-of-the-art results on standard benchmark datasets.
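Based on the abstract alone, a minimal sketch of what the distance-wise relational loss could look like in PyTorch follows; the function names (`pairwise_distances`, `rkd_distance_loss`) and the exact normalization details are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of a distance-wise relational distillation loss.
# Names and normalization choices are assumptions inferred from the abstract.
import torch
import torch.nn.functional as F


def pairwise_distances(emb: torch.Tensor) -> torch.Tensor:
    """Euclidean distances between all pairs of rows in a (batch, dim) tensor."""
    return torch.cdist(emb, emb, p=2)


def rkd_distance_loss(teacher_emb: torch.Tensor,
                      student_emb: torch.Tensor) -> torch.Tensor:
    """Penalize differences between teacher and student pairwise-distance structures."""
    with torch.no_grad():
        t_dist = pairwise_distances(teacher_emb)
        # Normalize by the mean off-diagonal distance so scales are comparable.
        t_dist = t_dist / t_dist[t_dist > 0].mean()
    s_dist = pairwise_distances(student_emb)
    s_dist = s_dist / s_dist[s_dist > 0].mean()
    # Huber (smooth L1) loss on the relational structures, not on raw activations.
    return F.smooth_l1_loss(s_dist, t_dist)


# Usage: because only relations are matched, the embedding dimensions
# of teacher and student need not agree.
teacher_emb = torch.randn(32, 512)                       # e.g. teacher features
student_emb = torch.randn(32, 128, requires_grad=True)   # smaller student features
loss = rkd_distance_loss(teacher_emb, student_emb)
loss.backward()
```

Note that comparing pairwise distances rather than individual outputs is what lets a 128-dimensional student learn from a 512-dimensional teacher in this sketch: the loss only requires that the two batches induce similar relational geometry.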

Bibliographic Details
Published in: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3962-3971
Main Authors: Park, Wonpyo; Kim, Dongju; Lu, Yan; Cho, Minsu
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2019