More Synergy, Less Redundancy: Exploiting Joint Mutual Information for Self-Supervised Learning
Self-supervised learning (SSL) is now a serious competitor to supervised learning, even though it does not require data annotation. Several baselines have attempted to make SSL models exploit information about the data distribution and be less dependent on the augmentation effect. However, there is no cl...
| Published in | 2023 IEEE International Conference on Image Processing (ICIP), pp. 1390 - 1394 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 08.10.2023 |