Supervised Contrastive Learning with Multiple Positive Examples

Bibliographic Details
Main Authors: Krishnan, Dilip, Sarna, Aaron Yehuda, Teterwak, Piotr, Liu, Ce, Tian, Yonglong, Maschinot, Aaron Joseph, Isola, Philip John, Wang, Chen, Khosla, Prannay
Format: Patent
Language: English
Published: 18.05.2023

More Information
Summary: The present disclosure provides an improved training methodology that enables supervised contrastive learning to be performed simultaneously across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.
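The core idea described in the summary, treating every same-label example in a batch as a positive for the contrastive loss rather than only augmented views, can be sketched as follows. This is a minimal NumPy illustration of a SupCon-style loss, not the claimed implementation; the function name, temperature value, and positive-averaging choice are assumptions for the sketch.

```python
import numpy as np

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Sketch of a supervised contrastive (SupCon-style) loss.

    features: (N, D) embeddings for one batch
    labels:   (N,) integer class labels
    Every same-label pair in the batch is treated as a positive,
    so each anchor can have multiple positives simultaneously.
    """
    # L2-normalize so similarities are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature  # pairwise scaled similarities

    n = len(labels)
    mask_self = np.eye(n, dtype=bool)
    # exclude each anchor's similarity to itself from the softmax
    sim_masked = np.where(mask_self, -np.inf, sim)
    # log-softmax over all other examples in the batch
    log_prob = sim_masked - np.log(np.exp(sim_masked).sum(axis=1, keepdims=True))

    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & ~mask_self
    pos_counts = pos_mask.sum(axis=1)

    # average log-probability over each anchor's positives, then over anchors
    loss_per_anchor = -(np.where(pos_mask, log_prob, 0.0)).sum(axis=1) \
        / np.maximum(pos_counts, 1)
    return loss_per_anchor[pos_counts > 0].mean()
```

Minimizing this loss pulls all same-class embeddings together while pushing different-class embeddings apart, which is the "multiple positive examples" behavior the disclosure emphasizes.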
Bibliography:Application Number: US202117920623