Domain Generalization with Small Data

Bibliographic Details
Published in: International Journal of Computer Vision, Vol. 132, No. 8, pp. 3172-3190
Main Authors: Chen, Kecheng; Gal, Elena; Yan, Hong; Li, Haoliang
Format: Journal Article
Language: English
Published: New York: Springer US, 01.08.2024
Summary: In this work, we propose to tackle the problem of domain generalization in the context of insufficient samples. Instead of extracting latent feature embeddings with deterministic models, we propose to learn a domain-invariant representation in a probabilistic framework by mapping each data point into a probabilistic embedding. Specifically, we first extend the empirical maximum mean discrepancy (MMD) to a novel probabilistic MMD that can measure the discrepancy between mixture distributions (i.e., source domains) consisting of a series of latent distributions rather than latent points. Moreover, instead of imposing the contrastive semantic alignment (CSA) loss on pairs of latent points, a novel probabilistic CSA loss encourages positive probabilistic embedding pairs to be closer while pulling negative ones apart. Benefiting from the representation captured by probabilistic models, our proposed method can marry the measurement of the distribution over distributions (i.e., global alignment) with the distribution-based contrastive semantic alignment (i.e., local alignment). Extensive experimental results on three challenging medical datasets show the effectiveness of our proposed method in the context of insufficient data compared with state-of-the-art methods.
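The probabilistic MMD described in the summary extends the standard empirical MMD, which compares two sets of latent points via kernel mean embeddings. As background, the following is a minimal sketch of the empirical MMD (biased estimator) with an RBF kernel; the function names and the bandwidth parameter `sigma` are illustrative and not taken from the paper.

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of x and the rows of y."""
    sq_dists = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def empirical_mmd2(xs, ys, sigma=1.0):
    """Biased estimator of the squared MMD between samples xs and ys.

    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], estimated here by
    averaging the corresponding kernel matrices.
    """
    k_xx = rbf_kernel(xs, xs, sigma).mean()
    k_yy = rbf_kernel(ys, ys, sigma).mean()
    k_xy = rbf_kernel(xs, ys, sigma).mean()
    return k_xx + k_yy - 2.0 * k_xy
```

The paper's probabilistic variant replaces each latent point with a latent distribution (a probabilistic embedding), so each source domain becomes a mixture of distributions and the discrepancy is measured between those mixtures rather than between point clouds as above.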
ISSN: 0920-5691, 1573-1405
DOI: 10.1007/s11263-024-02028-4