Wasserstein Dictionary Learning: Optimal Transport-based unsupervised non-linear dictionary learning


Bibliographic Details
Published in arXiv.org
Main Authors Schmitz, Morgan A; Heitz, Matthieu; Bonneel, Nicolas; Ngolè Mboula, Fred Maurice; Coeurjolly, David; Cuturi, Marco; Peyré, Gabriel; Starck, Jean-Luc
Format Paper; Journal Article
Language English
Published Ithaca: Cornell University Library, arXiv.org, 15.03.2018

More Information
Summary: This paper introduces a new nonlinear dictionary learning method for histograms in the probability simplex. The method leverages optimal transport theory, in the sense that our aim is to reconstruct histograms using so-called displacement interpolations (a.k.a. Wasserstein barycenters) between dictionary atoms; such atoms are themselves synthetic histograms in the probability simplex. Our method simultaneously estimates such atoms, and, for each datapoint, the vector of weights that can optimally reconstruct it as an optimal transport barycenter of such atoms. Our method is computationally tractable thanks to the addition of an entropic regularization to the usual optimal transportation problem, leading to an approximation scheme that is efficient, parallel and simple to differentiate. Both atoms and weights are learned using a gradient-based descent method. Gradients are obtained by automatic differentiation of the generalized Sinkhorn iterations that yield barycenters with entropic smoothing. Because of its formulation relying on Wasserstein barycenters instead of the usual matrix product between dictionary and codes, our method allows for nonlinear relationships between atoms and the reconstruction of input data. We illustrate its application in several different image processing settings.
ISSN: 2331-8422
DOI: 10.48550/arxiv.1708.01955
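
Illustrative code sketch

The following is a minimal sketch, not the authors' released code, of the ingredients the summary describes: entropy-regularized Wasserstein barycenters computed by generalized Sinkhorn (iterative Bregman projection) iterations, a softmax parameterization that keeps atoms and per-datapoint weights on the probability simplex, and automatic differentiation through the iterations so that both are updated by gradient descent. It is written in JAX; the grid size, squared-Euclidean ground cost, regularization strength, iteration counts, learning rate, quadratic fitting loss and all function names are illustrative assumptions, not choices taken from the paper.

import jax
import jax.numpy as jnp

def sinkhorn_barycenter(atoms, weights, K, n_iters=50, eps=1e-30):
    # Entropic Wasserstein barycenter of the rows of `atoms` (S x N histograms)
    # with barycentric weights (length S, nonnegative, summing to 1), computed by
    # iterative Bregman projections (generalized Sinkhorn iterations).
    S, N = atoms.shape
    b = jnp.ones((S, N))                              # one scaling vector per atom
    for _ in range(n_iters):
        phi = (atoms / (b @ K.T + eps)) @ K           # row s: K^T (d_s / (K b_s))
        p = jnp.exp(weights @ jnp.log(phi + eps))     # weighted geometric mean
        b = p[None, :] / (phi + eps)
    return p                                          # approximate barycenter (length N)

def reconstruct(atom_logits, weight_logits, K):
    atoms = jax.nn.softmax(atom_logits, axis=1)       # atoms stay on the simplex
    weights = jax.nn.softmax(weight_logits, axis=1)   # one weight vector per datapoint
    return jax.vmap(lambda w: sinkhorn_barycenter(atoms, w, K))(weights)

def loss(params, data, K):
    recon = reconstruct(params["atom_logits"], params["weight_logits"], K)
    return jnp.mean((recon - data) ** 2)              # quadratic fit (one possible choice)

@jax.jit
def gradient_step(params, data, K, lr=0.5):
    # Gradients with respect to both atoms and weights are obtained by automatic
    # differentiation through the Sinkhorn iterations above.
    grads = jax.grad(loss)(params, data, K)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

# Toy usage: learn S atoms and per-datapoint weights for M random 1-D histograms.
key = jax.random.PRNGKey(0)
N, S, M = 40, 3, 16
grid = jnp.linspace(0.0, 1.0, N)
C = (grid[:, None] - grid[None, :]) ** 2              # squared-Euclidean ground cost
K = jnp.exp(-C / 5e-3)                                # Gibbs kernel, regularization 5e-3
data = jax.random.dirichlet(key, jnp.ones(N), (M,))
params = {"atom_logits": 0.01 * jax.random.normal(key, (S, N)),
          "weight_logits": jnp.zeros((M, S))}
for _ in range(200):
    params = gradient_step(params, data, K)

The softmax change of variables is one simple way to keep the learned atoms and weights on the simplex during unconstrained gradient descent, and differentiating through a fixed number of Sinkhorn iterations is what makes the barycentric reconstruction "simple to differentiate" as stated in the summary.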