DETERMINISTIC DECODER VARIATIONAL AUTOENCODER

Bibliographic Details
Main Authors: Polykovskiy, Daniil; Zavoronkovs, Aleksandrs
Format: Patent
Language: English
Published: 02.09.2021

Summary: A model of a deterministic decoder variational autoencoder (DD-VAE) is provided. An evidence lower bound is derived for the DD-VAE, and a convenient approximation is proposed with proven convergence to the optimal parameters of the non-relaxed objective. The invention introduces bounded support distributions as a solution thereto. Experiments on multiple datasets (synthetic, MNIST, MOSES, ZINC) show that the DD-VAE yields both a proper generative distribution and useful latent codes. A computer-implemented method of generating objects with a deterministic decoder variational autoencoder can include: providing a model configured as a deterministic decoder variational autoencoder; inputting object data into a stochastic encoder of the deterministic decoder variational autoencoder; generating latent codes in the latent space with the encoder; providing the latent codes from the latent space to a decoder, wherein the decoder is configured as a deterministic decoder; generating decoded objects with the decoder; and generating a report that identifies the decoded objects.
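The summary above describes an encode-sample-decode workflow with a stochastic encoder, bounded support latent distributions, and a deterministic decoder. The following is a minimal sketch of that workflow, assuming a PyTorch implementation; the layer sizes, the uniform bounded-support posterior, and the DDVAE class shown here are illustrative assumptions, not the patented design.

    # Hedged sketch of the described DD-VAE workflow (assumed PyTorch style).
    import torch
    import torch.nn as nn

    class DDVAE(nn.Module):
        def __init__(self, obj_dim=784, latent_dim=16):
            super().__init__()
            # Stochastic encoder: maps object data to parameters of a
            # bounded-support posterior over latent codes.
            self.encoder = nn.Sequential(nn.Linear(obj_dim, 256), nn.ReLU(),
                                         nn.Linear(256, 2 * latent_dim))
            # Deterministic decoder: maps a latent code to a decoded object.
            self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                         nn.Linear(256, obj_dim))

        def encode(self, x):
            center, half_width = self.encoder(x).chunk(2, dim=-1)
            center = torch.tanh(center)                   # keep support inside (-1, 1)
            half_width = torch.sigmoid(half_width) * 0.5  # positive, bounded width
            return center, half_width

        def sample_latent(self, center, half_width):
            # Sample from a uniform distribution with bounded support
            # [center - half_width, center + half_width] (illustrative choice).
            u = torch.rand_like(center) * 2.0 - 1.0
            return center + u * half_width

        def forward(self, x):
            center, half_width = self.encode(x)
            z = self.sample_latent(center, half_width)    # latent codes
            return self.decoder(z)                        # deterministic decoding

    # Usage: encode object data, decode latent codes, report decoded objects.
    model = DDVAE()
    objects = torch.rand(4, 784)                          # placeholder object data
    decoded = model(objects)
    print({"num_decoded_objects": decoded.shape[0]})      # minimal "report"

The uniform posterior here is only one possible bounded support distribution; the key structural point matching the summary is that the decoder itself is deterministic, so all stochasticity resides in the encoder's latent distribution.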
Bibliography: Application Number US202117189017