APo-VAE: Text Generation in Hyperbolic Space

Bibliographic Details
Published in: arXiv.org
Main Authors: Dai, Shuyang; Gan, Zhe; Cheng, Yu; Tao, Chenyang; Carin, Lawrence; Liu, Jingjing
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 14.07.2021

More Information
Summary: Natural language often exhibits inherent hierarchical structure, ingrained with complex syntax and semantics. However, most state-of-the-art deep generative models learn embeddings only in Euclidean vector space, without accounting for this structural property of language. In this paper, we investigate text generation in a hyperbolic latent space to learn continuous hierarchical representations. An Adversarial Poincaré Variational Autoencoder (APo-VAE) is presented, where both the prior and the variational posterior of the latent variables are defined over a Poincaré ball via wrapped normal distributions. By adopting the primal-dual formulation of the KL divergence, an adversarial learning procedure is introduced to enable robust model training. Extensive experiments on language modeling and dialog-response generation tasks demonstrate the effectiveness of the proposed APo-VAE model over VAEs in Euclidean latent space, owing to its superior ability to capture latent language hierarchies in hyperbolic space.
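The wrapped normal distribution mentioned in the summary can be illustrated with a minimal sketch: draw a Gaussian tangent vector and push it onto the Poincaré ball through the exponential map. This is a generic illustration of the construction, not the paper's implementation; the function names, the unit-curvature ball, and the omission of parallel transport (a full wrapped normal transports the tangent vector from the origin to the mean before mapping) are simplifying assumptions made here.

```python
import numpy as np

def mobius_add(x, y):
    # Möbius addition on the Poincaré ball (curvature -1):
    # the hyperbolic analogue of vector addition.
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

def exp_map(mu, v):
    # Exponential map at mu: push tangent vector v onto the ball.
    lam = 2.0 / (1.0 - np.dot(mu, mu))  # conformal factor at mu
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return mu
    direction = np.tanh(lam * norm_v / 2.0) * v / norm_v
    return mobius_add(mu, direction)

def sample_wrapped_normal(mu, sigma, rng):
    # Draw v ~ N(0, sigma^2 I) in the tangent space, then wrap it
    # onto the ball around mu (parallel transport omitted for brevity).
    v = rng.normal(scale=sigma, size=mu.shape)
    return exp_map(mu, v)

rng = np.random.default_rng(0)
mu = np.array([0.3, -0.1])
z = sample_wrapped_normal(mu, 0.5, rng)
print(np.linalg.norm(z) < 1.0)  # samples always lie inside the unit ball
```

Because tanh bounds the mapped radius below 1 and Möbius addition is closed on the ball, every sample remains a valid hyperbolic latent code, which is the property the paper's latent prior and posterior rely on.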
ISSN: 2331-8422