FastTrees: Parallel Latent Tree-Induction for Faster Sequence Encoding

Bibliographic Details
Main Authors: Pung, Bill Tuck Weng; Chan, Alvin
Format: Journal Article
Language: English
Published: 27.11.2021
Online Access: Get full text
DOI: 10.48550/arxiv.2111.14031


Abstract: Inducing latent tree structures from sequential data is an emerging trend in the NLP research landscape today, largely popularized by recent methods such as Gumbel LSTM and Ordered Neurons (ON-LSTM). This paper proposes FASTTREES, a new general-purpose neural module for fast sequence encoding. Unlike most previous works that consider recurrence to be necessary for tree induction, our work explores the notion of parallel tree induction, i.e., imbuing our model with hierarchical inductive biases in a parallelizable, non-autoregressive fashion. To this end, our proposed FASTTREES achieves competitive or superior performance to ON-LSTM on four well-established sequence modeling tasks, i.e., language modeling, logical inference, sentiment analysis and natural language inference. Moreover, we show that the FASTTREES module can be applied to enhance Transformer models, achieving performance gains on three sequence transduction tasks (machine translation, subject-verb agreement and mathematical language understanding), paving the way for modular tree induction modules. Overall, we outperform existing state-of-the-art models on logical inference tasks by +4% and mathematical language understanding by +8%.
Copyright: http://creativecommons.org/licenses/by/4.0
Open Access Link: https://arxiv.org/abs/2111.14031
Subjects: Computer Science - Computation and Language; Computer Science - Learning