Permutation Equivariance of Transformers and Its Applications

Bibliographic Details
Published in: arXiv.org
Main Authors: Xu, Hengyuan; Xiang, Liyao; Ye, Hangyu; Yao, Dixi; Chu, Pengzhi; Li, Baochun
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 31.03.2024
Subjects: Coders; Equivalence; Learning; Neural networks; Permutations; Privacy; Training; Transformers
Online Access: Get full text

Abstract: Revolutionizing the field of deep learning, Transformer-based models have achieved remarkable performance in many tasks. Recent research has recognized that these models are robust to shuffling, but the results are limited to inter-token permutation in the forward propagation. In this work, we propose our definition of permutation equivariance, a broader concept covering both inter- and intra-token permutation in the forward and backward propagation of neural networks. We rigorously prove that such a permutation equivariance property can be satisfied by most vanilla Transformer-based models with almost no adaptation. We examine the property over a range of state-of-the-art models, including ViT, Bert, GPT, and others, with experimental validation. Further, as a proof of concept, we explore how real-world applications, including privacy-enhancing split learning and model authorization, could exploit the permutation equivariance property, which points to wider, intriguing application scenarios.
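The inter-token half of this property is easy to check empirically. Below is a minimal sketch (illustrative only, not code from the paper) that feeds a permuted token sequence to a standard PyTorch Transformer encoder with no positional encodings and verifies that the output is permuted the same way; the module choices and hyperparameters are assumptions made for the example.

# Minimal sketch (assumption: plain token embeddings, no positional encoding):
# check inter-token permutation equivariance f(P x) == P f(x).
import torch
import torch.nn as nn

torch.manual_seed(0)
d_model, seq_len = 16, 5

layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                   dim_feedforward=32, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2).eval()  # eval: dropout off

x = torch.randn(1, seq_len, d_model)   # token embeddings, shape (batch, seq, dim)
perm = torch.randperm(seq_len)         # a random inter-token permutation P

with torch.no_grad():
    y = encoder(x)                     # f(x)
    y_perm = encoder(x[:, perm, :])    # f(P x)

# Equivariance holds up to floating-point tolerance.
print(torch.allclose(y_perm, y[:, perm, :], atol=1e-5))   # expected: True

With learned or fixed positional encodings added to x, this check would generally fail unless the encodings are permuted together with the tokens.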
Copyright 2024. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Discipline: Physics
EISSN: 2331-8422
Genre: Working Paper/Pre-Print
IsOpenAccess: true
IsPeerReviewed: false
IsScholarly: false
URI: https://www.proquest.com/docview/2802662128