Principled Representation Learning for Entity Alignment
Published in | arXiv.org |
---|---|
Main Authors | Guo, Lingbing; Sun, Zequn; Chen, Mingyang; Hu, Wei; Zhang, Qiang; Chen, Huajun |
Format | Paper |
Language | English |
Published | Ithaca: Cornell University Library, arXiv.org, 21.10.2021 |
Subjects | Alignment; Embedding; Learning; Ontology |
Online Access | https://www.proquest.com/docview/2584471367 |
Abstract | Embedding-based entity alignment (EEA) has recently received great attention. Despite significant performance improvements, little effort has been devoted to understanding how EEA methods work. Most existing studies rest on the assumption that a small number of pre-aligned entities can serve as anchors connecting the embedding spaces of two knowledge graphs (KGs). However, the rationality of this assumption has not been investigated. To fill this research gap, we define a typical paradigm abstracted from existing EEA methods and analyze how the embedding discrepancy between two potentially aligned entities is implicitly bounded by a predefined margin in the scoring function. We further find that this bound is not guaranteed to be tight enough for alignment learning. We mitigate this problem by proposing a new approach, named NeoEA, that explicitly learns KG-invariant and principled entity embeddings. In this sense, an EEA model not only pursues the closeness of aligned entities based on geometric distance, but also aligns the neural ontologies of the two KGs by eliminating the discrepancy in embedding distribution and underlying ontology knowledge. Our experiments demonstrate consistent and significant performance improvements over the best-performing EEA methods. |
---|---|
Copyright | 2021. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
Discipline | Physics |
EISSN | 2331-8422 |
Genre | Working Paper/Pre-Print |
Open Access | true |
Peer Reviewed | false |
Scholarly | false |
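The abstract refers to two ideas that recur in this line of work: a margin-based scoring function trained on pre-aligned anchor pairs, and (in NeoEA) reducing the discrepancy between the embedding distributions of the two KGs. The sketch below is a minimal PyTorch illustration of how such an objective can be assembled; it is not the authors' NeoEA implementation, and all names (ToyEEAModel, alignment_loss, distribution_penalty, the 0.1 weight) are hypothetical.

```python
# Illustrative sketch only: a margin-based EEA objective plus a simple
# moment-matching penalty on the two KGs' embedding distributions.
# Not the NeoEA implementation; all names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyEEAModel(nn.Module):
    def __init__(self, num_ents_1, num_ents_2, dim=64, margin=1.0):
        super().__init__()
        self.emb1 = nn.Embedding(num_ents_1, dim)  # entities of KG1
        self.emb2 = nn.Embedding(num_ents_2, dim)  # entities of KG2
        self.margin = margin

    def alignment_loss(self, anchors1, anchors2, neg1, neg2):
        """Margin-based loss: pre-aligned anchor pairs should be closer
        than randomly sampled negative pairs by at least `margin`."""
        pos = torch.norm(self.emb1(anchors1) - self.emb2(anchors2), p=2, dim=1)
        neg = torch.norm(self.emb1(neg1) - self.emb2(neg2), p=2, dim=1)
        return F.relu(pos - neg + self.margin).mean()

    def distribution_penalty(self):
        """Crude stand-in for distribution-level alignment: match the first
        two moments (per-dimension mean and variance) of the two spaces."""
        e1, e2 = self.emb1.weight, self.emb2.weight
        mean_gap = (e1.mean(dim=0) - e2.mean(dim=0)).pow(2).sum()
        var_gap = (e1.var(dim=0) - e2.var(dim=0)).pow(2).sum()
        return mean_gap + var_gap


# Hypothetical usage with random anchor/negative indices.
model = ToyEEAModel(num_ents_1=1000, num_ents_2=1200)
a1 = torch.randint(0, 1000, (32,))
a2 = torch.randint(0, 1200, (32,))
n1 = torch.randint(0, 1000, (32,))
n2 = torch.randint(0, 1200, (32,))
loss = model.alignment_loss(a1, a2, n1, n2) + 0.1 * model.distribution_penalty()
loss.backward()
```

The margin term mirrors the bound discussed in the abstract: the margin implicitly caps how far the embeddings of two aligned entities may drift apart, while the moment-matching penalty is a simple proxy for aligning the two embedding distributions.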