Forecasting of depth and ego-motion with transformers and self-supervision
| Published in | arXiv.org |
|---|---|
| Main Authors | Boulahbal, Houssem; Voicila, Adrian; Comport, Andrew |
| Format | Paper (working paper / pre-print) |
| Language | English |
| Published | Ithaca: Cornell University Library, arXiv.org, 15.06.2022 |
| Discipline | Physics |
| Subjects | Datasets; Economic forecasting; Modules; Transformers |
| EISSN | 2331-8422 |
| Copyright | 2022. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| Online Access | https://www.proquest.com/docview/2677230533 |
| Abstract | This paper addresses the problem of end-to-end self-supervised forecasting of depth and ego-motion. Given a sequence of raw images, the aim is to forecast both the scene geometry and the ego-motion using a self-supervised photometric loss. The architecture combines convolution and transformer modules, leveraging the benefits of both: the inductive bias of CNNs and the multi-head attention of transformers. This yields a rich spatio-temporal representation that enables accurate depth forecasting. Prior work attempts to solve this problem with multi-modal inputs/outputs and supervised ground-truth data, which is impractical because a large annotated dataset is required. In contrast to prior methods, this paper forecasts depth and ego-motion using only self-supervised raw images as input. The approach performs well on the KITTI dataset benchmark, with several performance criteria even comparable to prior non-forecasting self-supervised monocular depth inference methods. |
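
The abstract's key training signal, the self-supervised photometric loss, can be made concrete with a short view-synthesis sketch: the forecast depth and ego-motion are used to warp a neighbouring frame into the target view, and the photometric error between the warped and the observed target image supervises both predictions without any ground-truth labels. The PyTorch snippet below is a minimal sketch of that idea, not the paper's implementation; all function names, tensor shapes, the pinhole intrinsics and the plain L1 error term are assumptions.

```python
# A minimal sketch, assuming a standard view-synthesis setup, of the kind of
# self-supervised photometric loss the abstract refers to. It is NOT the
# authors' code: names, shapes, and the plain L1 term are illustrative.
import torch
import torch.nn.functional as F


def backproject(depth: torch.Tensor, K_inv: torch.Tensor) -> torch.Tensor:
    """Lift per-pixel depth (B, 1, H, W) to 3D points in the camera frame (B, 3, H*W)."""
    b, _, h, w = depth.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=depth.dtype, device=depth.device),
        torch.arange(w, dtype=depth.dtype, device=depth.device),
        indexing="ij",
    )
    pix = torch.stack([xs, ys, torch.ones_like(xs)], dim=0).reshape(1, 3, -1)
    return (K_inv @ pix) * depth.reshape(b, 1, -1)


def project(points: torch.Tensor, K: torch.Tensor, T: torch.Tensor, h: int, w: int) -> torch.Tensor:
    """Transform points by the relative pose T (B, 4, 4), project with intrinsics K,
    and return a sampling grid in [-1, 1] for grid_sample."""
    b, _, n = points.shape
    ones = torch.ones(b, 1, n, dtype=points.dtype, device=points.device)
    cam = (T @ torch.cat([points, ones], dim=1))[:, :3]  # points in the source camera frame
    pix = K @ cam
    pix = pix[:, :2] / pix[:, 2:3].clamp(min=1e-6)       # perspective division
    gx = 2.0 * pix[:, 0] / (w - 1) - 1.0                 # normalise x to [-1, 1]
    gy = 2.0 * pix[:, 1] / (h - 1) - 1.0                 # normalise y to [-1, 1]
    return torch.stack([gx, gy], dim=-1).reshape(b, h, w, 2)


def photometric_loss(target, source, depth, T, K) -> torch.Tensor:
    """Warp `source` into the target view using the forecast depth and ego-motion,
    then penalise the per-pixel L1 error against `target`."""
    b, _, h, w = target.shape
    grid = project(backproject(depth, torch.linalg.inv(K)), K, T, h, w)
    warped = F.grid_sample(source, grid, padding_mode="border", align_corners=True)
    return (warped - target).abs().mean()


if __name__ == "__main__":
    b, h, w = 2, 96, 320
    target = torch.rand(b, 3, h, w)                      # frame whose depth is forecast
    source = torch.rand(b, 3, h, w)                      # neighbouring frame used for view synthesis
    depth = torch.rand(b, 1, h, w) * 10.0 + 0.1          # forecast depth (placeholder values)
    T = torch.eye(4).repeat(b, 1, 1)                     # forecast ego-motion (identity placeholder)
    K = torch.tensor([[100.0, 0.0, w / 2],
                      [0.0, 100.0, h / 2],
                      [0.0, 0.0, 1.0]]).repeat(b, 1, 1)  # assumed pinhole intrinsics
    print(photometric_loss(target, source, depth, T, K).item())
```

In published self-supervised depth pipelines this L1 term is usually combined with an SSIM term, per-pixel masking or minimum reprojection over several source frames, and multi-scale supervision; the exact formulation used in the paper may differ.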