Depth Estimation fusing Image and Radar Measurements with Uncertain Directions

Bibliographic Details
Published in: 2024 International Joint Conference on Neural Networks (IJCNN), pp. 1–6
Main Authors: Kotani, Masaya; Oba, Takeru; Ukita, Norimichi
Format: Conference Proceeding
Language: English
Publisher: IEEE, 30.06.2024

Abstract: This paper proposes a depth estimation method using radar-image fusion by addressing the uncertain vertical directions of sparse radar measurements. In prior radar-image fusion work, image features are merged with the uncertain sparse depths measured by radar through convolutional layers. This approach is disturbed by the features computed with the uncertain radar depths. Furthermore, since the features are computed with a fully convolutional network, the uncertainty of each depth corresponding to a pixel is spread out over its surrounding pixels. Our method avoids this problem by computing features only with an image and conditioning the features pixelwise with the radar depth. Furthermore, the set of possibly correct radar directions is identified with reliable LiDAR measurements, which are available only in the training stage. Our method improves training data by learning only these possibly correct radar directions, while the previous method trains raw radar measurements, including erroneous measurements. Experimental results demonstrate that our method can improve the quantitative and qualitative results compared with its base method using radar-image fusion.
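The key idea in the abstract — computing features from the image alone, then conditioning them strictly per pixel with the sparse radar depth so that an uncertain measurement cannot spread to neighboring pixels — can be illustrated with a minimal NumPy sketch. The function name, the FiLM-style scale/shift modulation, and the weight vectors here are illustrative assumptions, not the authors' actual architecture:

```python
import numpy as np

def pixelwise_condition(image_feat, radar_depth, scale_w, shift_w):
    """Illustrative pixelwise conditioning (FiLM-style sketch, not the
    paper's exact layer): modulate per-pixel image features with a
    sparse radar depth map.

    image_feat:  (C, H, W) features computed from the image alone.
    radar_depth: (H, W) sparse radar depths, 0 where no measurement.
    scale_w, shift_w: (C,) hypothetical learned weights.
    """
    # Per-pixel scale and shift derived only from that pixel's depth.
    gamma = 1.0 + np.einsum("c,hw->chw", scale_w, radar_depth)
    beta = np.einsum("c,hw->chw", shift_w, radar_depth)
    # Pixels with no radar measurement (depth == 0) keep gamma = 1,
    # beta = 0, so their features pass through unchanged: conditioning
    # is purely local, with no spatial spread of radar uncertainty.
    return gamma * image_feat + beta
```

Because the modulation uses no convolution over the depth map, a wrong radar value affects only its own pixel — the locality property the abstract contrasts with prior fully convolutional fusion.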
Author Details:
– Kotani, Masaya (masaya.kotani.ttij@gmail.com), Toyota Technological Institute, Japan
– Oba, Takeru (sd21502@toyota-ti.ac.jp), Toyota Technological Institute, Japan
– Ukita, Norimichi (ukita@toyota-ti.ac.jp), Toyota Technological Institute, Japan
DOI: 10.1109/IJCNN60899.2024.10650484
Discipline: Computer Science
EISBN: 9798350359312
EISSN: 2161-4407
EndPage: 6
ExternalDocumentID: 10650484
Genre: Original research
Open Access Link: https://arxiv.org/pdf/2403.15787
Page Count: 6
Subject Terms: Depth maps; Estimation; Fusion; Laser radar; Measurement uncertainty; Radar measurement depths; Radar measurements; Training; Training data; Uncertainty
URI: https://ieeexplore.ieee.org/document/10650484