Mitigation of Through-Wall Distortions of Frontal Radar Images using Denoising Autoencoders

Bibliographic Details
Published in arXiv.org
Main Authors Vishwakarma, Shelly; Shobha Sundar Ram
Format Paper, Journal Article
Language English
Published Ithaca: Cornell University Library, arXiv.org, 22.03.2019
Subjects Clutter, Indoor environments, Machine learning, Narrowband, Noise reduction, Propagation, Radar imaging, Simulation, Training, Wall effects
Online Access Get full text

Abstract Radar images of humans and other concealed objects are considerably distorted by attenuation, refraction and multipath clutter in indoor through-wall environments. While several methods have been proposed for removing target independent static and dynamic clutter, there still remain considerable challenges in mitigating target dependent clutter especially when the knowledge of the exact propagation characteristics or analytical framework is unavailable. In this work we focus on mitigating wall effects using a machine learning based solution -- denoising autoencoders -- that does not require prior information of the wall parameters or room geometry. Instead, the method relies on the availability of a large volume of training radar images gathered in through-wall conditions and the corresponding clean images captured in line-of-sight conditions. During the training phase, the autoencoder learns how to denoise the corrupted through-wall images in order to resemble the free space images. We have validated the performance of the proposed solution for both static and dynamic human subjects. The frontal radar images of static targets are obtained by processing wideband planar array measurement data with two-dimensional array and range processing. The frontal radar images of dynamic targets are simulated using narrowband planar array data processed with two-dimensional array and Doppler processing. In both simulation and measurement processes, we incorporate considerable diversity in the target and propagation conditions. Our experimental results, from both simulation and measurement data, show that the denoised images are considerably more similar to the free-space images when compared to the original through-wall images.
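The abstract describes training a denoising autoencoder on paired images: through-wall captures serve as corrupted inputs and co-registered line-of-sight captures as clean targets. The snippet below is a minimal sketch of that idea, assuming PyTorch, single-channel 64x64 images normalized to [0, 1], and a mean-squared-error reconstruction loss; the layer sizes and hyperparameters are illustrative placeholders, not the architecture reported in the paper.

```python
# Minimal sketch of a convolutional denoising autoencoder for radar images.
# Assumption: paired tensors, through-wall images as inputs and free-space
# (line-of-sight) images as targets; sizes and hyperparameters are illustrative.
import torch
import torch.nn as nn

class RadarDenoisingAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the distorted through-wall image.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
        )
        # Decoder: reconstruct an estimate of the free-space image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2),    # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=2, stride=2),     # 32 -> 64
            nn.Sigmoid(),  # images assumed normalized to [0, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_step(model, optimizer, through_wall, free_space):
    """One gradient step: push the denoised output toward the clean image."""
    model.train()
    optimizer.zero_grad()
    denoised = model(through_wall)
    loss = nn.functional.mse_loss(denoised, free_space)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = RadarDenoisingAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Stand-in batch of 8 paired 64x64 images; real data would come from the
    # through-wall and line-of-sight measurement/simulation campaigns.
    x_wall = torch.rand(8, 1, 64, 64)
    x_free = torch.rand(8, 1, 64, 64)
    print("loss:", train_step(model, optimizer, x_wall, x_free))
```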
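The frontal images themselves come from two-dimensional array processing combined with range (or Doppler) processing of planar-array data, as the abstract notes. The sketch below, assuming NumPy and a complex wideband measurement cube of shape (frequency, array x, array y) with uniform frequency steps, shows a generic FFT-based version of that chain: an inverse FFT over frequency for range and a 2-D FFT over the aperture for cross-range. It is a standard illustration, not the authors' exact processing pipeline.

```python
# Generic Fourier-based frontal imaging chain for wideband planar-array data.
# Assumption: data_cube has axes (frequency, array x, array y); element spacing
# and frequency stepping are uniform. Zero padding is purely for display.
import numpy as np

def frontal_image(data_cube, zero_pad=2):
    """Return a frontal (cross-range x cross-range) magnitude image."""
    nf, nx, ny = data_cube.shape
    # Range processing: inverse FFT along the frequency axis converts the
    # stepped-frequency response into range bins.
    range_profiles = np.fft.ifft(data_cube, n=zero_pad * nf, axis=0)
    # Two-dimensional array processing: FFT across the two spatial apertures
    # maps element-domain data to cross-range (angle) bins.
    image_cube = np.fft.fftshift(
        np.fft.fft2(range_profiles, s=(zero_pad * nx, zero_pad * ny), axes=(1, 2)),
        axes=(1, 2),
    )
    # Collapse the range dimension by keeping the strongest return per pixel.
    return np.max(np.abs(image_cube), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_cube = rng.standard_normal((64, 16, 16)) + 1j * rng.standard_normal((64, 16, 16))
    print(frontal_image(fake_cube).shape)  # (32, 32) with zero_pad=2
```

For dynamic targets the same aperture FFTs apply, with the range FFT replaced by Doppler processing (an FFT across slow time) on narrowband data.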
Author Vishwakarma, Shelly
Shobha Sundar Ram
BackLink https://doi.org/10.48550/arXiv.1903.09451 (View paper in arXiv)
https://doi.org/10.1109/TGRS.2020.2978440 (View published paper; access to full text may be restricted)
ContentType Paper
Journal Article
Copyright 2019. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
http://arxiv.org/licenses/nonexclusive-distrib/1.0
DOI 10.48550/arxiv.1903.09451
DatabaseName ProQuest SciTech Collection
ProQuest Technology Collection
Materials Science & Engineering Collection
ProQuest Central (Alumni)
ProQuest Central UK/Ireland
ProQuest Central Essentials
ProQuest Central
Technology Collection
ProQuest One
ProQuest Central Korea
SciTech Premium Collection
ProQuest Engineering Collection
Engineering Database
ProQuest Central Premium
ProQuest One Academic (New)
Publicly Available Content Database
ProQuest One Academic Middle East (New)
ProQuest One Academic Eastern Edition (DO NOT USE)
ProQuest One Applied & Life Sciences
ProQuest One Academic
ProQuest One Academic UKI Edition
ProQuest Central China
Engineering collection
arXiv.org
DatabaseTitle Publicly Available Content Database
Engineering Database
Technology Collection
ProQuest One Academic Middle East (New)
ProQuest Central Essentials
ProQuest One Academic Eastern Edition
ProQuest Central (Alumni Edition)
SciTech Premium Collection
ProQuest One Community College
ProQuest Technology Collection
ProQuest SciTech Collection
ProQuest Central China
ProQuest Central
ProQuest One Applied & Life Sciences
ProQuest Engineering Collection
ProQuest One Academic UKI Edition
ProQuest Central Korea
Materials Science & Engineering Collection
ProQuest Central (New)
ProQuest One Academic
ProQuest One Academic (New)
Engineering Collection
DatabaseTitleList
Publicly Available Content Database
Discipline Physics
EISSN 2331-8422
ExternalDocumentID 1903_09451
Genre Working Paper/Pre-Print
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed false
IsScholarly false
Language English
PQID 2839622360
PQPubID 2050157
ParticipantIDs arxiv_primary_1903_09451
proquest_journals_2839622360
PublicationCentury 2000
PublicationDate 20190322
2019-03-22
PublicationDateYYYYMMDD 2019-03-22
PublicationDecade 2010
PublicationPlace Ithaca
PublicationTitle arXiv.org
PublicationYear 2019
Publisher Cornell University Library, arXiv.org
SecondaryResourceType preprint
Snippet Radar images of humans and other concealed objects are considerably distorted by attenuation, refraction and multipath clutter in indoor through-wall...
SourceID arxiv
proquest
SourceType Open Access Repository
Aggregation Database
SubjectTerms Clutter
Indoor environments
Machine learning
Narrowband
Noise reduction
Propagation
Radar imaging
Simulation
Training
Wall effects
Title Mitigation of Through-Wall Distortions of Frontal Radar Images using Denoising Autoencoders
URI https://www.proquest.com/docview/2839622360
https://arxiv.org/abs/1903.09451