CA-SpaceNet: Counterfactual Analysis for 6D Pose Estimation in Space

Bibliographic Details
Published in: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 10627-10634
Main Authors: Wang, Shunli; Wang, Shuaibing; Jiao, Bo; Yang, Dingkang; Su, Liuzhen; Zhai, Peng; Chen, Chixiao; Zhang, Lihua
Format: Conference Proceeding
Language: English
Published: IEEE, 23.10.2022
Summary: Reliable and stable 6D pose estimation of non-cooperative space objects plays an essential role in on-orbit servicing and debris removal missions. Because the pose estimator is sensitive to background interference, this paper proposes a counterfactual analysis framework named CA-SpaceNet for robust 6D pose estimation of space-borne targets against complicated backgrounds. Specifically, in the factual case, conventional methods are adopted to extract features from the whole image. In the counterfactual case, a non-existent image containing only the background, without the target, is imagined. Counterfactual analysis reduces the side effect caused by background interference, which leads to unbiased final predictions. In addition, we carry out low-bit-width quantization of CA-SpaceNet and deploy part of the framework to a Processing-In-Memory (PIM) accelerator on an FPGA. Qualitative and quantitative results demonstrate the effectiveness and efficiency of the proposed method. To the best of our knowledge, this paper is the first to apply causal inference and network quantization to the 6D pose estimation of space-borne targets. The code is available at https://github.com/Shunli-Wang/CA-SpaceNet.
ISSN: 2153-0866
DOI: 10.1109/IROS47612.2022.9981172
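Illustrative Sketch
The summary describes the counterfactual debiasing step only at a high level: a factual pass over the whole image, a counterfactual pass over an imagined background-only image, and a correction that removes the background's effect. The PyTorch sketch below illustrates that idea under stated assumptions; it is not the authors' implementation, and PoseNet, target_mask, and the simple subtraction rule are hypothetical stand-ins.

# Illustrative sketch only: NOT the CA-SpaceNet implementation.
# `PoseNet`, `target_mask`, and the subtraction rule are assumptions
# drawn from the summary's factual/counterfactual description.
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    # Toy backbone + head mapping an RGB image to a 6-D pose vector.
    def __init__(self, num_outputs=6):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, num_outputs)

    def forward(self, x):
        return self.head(self.backbone(x))

def counterfactual_pose(net, image, target_mask):
    # Factual case: features of the whole image.
    factual = net(image)
    # Counterfactual case: an imagined background-only image
    # (approximated here by masking the target out).
    counterfactual = net(image * (1.0 - target_mask))
    # Subtracting the background-only effect debiases the prediction.
    return factual - counterfactual

# Usage with random stand-in data.
img = torch.rand(1, 3, 64, 64)
msk = torch.zeros(1, 1, 64, 64)
msk[..., 16:48, 16:48] = 1.0  # hypothetical target region
print(counterfactual_pose(PoseNet(), img, msk).shape)  # torch.Size([1, 6])

The summary likewise mentions low-bit-width quantization only in passing; a generic uniform symmetric fake-quantizer (an assumption for illustration, not the paper's quantization scheme) could look like:

def quantize_weights(w, bits=4):
    # Uniform symmetric fake-quantization to `bits` bits (illustrative).
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    return torch.clamp(torch.round(w / scale), -qmax - 1, qmax) * scale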