ECC‐BERT: Classification of error correcting codes using the improved bidirectional encoder representation from transformers

Bibliographic Details
Published in: IET Communications, Vol. 16, No. 4, pp. 359-368
Main Authors: Li, Sida; Hu, Xiaochang; Huang, Zhiping; Zhou, Jing
Format: Journal Article
Language: English
Published: Stevenage: John Wiley & Sons, Inc. (Wiley), 01.03.2022
Summary: The recently introduced notion of contextual information in error-correcting codes (ECC) can significantly improve blind code recognition with deep learning (DL) approaches. However, existing DL-based methods suffer from inflexible structures and limited kernel sizes, which make it difficult to exploit the contextual information in ECC. To address this problem, this paper applies a state-of-the-art natural language processing (NLP) framework, bidirectional encoder representations from transformers (BERT), to ECC classification scenarios. To strengthen the effect of contextual information, the BERT model is improved with weighted relative positional encoding and error bit embedding. The proposed approach achieves higher classification accuracy than methods based on Gauss-Jordan elimination and traditional deep learning schemes. Further simulation results show that classification accuracy depends strongly on the block length and on the use of weighted relative positional encoding and error bit embedding.
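The two modifications named in the summary can be illustrated with a minimal sketch. This is NOT the paper's actual implementation: the embedding dimensions, the use of a per-bit error probability as the scaling signal, and the single shared attention head are all illustrative assumptions. It only shows the general shape of (a) adding an error-reliability term to each bit's input embedding and (b) adding a learnable relative-position bias to attention scores.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 16    # embedding dimension (illustrative choice)
block_len = 8   # codeword block length n (illustrative choice)

# Token embedding table for the two hard-decision bit values {0, 1}
bit_embed = rng.normal(size=(2, d_model))

# Hypothetical "error bit embedding": one learned vector, scaled per
# position by an estimated bit-error probability (e.g. from channel LLRs)
err_embed = rng.normal(size=(d_model,))

# Weighted relative positional encoding: a learnable bias b[i - j]
# added to the attention score between positions i and j
rel_bias = rng.normal(size=(2 * block_len - 1,))

def encode(bits, err_prob):
    """Input embedding = bit-value embedding + error-scaled embedding."""
    return bit_embed[bits] + err_prob[:, None] * err_embed[None, :]

def attention_scores(x):
    """Scaled dot-product attention scores plus relative-position bias."""
    scores = x @ x.T / np.sqrt(d_model)
    idx = np.arange(block_len)
    # offset i - j mapped into [0, 2n - 2] to index the bias table
    scores = scores + rel_bias[idx[:, None] - idx[None, :] + block_len - 1]
    return scores

bits = rng.integers(0, 2, size=block_len)        # received hard decisions
err_prob = rng.uniform(0.0, 0.1, size=block_len) # per-bit error estimates
x = encode(bits, err_prob)
s = attention_scores(x)
print(x.shape, s.shape)  # (8, 16) (8, 8)
```

In a full model the biased scores would pass through a softmax and feed a stack of transformer encoder layers ending in a code-class classifier head; the sketch stops at the score computation, where both proposed modifications act.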
ISSN: 1751-8628, 1751-8636
DOI: 10.1049/cmu2.12357