Hypertension Identification and Classification Based on Temporal Convolutional Networks and Support Vector Machines

Bibliographic Details
Published in: 2023 3rd International Conference on Computer Science, Electronic Information Engineering and Intelligent Control Technology (CEI), pp. 290-295
Main Authors: Cao, Yuming; Jing, Huicheng; Ge, Chao; Gao, Yuxing
Format: Conference Proceeding
Language: English
Published: IEEE, 15.12.2023

Summary: Hypertension, one of the most common cardiovascular diseases, may show no obvious symptoms in its early stages, making it difficult to detect through simple blood pressure tests. A deep learning method using electrocardiogram (ECG) signals as the data source has been proposed for automatic feature extraction and classification of hypertension. First, the ECG is processed using methods such as wavelet decomposition to locate R peaks and calculate RR intervals. Then, the RR-interval data are remodeled and features are extracted using a Temporal Convolutional Network combined with a Bidirectional Long Short-Term Memory network and an attention mechanism. Finally, a Support Vector Machine classifier optimized by the Dung Beetle Optimizer performs hypertension recognition and classification. Simulations on datasets from the PhysioNet database, including shareedb, nsrdb, and nsr2db, show that the model achieves an accuracy of 93.8%, recall of 96.4%, specificity of 88.9%, and precision of 94.3%, with an F1-score of 95.3%. This method can identify hypertension in a timely manner, prompting patients to control their blood pressure through effective treatment and avoid adverse consequences.
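The preprocessing step described in the summary, locating R peaks and converting them into RR intervals, can be sketched as follows. This is a minimal illustration only, assuming R-peak sample indices have already been detected (e.g. after wavelet decomposition); the function name and example values are hypothetical, not from the paper.

```python
import numpy as np

def rr_intervals(r_peak_samples, fs):
    """Convert R-peak sample indices into RR intervals in seconds.

    r_peak_samples: sample indices of detected R peaks, in ascending order.
    fs: sampling frequency of the ECG in Hz.
    """
    peaks = np.asarray(r_peak_samples, dtype=float)
    # Successive differences give inter-beat distances in samples;
    # dividing by the sampling rate converts them to seconds.
    return np.diff(peaks) / fs

# Hypothetical example: a steady ~75 bpm rhythm sampled at 360 Hz,
# so consecutive R peaks are 288 samples (0.8 s) apart.
print(rr_intervals([100, 388, 676, 964], fs=360))
```

The resulting RR-interval sequence is what the paper's TCN-BiLSTM model with attention would consume for feature extraction, before the DBO-optimized SVM performs the final classification.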
DOI: 10.1109/CEI60616.2023.10527917