A feed forward deep neural network model using feature selection for cloud intrusion detection system
Published in: Concurrency and Computation: Practice and Experience, Vol. 36, No. 9
Main Authors:
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc. (Wiley Subscription Services, Inc.), 25.04.2024
Subjects:
Online Access: Get full text
Summary:
The rapid advancement and growth of technology have rendered cloud computing services indispensable to our activities. Threats and intrusions have since multiplied exponentially across a range of industries. In such a scenario, an intrusion detection system (IDS) is deployed on the network to monitor traffic and detect attacks. This paper proposes a feed-forward deep neural network (FFDNN) method that uses a filter-based feature selection model. The feature selection strategy determines and selects the most relevant subset of attributes, ranked by feature importance score, for training the deep learning model. Three benchmark data sets were used in the experiments: CIC-IDS 2017, UNSW-NB15, and NSL-KDD. To justify the proposed technique, it was compared with other learning algorithms, ranging from classical machine learning to ensemble methods, that can detect various attacks. The experiments showed that the FFDNN model with reduced feature subsets achieved the highest accuracy, 99.53% and 94.45%, on the NSL-KDD and UNSW-NB15 data sets, while the ensemble-based XGBoost model performed better on the CIC-IDS 2017 data set. In addition, the results show that the overall accuracy, recall, and F1 score of the deep learning algorithm are generally better across all the data sets.
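The abstract describes a two-stage approach: a filter-based feature selection step ranks attributes by an importance score, and only the top-ranked subset is used to train a feed-forward deep neural network. The sketch below illustrates that idea only; the scoring function (mutual information), the number of selected features, the network layer sizes, and the synthetic data are all assumptions, since this record does not specify the paper's exact configuration.

```python
# Minimal sketch of a filter-based feature selection step followed by a
# feed-forward neural network classifier. Synthetic data stands in for the
# NSL-KDD / UNSW-NB15 / CIC-IDS 2017 data sets named in the abstract.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.metrics import accuracy_score, f1_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a flow-based IDS data set (sizes are illustrative).
X, y = make_classification(n_samples=5000, n_features=40, n_informative=15,
                           n_classes=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    # Filter-based selection: keep the k attributes with the highest
    # importance scores before the network ever sees the data.
    ("select", SelectKBest(score_func=mutual_info_classif, k=20)),
    # Feed-forward network; the hidden-layer sizes are assumptions,
    # not the architecture reported in the paper.
    ("ffdnn", MLPClassifier(hidden_layer_sizes=(64, 32, 16),
                            activation="relu", max_iter=300,
                            random_state=42)),
])

pipeline.fit(X_train, y_train)
y_pred = pipeline.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print("recall:  ", recall_score(y_test, y_pred))
print("F1 score:", f1_score(y_test, y_pred))
```

To reproduce the kind of comparison the abstract reports, the same selected feature subset would be fed to the baseline learners (classical machine learning and ensemble models such as XGBoost) and evaluated with the same accuracy, recall, and F1 metrics.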
ISSN: 1532-0626, 1532-0634
DOI: 10.1002/cpe.8001