Neural Network-Based Strong Motion Prediction for On-Site Earthquake Early Warning


Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 22, No. 3, p. 704
Main Authors: Chiang, You-Jing; Chin, Tai-Lin; Chen, Da-Yi
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 18 January 2022

Summary: Developing on-site earthquake early warning systems has been a challenging problem because of time limitations and the small amount of information that can be collected before a warning needs to be issued. A potential solution that could prevent severe disasters is to predict the coming strong motion using the initial P-wave signal and provide warnings before serious ground shaking starts. In practice, prediction accuracy is the most critical issue for earthquake early warning systems. Traditional methods use certain criteria, selected through intuition or experience, to make the prediction. However, the criteria thresholds are difficult to select and may significantly affect the prediction accuracy. This paper investigates methods based on artificial intelligence for predicting the greatest earthquake ground motion early, when the P-wave arrives at seismograph stations. A neural network model is built to make the predictions using a small window of the initial P-wave acceleration signal. The model is trained on seismic waves collected from 1991 to 2019 in Taiwan and is evaluated on events from 2020 and 2021. In these evaluations, the proposed scheme significantly outperforms the threshold-based method in terms of accuracy and average leading time.
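The core idea in the abstract — regressing a strong-motion estimate from a short window of acceleration samples that starts at the P-wave arrival — can be sketched as follows. This is a minimal illustration, not the authors' model: the window length (3 s), sampling rate (100 Hz), two-layer architecture, and random (untrained) weights are all assumptions chosen only to show the data flow from window extraction to a scalar prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

SAMPLE_RATE_HZ = 100           # assumed sampling rate, not from the paper
WINDOW_SECONDS = 3             # assumed initial P-wave window length
WINDOW = SAMPLE_RATE_HZ * WINDOW_SECONDS

def pwave_window(acceleration, p_arrival_index):
    """Cut the fixed-length window of acceleration samples at the P arrival."""
    return acceleration[p_arrival_index:p_arrival_index + WINDOW]

# Illustrative two-layer network: window -> hidden ReLU layer -> scalar
# log peak-ground-acceleration estimate. Weights are random placeholders;
# a real system would train them on historical waveform/PGA pairs.
W1 = rng.normal(0.0, 0.05, (WINDOW, 32))
b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.05, (32, 1))
b2 = np.zeros(1)

def predict_log_pga(window):
    h = np.maximum(0.0, window @ W1 + b1)   # hidden layer with ReLU
    return float((h @ W2 + b2)[0])          # scalar log-PGA estimate

# Synthetic record: background noise, then stronger shaking after the P arrival.
record = rng.normal(0.0, 0.01, 2000)
p_idx = 500
record[p_idx:] += rng.normal(0.0, 0.1, 2000 - p_idx)

window = pwave_window(record, p_idx)
estimate = predict_log_pga(window)
print(window.shape, estimate)
```

The point of the sketch is the pipeline shape: only the first few seconds after the P arrival are available when the warning must be issued, so the model must map that fixed-size window directly to a ground-motion estimate.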
ISSN: 1424-8220
DOI: 10.3390/s22030704