Snore Sound Classification With Mel-Spectrogram and a Fine-Tuned CNN
| Published in | IEEE-EMBS Conference on Biomedical Engineering and Sciences, pp. 479–482 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 11.12.2024 |
Summary: | Snoring occurs when airflow through the mouth and nose is partially obstructed during sleep, causing the surrounding tissues to vibrate. This obstruction can stem from factors such as relaxed throat muscles, excess tissue, nasal congestion, or structural abnormalities. While snoring is common and varies in intensity, it can sometimes signal a more serious condition such as sleep apnea. Identifying the excitation location of a snore sound is important for pinpointing the site of airway obstruction, enabling more targeted and effective treatments tailored to individual anatomical challenges. In this work, we propose a method for detecting the excitation location of snoring via frame-based classification on a dataset of 828 snore sounds from 219 subjects, annotated by experts into four distinct excitation locations. Each segmented snore sound is divided into frames and converted into a Mel-spectrogram, a time-frequency representation that serves as input to a pretrained convolutional neural network designed for audio classification. We fine-tune the network with a modified classification layer, using inverse class weights to account for the class imbalance. Our method achieves an improvement of 6.60% in average classification accuracy over the baseline method, demonstrating its effectiveness in distinguishing snoring excitation locations by their acoustic characteristics. |
---|---|
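The front end the abstract describes (frame each snore segment, convert it to a log-Mel-spectrogram, and weight the loss inversely to class frequency) can be sketched in plain NumPy. This is an illustrative sketch only: the sample rate, frame length, hop length, number of mel bands, and the class labels "V"/"O"/"T"/"E" are assumptions for demonstration, not the paper's actual settings.

```python
import numpy as np

def frame_signal(x, frame_len, hop_len):
    """Split a 1-D signal into overlapping frames (one frame per row)."""
    n_frames = 1 + (len(x) - frame_len) // hop_len
    idx = np.arange(frame_len)[None, :] + hop_len * np.arange(n_frames)[:, None]
    return x[idx]

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(sr, n_fft, n_mels):
    """Triangular mel filterbank over the rFFT bins (HTK-style mel scale)."""
    n_bins = n_fft // 2 + 1
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bin_pts = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_bins))
    for i in range(n_mels):
        left, centre, right = bin_pts[i], bin_pts[i + 1], bin_pts[i + 2]
        if centre > left:
            fb[i, left:centre] = (np.arange(left, centre) - left) / (centre - left)
        if right > centre:
            fb[i, centre:right] = (right - np.arange(centre, right)) / (right - centre)
    return fb

def log_mel_spectrogram(x, sr=16000, frame_len=400, hop_len=160, n_mels=64):
    """Frame the signal, take the STFT power spectrum, apply the mel filterbank, log-compress."""
    frames = frame_signal(x, frame_len, hop_len) * np.hanning(frame_len)
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # (n_frames, n_bins)
    fb = mel_filterbank(sr, frame_len, n_mels)         # (n_mels, n_bins)
    return np.log(power @ fb.T + 1e-10)                # (n_frames, n_mels)

def inverse_class_weights(labels):
    """Per-class loss weights inversely proportional to class frequency
    (the 'balanced' heuristic: n_samples / (n_classes * count))."""
    classes, counts = np.unique(labels, return_counts=True)
    w = counts.sum() / (len(classes) * counts)
    return dict(zip(classes.tolist(), w.tolist()))
```

A fine-tuned audio CNN would then take the resulting (n_frames, n_mels) maps as input, with the per-class weights fed to its loss function (for example, the `weight` argument of a weighted cross-entropy loss) so that rarely occurring excitation locations are not under-penalised during training.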
| ISSN | 2573-3028 |
|---|---|
| DOI | 10.1109/IECBES61011.2024.10991306 |