EarPass: Unlock When Wearing Your Earphones
| Published in | Proceedings of the International Conference on Distributed Computing Systems, pp. 1283 - 1293 |
|---|---|
| Main Authors | , , , , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 23.07.2024 |
| Summary | With the growing reliance on digital systems in today's mobile Internet era, robust authentication methods are crucial for safeguarding personal data and controlling access to resources. Conventional methods, such as knowledge-based and biometric-based authentication, are widely used but still have usage limitations and potential security concerns: they can fail when users wear protective suits or masks, or be spoofed by attackers with ulterior motives. In this paper, we propose EarPass, an earphone-based authentication system that leverages users' unique head-motion patterns in response to a short music segment. We employ a Convolutional Neural Network (CNN)-based feature extractor to capture distinct head motions and map them into a well-separated latent space, enabling discriminative feature extraction from high-dimensional sensor data. We demonstrate the consistency, uniqueness, and robustness of head-motion patterns through extensive experiments and reach a 98.2% F1-score, indicating superior performance compared to conventional authentication methods. Additionally, EarPass is user-friendly, secure, and adaptable to various environments, including noisy and movement-oriented scenarios. By integrating the authentication system into Android devices, we showcase its real-world applicability and low energy consumption with minimal latency. The source code of EarPass will be open-sourced to foster further research and collaboration within the community. |
| ISSN | 2575-8411 |
| DOI | 10.1109/ICDCS60910.2024.00121 |
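The abstract describes a CNN-based feature extractor that maps head-motion signals into a well-separated latent space, with authentication decided by comparing a probe embedding against an enrolled template. A minimal sketch of such a pipeline is shown below; the layer sizes, window length, untrained random weights, and the cosine-similarity threshold are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, stride=2):
    """Valid 1-D convolution: x (C_in, T), w (C_out, C_in, K) -> (C_out, T_out)."""
    c_out, c_in, k = w.shape
    t_out = (x.shape[1] - k) // stride + 1
    out = np.empty((c_out, t_out))
    for o in range(c_out):
        for t in range(t_out):
            out[o, t] = np.sum(w[o] * x[:, t * stride : t * stride + k])
    return out

def embed(imu_window, weights):
    """Map a 6-axis IMU window (accel + gyro) to an L2-normalised embedding."""
    h = imu_window
    for w in weights:
        h = np.maximum(conv1d(h, w), 0.0)   # conv layer + ReLU
    v = h.mean(axis=1)                      # global average pooling
    return v / (np.linalg.norm(v) + 1e-9)

def authenticate(probe, template, threshold=0.8):
    """Accept iff cosine similarity between embeddings exceeds the threshold."""
    return float(probe @ template) >= threshold

# Toy two-layer extractor with untrained random weights (illustrative only).
weights = [rng.normal(0, 0.1, (16, 6, 9)), rng.normal(0, 0.1, (32, 16, 5))]

# Enrolment: average the embeddings of a few head-motion windows (simulated
# here as random 6-channel signals of 200 samples) into one user template.
user_windows = [rng.normal(0, 1, (6, 200)) for _ in range(3)]
template = np.mean([embed(w, weights) for w in user_windows], axis=0)
template /= np.linalg.norm(template)

# Verification: embed a fresh probe window and compare against the template.
probe = embed(user_windows[0], weights)
accepted = authenticate(probe, template)
```

In a deployed system the extractor would be trained (e.g. with a metric-learning objective) so that windows from the same user cluster tightly while impostors fall below the similarity threshold; the random weights here only illustrate the data flow.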