Exploring orientation and accelerometer sensor data for personal authentication in smartphones using touchscreen gestures

Bibliographic Details
Published in: Pattern Recognition Letters, Vol. 68, pp. 351–360
Main Authors: Jain, Ankita; Kanhangad, Vivek
Format: Journal Article
Language: English
Published: Elsevier B.V., 15.12.2015

Summary:
•A novel approach for user verification in smartphones using touchscreen gestures.
•Performance of the approach is evaluated on a relatively large dataset of 104 users.
•Orientation sensor information outperforms the other features considered in this work.
•The proposed approach achieves 0.31% EER for score-level fusion of all gestures.

In this paper, we propose an approach for user authentication in smartphones using behavioral biometrics. The approach involves analyzing behavioral traits while the user performs different gestures during interaction with the device. In addition to commonly employed features such as x–y coordinate information and finger area, the proposed approach utilizes information from the orientation sensor for each of the seven gestures considered in this study. The feature set is further enriched with features such as accelerometer sensor readings and the curvature of the swipe. Matching of corresponding features is performed using the modified Hausdorff distance. Performance evaluation of the proposed authentication approach on a dataset of 104 users yielded promising results, suggesting that readings from the orientation sensor carry useful information for reliably authenticating users. In addition, experimental results demonstrate that consolidating multiple features improves performance. The proposed method outperforms dynamic time warping based matching for all gestures considered in this study, with a significant reduction in EER from 1.55% to 0.31% for score-level fusion of all gestures. Finally, the performance of the proposed algorithm is ascertained on a dataset of 30 subjects captured using another smartphone.
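The abstract names the modified Hausdorff distance (MHD) as the matching metric for gesture features. A minimal sketch of the MHD between two point sets (e.g., the x–y touch coordinates of two gesture samples), following the standard Dubuisson–Jain formulation; the function names and NumPy-based implementation here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def directed_mhd(A, B):
    """Directed modified Hausdorff distance: mean, over points in A,
    of the Euclidean distance to the nearest point in B.
    A and B are (n, d) and (m, d) arrays of d-dimensional feature points."""
    # Pairwise distances via broadcasting: shape (n, m)
    dists = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return dists.min(axis=1).mean()

def modified_hausdorff(A, B):
    """Symmetric MHD: the larger of the two directed distances."""
    return max(directed_mhd(A, B), directed_mhd(B, A))
```

A smaller MHD between an enrolled gesture template and a test gesture would indicate a better match; per-feature distances (coordinates, finger area, orientation, accelerometer) could then be combined at the score level, as the abstract describes.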
ISSN: 0167-8655
1872-7344
DOI: 10.1016/j.patrec.2015.07.004