GADformer: A Transparent Transformer Model for Group Anomaly Detection on Trajectories

Bibliographic Details
Published in: arXiv.org
Main Authors: Lohrer, Andreas; Malik, Darpan; Zelenka, Claudius; Kröger, Peer
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 25.04.2024

Summary: Group Anomaly Detection (GAD) identifies unusual patterns in groups whose individual members may not be anomalous on their own. This task is of major importance across multiple disciplines, in which sequences such as trajectories can also be considered as groups. As groups become more heterogeneous and varied in size, detecting group anomalies grows challenging, especially without supervision. Although Recurrent Neural Networks are well-established deep sequence models, their performance can degrade with increasing sequence length. Hence, this paper introduces GADformer, a BERT-based model for attention-driven GAD on trajectories in unsupervised and semi-supervised settings. We demonstrate how group anomalies can be detected by attention-based GAD. We also introduce the Block-Attention-anomaly-Score (BAS), which enhances model transparency by scoring attention patterns. In addition, synthetic trajectory generation enables various ablation studies. In extensive experiments, we investigate the robustness of our approach and related works to trajectory noise and novelties on synthetic data and three real-world datasets.
ISSN: 2331-8422
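The record itself contains no code, but the abstract's core idea (flagging trajectories whose attention patterns deviate from the typical pattern) can be illustrated with a short sketch. The following minimal PyTorch example is hypothetical: TinyTrajectoryEncoder, attention_anomaly_score, and the L1-distance-to-batch-mean scoring rule are illustrative assumptions, not the paper's actual GADformer architecture or BAS definition.

import torch
import torch.nn as nn

class TinyTrajectoryEncoder(nn.Module):
    # Minimal BERT-style self-attention encoder over 2-D trajectory points.
    # (An assumed stand-in; the real GADformer is described in the paper.)
    def __init__(self, d_model=32, n_heads=4, seq_len=20):
        super().__init__()
        self.embed = nn.Linear(2, d_model)  # project (x, y) points
        self.pos = nn.Parameter(torch.randn(seq_len, d_model) * 0.02)  # learned positions
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, traj):  # traj: (B, T, 2)
        h = self.embed(traj) + self.pos[: traj.size(1)]
        # need_weights=True returns head-averaged attention maps of shape (B, T, T)
        _, weights = self.attn(h, h, h, need_weights=True)
        return weights

def attention_anomaly_score(weights):
    # Assumed BAS-like proxy: L1 distance between each trajectory's attention
    # map and the batch-mean map; a larger distance means a more atypical pattern.
    mean_map = weights.mean(dim=0, keepdim=True)    # (1, T, T)
    return (weights - mean_map).abs().sum(dim=(1, 2))  # (B,)

if __name__ == "__main__":
    torch.manual_seed(0)
    normal = torch.cumsum(torch.randn(31, 20, 2) * 0.1, dim=1)  # smooth random walks
    outlier = torch.rand(1, 20, 2) * 4.0 - 2.0                  # erratic jumping trajectory
    batch = torch.cat([normal, outlier], dim=0)

    model = TinyTrajectoryEncoder()
    with torch.no_grad():
        scores = attention_anomaly_score(model(batch))
    # With untrained weights the ranking is not guaranteed, but the erratic
    # trajectory (index 31) tends to produce the most atypical attention map.
    print("highest-scoring trajectory:", scores.argmax().item())

In the paper's setting the encoder would first be trained, unsupervised or semi-supervised, before its attention patterns are scored; the sketch skips training to keep the scoring idea visible.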