DualAttlog: Context aware dual attention networks for log-based anomaly detection.
- Author
- Yang H, Sun D, and Huang W
- Subjects
- Humans; Natural Language Processing; Algorithms; Neural Networks, Computer; Semantics; Attention/physiology
- Abstract
Most existing log-driven anomaly detection methods assume that logs are static and unchanging, which is often impractical. To address this, we propose a log anomaly detection model called DualAttlog. The model comprises word-level and sequence-level semantic encoding modules, together with a context-aware dual attention module. Specifically, the word-level semantic encoding module uses a self-matching attention mechanism to explore the interactive properties between words in log sequences; through word embedding and semantic encoding, it captures the associations and evolution between words, extracting local semantic information. The sequence-level semantic encoding module, in turn, encodes the entire log sequence with a pre-trained model, extracting global semantic information that captures overall patterns and trends in the logs. The context-aware dual attention module integrates these two levels of encoding, using contextual information to reduce redundancy and improve detection accuracy. Experimental results show that DualAttlog achieves an F1-score above 95% on 7 public datasets; it also reaches 82.35% on the Real-Industrial W dataset and 83.54% on the Real-Industrial Q dataset, outperforming existing baseline techniques on 9 datasets and demonstrating its significant advantages.
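The abstract does not include an implementation, but the fusion it describes can be illustrated with a minimal sketch. The PyTorch module below is a hypothetical reconstruction, not the authors' code: the name `DualAttentionFusion`, the 256-dimensional embeddings, the head count, and the use of `nn.MultiheadAttention` for both the word-level self-matching attention and the context-aware attention are all assumptions made for illustration. The idea it demonstrates: word-level token states attend to each other (local encoding), a pooled pre-trained sequence embedding queries those states (context-aware attention), and the two views are concatenated for classification.

```python
# Hypothetical sketch of the dual-attention fusion described in the abstract.
# All names, dimensions, and layer choices are illustrative assumptions,
# not the authors' implementation.
import torch
import torch.nn as nn

class DualAttentionFusion(nn.Module):
    """Fuses word-level and sequence-level log encodings with attention."""
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        # Word-level "self-matching" attention: tokens attend to each other.
        self.word_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Context-aware attention: the global sequence vector queries the
        # word-level states to weight locally relevant tokens.
        self.ctx_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Linear(2 * dim, 2)  # normal vs. anomalous

    def forward(self, word_emb: torch.Tensor, seq_emb: torch.Tensor):
        # word_emb: (batch, seq_len, dim) token embeddings of a log sequence
        # seq_emb:  (batch, dim) pooled output of a pre-trained encoder
        local, _ = self.word_attn(word_emb, word_emb, word_emb)
        query = seq_emb.unsqueeze(1)                 # (batch, 1, dim)
        ctx, _ = self.ctx_attn(query, local, local)  # (batch, 1, dim)
        fused = torch.cat([ctx.squeeze(1), seq_emb], dim=-1)
        return self.classifier(fused)                # anomaly logits

# Usage with random stand-in embeddings:
# logits = DualAttentionFusion()(torch.randn(8, 64, 256), torch.randn(8, 256))
```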
- Published
- 2024