A Teacher-Student Knowledge Distillation Framework for Enhanced Detection of Anomalous User Activity
- Resource Type
- Conference
- Authors
- Hsu, Chan; Ku, Chan-Tung; Wang, Yuwen; Hsieh, Minchen; Wu, Jun-Ting; Hsieh, Yunhsiang; Chang, PoFeng; Lu, Yimin; Kang, Yihuang
- Source
- 2023 IEEE 24th International Conference on Information Reuse and Integration for Data Science (IRI), 20-21 Aug. 2023
- Subject
- Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Engineering Profession; Robotics and Control Systems; Analytical models; Machine learning; Data science; Data models; Rough surfaces; Behavioral sciences; IEEE activities; anomaly detection; knowledge distillation; log analysis; multi-label classification
- Language
- ISSN
- 2835-5776
As information systems continuously produce high volumes of user event log data, efficient detection of anomalous activities indicative of insider threats becomes crucial. Typical supervised Machine Learning (ML) methods are labor-intensive and constrained by costly labeled data and unknown dependencies among anomaly types. Here we introduce a knowledge distillation ML framework that uses multiple binary classifiers as teacher models and a single multi-label model as the student. By leveraging the soft targets produced by the teacher models, we demonstrate that the student model significantly improves detection performance.
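The teacher-student setup described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the model choices (logistic-regression teachers, an MLP student), the synthetic log features, and all variable names are assumptions for demonstration. The essential pattern is that each teacher is a binary classifier for one anomaly label, and the student is one multi-output model trained to match the teachers' predicted probabilities (soft targets) rather than the hard 0/1 labels.

```python
# Hedged sketch of teacher-student distillation for multi-label anomaly
# detection. Teachers and student architectures are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n, d, k = 500, 20, 3                      # events, features, anomaly labels
X = rng.normal(size=(n, d))               # stand-in for log-derived features
# Synthetic multi-label ground truth (placeholder for labeled anomalies).
Y = (X[:, :k] + 0.1 * rng.normal(size=(n, k)) > 0).astype(int)

# Teachers: one binary classifier per anomaly label.
teachers = [LogisticRegression().fit(X, Y[:, j]) for j in range(k)]

# Soft targets: each teacher's predicted probability of the positive class.
soft_targets = np.column_stack(
    [t.predict_proba(X)[:, 1] for t in teachers]
)

# Student: a single multi-output model distilled on the soft targets,
# so it learns all labels jointly from the teachers' probability estimates.
student = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500,
                       random_state=0).fit(X, soft_targets)

# Multi-label scores from the student, clipped into [0, 1] and thresholded.
scores = np.clip(student.predict(X), 0.0, 1.0)
preds = (scores >= 0.5).astype(int)
```

In a real deployment the soft targets would be produced on held-out or unlabeled log data, and the student's distillation loss would typically combine the soft targets with any available hard labels.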