In recent years, with growing awareness of privacy protection, federated learning has received wide attention as a distributed machine learning paradigm and has been applied to intrusion detection. However, traditional federated learning algorithms perform poorly in this setting because network traffic data from IoT devices are non-independent and identically distributed (non-IID). To address this problem, we propose a novel federated learning algorithm called FedGLCD. The algorithm mitigates both local and global drift by combining global distillation with local self-distillation, significantly improving model performance. Specifically, FedGLCD dynamically incorporates local historical knowledge and global knowledge into softened labels that guide local model updates. We have conducted extensive experiments on the N-BaIoT dataset, and the results show that FedGLCD achieves better performance than existing methods on intrusion detection tasks.
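To illustrate the softened-label idea described above, the following is a minimal sketch (not the paper's implementation): the ground-truth one-hot label is blended with temperature-scaled predictions from the local historical model and the global model, and the resulting soft target drives a distillation-style cross-entropy loss. The weights `alpha`, `beta` and temperature `T` are illustrative hyperparameters assumed here, not values from FedGLCD.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis; higher T softens
    # the distribution, exposing "dark knowledge" in non-target classes.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def softened_labels(y_onehot, local_logits, global_logits,
                    alpha=0.3, beta=0.3, T=2.0):
    # Blend the hard label with knowledge from the local historical
    # model (self-distillation) and the global model (global
    # distillation). alpha/beta control how much each source contributes.
    p_local = softmax(local_logits, T)
    p_global = softmax(global_logits, T)
    return (1 - alpha - beta) * y_onehot + alpha * p_local + beta * p_global

def distillation_loss(student_logits, soft_targets, T=2.0):
    # Cross-entropy between the softened targets and the student's
    # temperature-scaled predictions (averaged over the batch).
    p = softmax(student_logits, T)
    return float(-np.sum(soft_targets * np.log(p + 1e-12), axis=-1).mean())
```

In a federated round, each client would compute `softened_labels` from its own previous model and the latest global model, then minimize `distillation_loss` locally before sending updates to the server.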