Knowledge distillation-based lightweight network for power scenarios inspection
- Resource Type
- Conference
- Authors
- Chen, Nuotian; Mao, Jianxu; Peng, Ziyang; Yi, Junfei; Tao, Ziming; Wang, Yaonan
- Source
- 2023 China Automation Congress (CAC), Nov. 2023, pp. 3990-3996
- Subject
- Aerospace; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Power, Energy and Industry Applications; Robotics and Control Systems; Signal Processing and Analysis; Transportation; Knowledge engineering; Automation; Computational modeling; Object detection; Inspection; Feature extraction; Robustness; knowledge distillation; power scenarios; model lightweight; feature distillation module; logit distillation module
- Language
- English
- ISSN
- 2688-0938
To address the problem that target detection networks in power scenarios require a large number of parameters and complex structures to ensure accuracy, a knowledge distillation method is proposed that transfers intermediate feature knowledge and output logit knowledge from a teacher model to a student model. The distilled student model achieves high accuracy while retaining the advantages of fewer parameters and faster inference. GFL-Res18 is used as the student model and GFL-Res101 as the teacher model. The distilled student model achieves an accuracy of 78.8% with 19.1M parameters, an inference speed of 24.9 img/s, and a computational cost of 155.2 GFLOPs. Experimental results show that our distillation method achieves the highest student-model accuracy compared with other distillation methods.
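The abstract describes a two-part transfer: a feature distillation module acting on intermediate features and a logit distillation module acting on outputs. The record does not give the paper's actual loss formulation, so the following is only a minimal generic sketch of such a combined distillation loss in PyTorch; the temperature, the loss weights `alpha` and `beta`, the channel sizes, and the 1x1 adapter convolution are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of combined feature + logit distillation (assumed design,
# not the paper's exact modules).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DistillationLoss(nn.Module):
    def __init__(self, temperature=4.0, alpha=0.5, beta=0.5,
                 student_channels=256, teacher_channels=512):
        super().__init__()
        self.temperature = temperature  # softens logits before the KL term
        self.alpha = alpha              # weight of the logit-distillation term
        self.beta = beta                # weight of the feature-distillation term
        # 1x1 conv projects student features to the teacher's channel width
        self.adapter = nn.Conv2d(student_channels, teacher_channels,
                                 kernel_size=1)

    def forward(self, s_logits, t_logits, s_feat, t_feat):
        # Logit distillation: KL divergence between temperature-softened
        # student and teacher class distributions.
        T = self.temperature
        logit_loss = F.kl_div(
            F.log_softmax(s_logits / T, dim=-1),
            F.softmax(t_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Feature distillation: MSE between adapted student features and
        # detached teacher features (no gradient flows into the teacher).
        feat_loss = F.mse_loss(self.adapter(s_feat), t_feat.detach())
        return self.alpha * logit_loss + self.beta * feat_loss
```

In a typical setup, this loss would be added to the student detector's ordinary detection loss during training, with the teacher (here GFL-Res101) run in evaluation mode to produce `t_logits` and `t_feat`.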