Prune it Yourself: Automated Pruning by Multiple Level Sensitivity
- Resource Type
- Conference
- Authors
- Yan, Zhaoyi; Xing, Peiyin; Wang, Yaowei; Tian, Yonghong
- Source
- 2020 IEEE Conference on Multimedia Information Processing and Retrieval (MIPR), pp. 73-78, Aug. 2020
- Subject
- Communication, Networking and Broadcast Technologies; Computing and Processing; Geoscience; Robotics and Control Systems; Signal Processing and Analysis; Sensitivity; Indexes; Quantization (signal); Mathematical model; Neural networks; Computational modeling; Acceleration; deep neural network; model compression; pruning
- Language
Deep neural network pruning reduces model size by removing redundant structures and weights. Existing methods focus on single-layer information and ignore the influence of other layers, and the pruning process simply removes all selected weights at the same time. To address these limitations, we propose the Prune it Yourself (PIY) framework. First, we collect both filter and channel sensitivity information; then we combine the two to decide which structures to prune; finally, we apply a gradual pruning algorithm that reduces accuracy loss without introducing extra hyper-parameters. We evaluate PIY with VGG-16 and ResNet on CIFAR-10 and ImageNet, and the experimental results demonstrate the effectiveness of our method.
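To make the gradual-pruning idea in the abstract concrete, here is a minimal sketch of gradual magnitude pruning in plain numpy. This is an illustrative assumption, not the paper's exact algorithm: the polynomial sparsity schedule and the `sparsity_at`/`prune_weights` helpers are hypothetical stand-ins for the ramp-up behavior the abstract describes (removing weights progressively rather than all at once).

```python
import numpy as np

def sparsity_at(step, total_steps, final_sparsity):
    """Hypothetical gradual schedule: sparsity ramps smoothly
    from 0 at step 0 up to final_sparsity at total_steps."""
    frac = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - frac) ** 3)

def prune_weights(weights, sparsity):
    """Zero out the smallest-magnitude fraction of the weights."""
    w = weights.copy()
    k = int(sparsity * w.size)
    if k > 0:
        # k-th smallest absolute value becomes the pruning threshold
        threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
        w[np.abs(w) <= threshold] = 0.0
    return w

# Usage: prune a random weight tensor a little more at each step.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
for step in range(0, 101, 25):
    s = sparsity_at(step, 100, final_sparsity=0.9)
    pruned = prune_weights(w, s)
```

In practice the pruned model would be fine-tuned between schedule steps so the remaining weights can compensate, which is how gradual pruning limits accuracy loss compared with one-shot removal.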