Neural network pruning plays an important role in deploying networks on resource-constrained devices by reducing network size and computational cost. However, existing pruning methods consider only the richness of the information that filters capture, not how that information is distributed. In image classification, information related to the target region is particularly important. To address these limitations, we propose HCov, which prunes filters that generate low-covariance feature maps. The underlying principle is that most feature maps generated by filters contain target-region information; maps with low covariance therefore contain either very little information or cluttered background information unrelated to the target, so the filters that generate them can be pruned with little accuracy loss. HCov computes the covariance between feature maps in the same layer and removes the filters whose feature maps have low covariance. Experiments on both single-branch and multi-branch networks show that HCov prunes more redundant filters while maintaining better accuracy. Notably, our method reduces the parameters of ResNet-110 by 68.6% and its FLOPs by 71.7% with only a 0.26% top-1 accuracy drop on CIFAR-10. On ResNet-50, we achieve a 44.7% FLOPs reduction by removing 40.8% of the parameters with only a 0.62% top-1 accuracy loss on ImageNet, advancing the state of the art.
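The covariance-based criterion can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the scoring rule (mean absolute covariance of a filter's feature map with the other maps in the layer), the pruning ratio handling, and the function names `hcov_scores` and `filters_to_prune` are all assumptions made for the example.

```python
import numpy as np

def hcov_scores(feature_maps: np.ndarray) -> np.ndarray:
    """Score each filter by the covariance of its feature map.

    feature_maps: array of shape (C, H, W), one map per filter for a
    single input. Returns one score per filter; low scores mark
    pruning candidates. (Illustrative rule, not the paper's exact one.)
    """
    C = feature_maps.shape[0]
    flat = feature_maps.reshape(C, -1)      # flatten each map: (C, H*W)
    cov = np.cov(flat)                      # (C, C) covariance matrix
    abs_cov = np.abs(cov)
    # Mean absolute covariance with the other C-1 maps (drop the diagonal).
    off_diag = abs_cov - np.diag(np.diag(abs_cov))
    return off_diag.sum(axis=1) / (C - 1)

def filters_to_prune(feature_maps: np.ndarray, ratio: float) -> np.ndarray:
    """Indices of the lowest-scoring filters at the given pruning ratio."""
    scores = hcov_scores(feature_maps)
    k = int(len(scores) * ratio)
    return np.argsort(scores)[:k]
```

For example, a filter whose output map is nearly constant (or pure background noise uncorrelated with the other maps) receives a near-zero score and is selected first for removal.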