With the development of convolutional neural networks, attention mechanisms have been successfully applied to object detection, object classification, and other vision tasks, and have significantly improved performance in these settings. To explore the potential of attention for face detection, this paper takes FaceBoxes as the baseline and studies the effect of several well-known attention modules, such as SE, CBAM, and ECA. Based on these results, an advanced attention module (AAM) built on CBAM is proposed and evaluated on the WIDER FACE validation set, where it outperforms the above attention modules. Compared with the baseline, AAM improves accuracy by 2.0%, 3.0%, and 2.8% on the Easy, Medium, and Hard subsets, respectively, and surpasses CBAM by 0.1%, 0.9%, and 1.1%, which demonstrates the effectiveness of AAM without increasing model size.
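To make the channel-attention idea behind SE (and the channel branch of CBAM) concrete, the sketch below implements SE-style gating in plain NumPy: squeeze each channel by global average pooling, excite through two small fully connected layers with a reduction ratio, then rescale the channels by the resulting sigmoid weights. This is a minimal illustration, not the paper's AAM; the weight matrices `w1` and `w2` are random placeholders for trained parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_attention(x, w1, w2):
    """SE-style channel attention on a feature map x of shape (C, H, W).

    Squeeze: global average pooling over the spatial dimensions.
    Excite:  FC (C -> C/r) + ReLU, then FC (C/r -> C) + sigmoid.
    Rescale: multiply each channel by its learned attention weight.
    """
    z = x.mean(axis=(1, 2))                      # squeeze -> (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))    # excite  -> (C,) in (0, 1)
    return x * s[:, None, None]                  # per-channel rescaling

# Toy example: C=4 channels, reduction ratio r=2, random stand-in weights.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
w1 = rng.standard_normal((2, 4))   # C -> C/r
w2 = rng.standard_normal((4, 2))   # C/r -> C
y = se_attention(x, w1, w2)
print(y.shape)  # same shape as the input: (4, 8, 8)
```

Because the gate is a single scalar per channel, the module adds only two tiny fully connected layers, which is why such attention blocks cost little in parameters relative to the backbone.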