Intra prediction plays a critical role in reducing spatial redundancy in video coding. To further improve intra-prediction accuracy, we propose a Neuron Attention-based Convolutional Neural Network (NACNN) that enhances the luma quality of the current coding tree unit (CTU). NACNN explicitly accounts for the influence of different quantization parameters on video coding and introduces a lightweight multi-scale neuron attention mechanism that improves the quality of reconstructed CTUs while keeping network complexity low. Experimental results show that, compared with H.266/Versatile Video Coding (VVC), the proposed method achieves average BD-rate savings of 1.24%, 0.34%, and 0.16% for the Y, Cb, and Cr components, respectively.
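
To make the idea of a lightweight multi-scale neuron attention gate concrete, the following is a minimal NumPy sketch. It is an illustrative assumption, not the paper's actual NACNN layer: the abstract does not specify the mechanism, so this sketch uses a squeeze-and-excitation-style channel gate driven by two pooled descriptors (global average and global max) as stand-ins for the "multi-scale" aspect; the function name, weight shapes, and reduction ratio `r` are all hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multiscale_neuron_attention(feat, w_down, w_up):
    """Hypothetical neuron-attention gate (not the paper's exact layer).

    feat:   (C, H, W) reconstructed-CTU feature map
    w_down: (C//r, C) bottleneck projection (r = reduction ratio)
    w_up:   (C, C//r) expansion back to C channels
    """
    # Two descriptor "scales": global average and global max pooling.
    avg = feat.mean(axis=(1, 2))   # (C,)
    mx = feat.max(axis=(1, 2))     # (C,)
    # Shared lightweight bottleneck MLP (ReLU), applied to each descriptor,
    # followed by a sigmoid to produce per-channel attention weights in (0, 1).
    gate = sigmoid(w_up @ np.maximum(w_down @ avg, 0.0)
                   + w_up @ np.maximum(w_down @ mx, 0.0))  # (C,)
    # Rescale each channel ("neuron") of the feature map by its weight.
    return feat * gate[:, None, None]

rng = np.random.default_rng(0)
c, r = 8, 2
feat = rng.standard_normal((c, 16, 16))
w_down = rng.standard_normal((c // r, c)) * 0.1
w_up = rng.standard_normal((c, c // r)) * 0.1
out = multiscale_neuron_attention(feat, w_down, w_up)
print(out.shape)
```

The gate costs only two small matrix-vector products per descriptor, which is consistent with the abstract's emphasis on keeping network complexity low; a QP-conditioned variant could, for example, concatenate the quantization parameter to the pooled descriptors.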