Next-item recommendation extracts critical information from a user's historical behavior sequence to predict the user's next action. To better capture user interests, some sequential recommendation methods employ position-aware attention networks to model users' general intentions. Although these methods achieve strong performance, they cannot effectively extract core information from historical behavior sequences, such as position weights, users' dynamic categories, and users' dynamic preferences. Position information in the historical sequence assists the modeling of user interest, and a user's dynamic category helps identify the user's major intention. Moreover, capturing users' dynamic preferences helps the model learn how user interest evolves and thus make better recommendations. Therefore, this paper proposes a Position-Category-Aware Attention Network (PCAN) that accounts for all three factors. First, the model obtains each user's dynamic category in the data-preprocessing stage. Then, a long-term attention module captures the interactions between users and items over the long-term behavior sequence, yielding a better representation of users' long-term preferences. Meanwhile, the model applies self-attention to extract users' short-term interest features. Finally, the two kinds of preference representation are adaptively fused through an attention-based method. Experimental results on five public Amazon datasets show that PCAN outperforms existing methods on AUC, Precision, and Recall, demonstrating the superiority of the proposed approach.
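The architecture described above (long-term attention pooling, a self-attention short-term encoder, and attention-based adaptive fusion) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the module name `PreferenceFusion`, the additive scoring layers, and all dimensions are assumptions for exposition.

```python
import torch
import torch.nn as nn


class PreferenceFusion(nn.Module):
    """Hypothetical sketch of PCAN-style preference extraction and fusion."""

    def __init__(self, dim: int, n_heads: int = 2):
        super().__init__()
        # Long-term module: additive attention pooling over the full history
        # (an assumed stand-in for the paper's long-term attention module).
        self.long_score = nn.Linear(dim, 1)
        # Short-term module: self-attention over the recent sub-sequence.
        self.short_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        # Adaptive fusion: attention weights over the two representations.
        self.fuse_score = nn.Linear(dim, 1)

    def forward(self, long_seq: torch.Tensor, short_seq: torch.Tensor) -> torch.Tensor:
        # long_seq:  (B, L, D) embeddings of the whole behavior sequence
        # short_seq: (B, S, D) embeddings of the most recent behaviors
        w = torch.softmax(self.long_score(long_seq), dim=1)        # (B, L, 1)
        p_long = (w * long_seq).sum(dim=1)                         # (B, D)
        h, _ = self.short_attn(short_seq, short_seq, short_seq)    # (B, S, D)
        p_short = h.mean(dim=1)                                    # (B, D)
        # Attention-based adaptive fusion of the two preference vectors.
        both = torch.stack([p_long, p_short], dim=1)               # (B, 2, D)
        a = torch.softmax(self.fuse_score(both), dim=1)            # (B, 2, 1)
        return (a * both).sum(dim=1)                               # (B, D)
```

In this sketch the fusion weights are learned per user, so the model can lean on long-term preferences for stable users and on short-term signals when recent behavior diverges from the history.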