Chinese named entity recognition (CNER) is a challenging task, because Chinese exhibits flexible part-of-speech usage and has no explicit delimiters between words. ALBERT is a pre-trained language representation model proposed by Google that has been widely adopted because of its strong performance. This work proposes a new neural network model that combines ALBERT, a bidirectional LSTM (BiLSTM), a CNN, and a CRF to recognize Chinese named entities. The model first obtains character-level embeddings from ALBERT, then feeds these embeddings into the CNN and the BiLSTM to capture local features and long-range context, respectively, and finally decodes the label sequence with the CRF. In experimental evaluations on real datasets, the proposed model outperforms recent approaches such as Lattice-LSTM-CRF and the Collaborative Graph Network.
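The following is a minimal PyTorch sketch of the described pipeline, not the authors' implementation: it assumes the Hugging Face `transformers` library for ALBERT, the `pytorch-crf` package for the CRF layer, and illustrative hyperparameters (hidden sizes, kernel width, checkpoint name) that are not taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import AlbertModel
from torchcrf import CRF  # pip install pytorch-crf


class AlbertCnnBilstmCrf(nn.Module):
    """Sketch of an ALBERT + CNN + BiLSTM + CRF tagger (illustrative only)."""

    def __init__(self, albert_name: str = "albert-base-v2",  # placeholder; a Chinese ALBERT checkpoint would be used in practice
                 num_tags: int = 9, lstm_hidden: int = 128,
                 cnn_channels: int = 128, kernel_size: int = 3):
        super().__init__()
        # ALBERT produces character-level contextual embeddings
        self.albert = AlbertModel.from_pretrained(albert_name)
        emb_dim = self.albert.config.hidden_size

        # CNN branch captures local features around each character
        self.cnn = nn.Conv1d(emb_dim, cnn_channels, kernel_size,
                             padding=kernel_size // 2)

        # BiLSTM branch captures long-range context in both directions
        self.bilstm = nn.LSTM(emb_dim, lstm_hidden, batch_first=True,
                              bidirectional=True)

        # Project concatenated local + global features to tag emission scores
        self.emission = nn.Linear(cnn_channels + 2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def _features(self, input_ids, attention_mask):
        emb = self.albert(input_ids=input_ids,
                          attention_mask=attention_mask).last_hidden_state
        local = torch.relu(self.cnn(emb.transpose(1, 2))).transpose(1, 2)
        global_, _ = self.bilstm(emb)
        return self.emission(torch.cat([local, global_], dim=-1))

    def loss(self, input_ids, attention_mask, tags):
        # CRF returns log-likelihood; negate it to obtain a training loss
        emissions = self._features(input_ids, attention_mask)
        return -self.crf(emissions, tags, mask=attention_mask.bool())

    def decode(self, input_ids, attention_mask):
        # Viterbi decoding of the most likely label sequence
        emissions = self._features(input_ids, attention_mask)
        return self.crf.decode(emissions, mask=attention_mask.bool())
```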