Simple molecular graphs and molecular line notations are insufficient for molecular representation learning models that aim to automatically learn molecule representations and acquire deep semantic features of chemistry. We therefore propose the Morgan Fingerprint Graph, a novel molecular graph model whose nodes are Morgan fingerprint identifiers incorporating various properties of each atom's neighborhood. Based on the Morgan Fingerprint Graph, we trained a BERT-based molecular representation model, MFGB, which employs a local rather than global attention module to adapt the model to the Morgan Fingerprint Graph. The attention module of MFGB regulates the exchange of information in the attention layers according to the connectivity of the atoms in each molecule. To verify the effectiveness of MFGB, we evaluated it on several mainstream datasets. The results demonstrated that our method improves the performance of molecular property prediction. In addition, we analyzed MFGB from both the over-smoothing and the attention perspectives. The analysis indicates that the node representations learned from the Morgan Fingerprint Graph are generally less over-smoothed, and that the attention distributions of MFGB match the partial charge distributions of the molecules.
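The abstract does not specify how the local attention module is implemented; a common way to realize "local" attention over a molecular graph is to mask the attention scores so that each node attends only to its bonded neighbors (and itself). The following is a minimal sketch under that assumption, using a toy adjacency matrix rather than the actual Morgan Fingerprint Graph; the function name and masking scheme are illustrative, not the authors' implementation.

```python
import numpy as np

def local_attention(x, adj):
    """Single-head scaled dot-product attention restricted to graph
    neighbors: scores between unconnected nodes are masked out before
    the softmax, so each node exchanges information only with its
    bonded neighbors (and itself)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)               # (n, n) pairwise scores
    mask = adj + np.eye(adj.shape[0])           # keep edges + self-loops
    scores = np.where(mask > 0, scores, -1e9)   # block non-neighbors
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x                          # neighborhood-mixed features

# Toy 3-node path graph 0-1-2 (atoms 0 and 2 are not bonded).
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
x = np.random.default_rng(0).normal(size=(3, 4))  # node feature vectors
out = local_attention(x, adj)
print(out.shape)  # (3, 4)
```

Because unconnected pairs receive a large negative score before the softmax, their attention weight collapses to (effectively) zero, which is one way to make the information flow follow the molecular bond structure as the abstract describes.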