Knowledge Graph Embedding (KGE) aims to learn continuous vector representations of the entities and relations in a Knowledge Graph (KG). Inspired by the R-GCN model, we propose a novel embedding learning model named RotatGAT, which combines the RotatE and GAT models. The goal is to overcome two shortcomings of R-GCN: its relatively high computational complexity and its inability to distinguish the importance of different neighbors. We introduce the RotatE model into RotatGAT to represent the embeddings of heterogeneous entities and relations in a KG. Because RotatE cannot exploit the graph structure when learning entity embeddings, we introduce the GAT model to learn the importance of an entity's neighbors and aggregate their feature information for graph embedding learning. Link prediction experiments on four benchmark datasets show that the overall performance of RotatGAT surpasses existing state-of-the-art models.
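To make the RotatE component concrete, the sketch below illustrates its standard scoring idea: each relation is modeled as a rotation in the complex plane (unit-modulus complex numbers), and a triple (h, r, t) is scored by the negative distance between the rotated head and the tail. The function name and toy vectors are illustrative, not from the paper.

```python
import numpy as np

def rotate_score(head, relation_phase, tail):
    """Negative L1 distance ||h o r - t||, the RotatE-style score.

    head, tail: complex embedding vectors.
    relation_phase: real phase vector; the relation acts as the
    element-wise rotation exp(i * phase), which has unit modulus.
    Higher (closer to 0) scores indicate more plausible triples.
    """
    r = np.exp(1j * relation_phase)  # unit-modulus complex rotation
    return -np.sum(np.abs(head * r - tail))

# Toy example: the tail is exactly the head rotated by the relation,
# so a true triple scores (numerically) zero.
h = np.array([1 + 0j, 0 + 1j])
phase = np.array([np.pi / 2, np.pi])
t = h * np.exp(1j * phase)

true_score = rotate_score(h, phase, t)
corrupted_score = rotate_score(h, phase, -t)  # corrupted tail scores worse
```

This rotation view is what lets RotatE model relation patterns such as symmetry, inversion, and composition, but the score depends only on the triple itself, which is why the paper adds GAT-style attention to bring in neighborhood structure.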