University course texts are characterized by dispersed knowledge points, weak inter-sentence relevance, high text density, and long sentences. Traditional relation extraction methods fail to capture contextual semantic relationships well, so entity-relation extraction results are unsatisfactory. This paper proposes a relation extraction method based on an improved BERT-BiGRU-RAtt model. First, the BERT pre-trained language model is used to obtain dynamic word vectors and extract character-level features. Second, the vectorized text is fed into a relation-word attention layer, which assigns different weights to different characters. Then, a Bidirectional Gated Recurrent Unit (BiGRU) encodes the information of each sentence to obtain sentence-level features. Finally, an attention mechanism performs feature learning over all the information again to obtain the final text representation. Experiments on a manually annotated dataset of the university course 'Data Structure' yielded satisfactory results: a precision of 84.61%, a recall of 89.34%, and an F1 score of 84.35%. Compared with the BERT-BiLSTM-Att and BERT-BiGRU-Att models, precision improved by 3.4% and 2.1%, recall by 6.2% and 3.9%, and F1 by 2.9% and 3.2%, respectively. The proposed model is effective and has practical application value.
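The four-stage pipeline described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: random embeddings stand in for BERT outputs, the weight matrices are untrained, and all dimensions, the relation-word query `q_rel`, and the pooling query `q_final` are hypothetical stand-ins chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class GRUCell:
    """Minimal GRU cell: update gate z, reset gate r, candidate state."""
    def __init__(self, d_in, d_h):
        s = 1.0 / np.sqrt(d_h)
        self.Wz = rng.uniform(-s, s, (d_in + d_h, d_h))
        self.Wr = rng.uniform(-s, s, (d_in + d_h, d_h))
        self.Wh = rng.uniform(-s, s, (d_in + d_h, d_h))

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(xh @ self.Wz)
        r = sigmoid(xh @ self.Wr)
        h_tilde = np.tanh(np.concatenate([x, r * h]) @ self.Wh)
        return (1 - z) * h + z * h_tilde

def bigru(seq, fwd, bwd, d_h):
    """Run forward and backward GRUs; concatenate states per timestep."""
    T = len(seq)
    hf, hb = np.zeros(d_h), np.zeros(d_h)
    outs_f, outs_b = [], [None] * T
    for t in range(T):
        hf = fwd.step(seq[t], hf)
        outs_f.append(hf)
    for t in reversed(range(T)):
        hb = bwd.step(seq[t], hb)
        outs_b[t] = hb
    return np.stack([np.concatenate([f, b]) for f, b in zip(outs_f, outs_b)])

def attention(H, q):
    """Score each timestep against query q; return weighted sum and weights."""
    w = softmax(H @ q)
    return w @ H, w

# Toy input: T characters with d-dim "BERT" embeddings (random stand-ins).
T, d, d_h = 6, 8, 5
X = rng.normal(size=(T, d))

# 1) Relation-word attention: reweight characters before encoding.
q_rel = rng.normal(size=d)            # hypothetical relation-word query
alpha = softmax(X @ q_rel)
X_weighted = X * alpha[:, None]

# 2) BiGRU encoding of the reweighted character sequence.
H = bigru(X_weighted, GRUCell(d, d_h), GRUCell(d, d_h), d_h)

# 3) Final attention pooling -> sentence vector for relation classification.
q_final = rng.normal(size=2 * d_h)
sent_vec, weights = attention(H, q_final)
print(H.shape, sent_vec.shape)
```

In a real system, `sent_vec` would be passed to a softmax classifier over relation types, and all weights (including the attention queries) would be learned end-to-end.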