What is gained when midterm oral exams are implemented in the undergraduate engineering classroom? This research paper examines whether midterm oral exam scores add value above and beyond midterm written exam scores in predicting students' final written exam scores. The purpose of this study is to evaluate the potential utility of oral exams as formative assessments: if oral exam scores provide additional information beyond written exam scores, they may add meaningful value for students and instructors. The current study investigates this question using data from 10 undergraduate engineering classes (N = 925), representing 6 different courses and 5 different instructors. Though course and exam context differed, all classes implemented a low-stakes midterm oral exam, a midterm written exam, and a final written exam. We compared two multiple regression models: a reduced model with only the midterm written exam score as a predictor of the final written exam score, and a full model with both the midterm written exam score and the midterm oral exam score as predictors. We found that the full model including the oral exam score was a better fit for our data, indicating that oral exams explain more variance in students' final exam performance than midterm written exams alone. Further analyses tentatively indicate that the granularity of the rubric used to score oral exams matters, with finer-grained rubrics more consistently providing predictive value. This study has implications for developing a theory of oral exams, as it leaves room for the possibility that oral exams tap deeper learning processes than written exams. These results show that oral exams provide actionable information that instructors can use to intervene and foster students' meaningful learning before the end of the term. The quantitative analysis also provides instructors with a simple statistical measure to assess the role of oral exams in students' learning.
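The nested-model comparison described above can be sketched as follows. This is a minimal illustration, not the study's analysis: the scores are synthetic, and the variable names (`written`, `oral`, `final`) and simulated relationships are assumptions chosen only to show the mechanics of a partial F-test between a reduced and a full regression model.

```python
import numpy as np

# Synthetic illustration (NOT the study's data): simulate midterm and
# final scores for n students, with the final depending on both midterms.
rng = np.random.default_rng(0)
n = 925
written = rng.normal(70, 10, n)
oral = 0.5 * written + rng.normal(35, 8, n)
final = 0.6 * written + 0.3 * oral + rng.normal(0, 5, n)

def fit_ols(X, y):
    """Least-squares fit with intercept; return coefficients and RSS."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = float(np.sum((y - X1 @ beta) ** 2))
    return beta, rss

# Reduced model: final ~ written
_, rss_reduced = fit_ols(written[:, None], final)
# Full model: final ~ written + oral
_, rss_full = fit_ols(np.column_stack([written, oral]), final)

# Partial F-statistic for the nested comparison:
# F = ((RSS_reduced - RSS_full) / q) / (RSS_full / (n - p - 1))
q = 1  # number of extra predictors in the full model
p = 2  # number of predictors in the full model
F = ((rss_reduced - rss_full) / q) / (rss_full / (n - p - 1))
print(F)  # a large F indicates the oral exam adds explanatory value
```

If the full model fits no better than the reduced model, F stays near 1; a large F (compared against an F(q, n - p - 1) distribution) indicates the oral exam score explains additional variance in the final exam score.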