In the information era, data plays a crucial role, especially in the assessment of competition entries. However, large-scale innovation competitions face challenges such as broad and repetitive evaluation indices, incomplete coverage, and ambiguous evaluation standards. To address these issues, this paper proposes a three-stage evaluation framework: work distribution, evaluation score adjustment, and innovative work selection. In the work distribution stage, a multi-constraint optimization model is formulated to minimize cross-degree variance, and the distribution scheme is optimized with an enhanced particle swarm algorithm. In the scoring stage, three adjustment methods are employed: a two-stage standard-value adjustment combining in-group and out-group scores, a standard-value adjustment based on judging experience, and a standard-value adjustment based on distribution distance. Together, these methods make the scoring process reasonable, fair, and accurate. In the innovative work selection stage, a standardized extreme-deviation model with integral improvement and an integral extreme-deviation threshold model are introduced to strengthen the judging process's ability to recognize highly innovative works. By implementing this evaluation framework, we aim to address the current challenges of evaluating entries in large-scale innovation competitions. The framework is expected to improve the accuracy and comprehensiveness of evaluation indices, thereby fostering scientific research and innovation.
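To make the work-distribution objective concrete, the sketch below illustrates one plausible reading of "cross-degree variance": the cross-degree of a pair of judges is the number of works they both review, and the model seeks assignments that equalize these counts. The assignment encoding, the toy random search (standing in for the paper's enhanced particle swarm algorithm), and all function names here are illustrative assumptions, not the authors' implementation.

```python
import itertools
import random

import numpy as np


def cross_degree_variance(assignment, n_judges):
    """Variance of pairwise cross-degrees for a work-to-judges assignment.

    assignment: one iterable of judge indices per work (the judges who
    review that work). The cross-degree of a judge pair is how many works
    both of them review; lower variance means a more balanced distribution.
    """
    cross = np.zeros((n_judges, n_judges))
    for judges in assignment:
        for a, b in itertools.combinations(sorted(judges), 2):
            cross[a, b] += 1
    pair_counts = cross[np.triu_indices(n_judges, k=1)]
    return float(np.var(pair_counts))


def random_assignment(n_works, n_judges, k, rng):
    """Assign each work to k distinct judges, chosen uniformly at random."""
    return [rng.sample(range(n_judges), k) for _ in range(n_works)]


# Toy baseline: keep the best of many random assignments. The paper's
# enhanced particle swarm algorithm would search this same space far
# more effectively; this loop only demonstrates the objective.
rng = random.Random(0)
candidates = (random_assignment(n_works=20, n_judges=6, k=3, rng=rng)
              for _ in range(200))
best = min(candidates, key=lambda a: cross_degree_variance(a, n_judges=6))
print(cross_degree_variance(best, n_judges=6))
```

A metaheuristic such as particle swarm optimization would replace the random sampling with guided updates toward low-variance assignments while respecting the model's other constraints (e.g., per-judge workload limits).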