In recent years, domain adaptation methods have achieved notable success in cross-domain unsupervised fault diagnosis. To reduce distribution discrepancies, most existing multi-classifier minimax optimization methods employ traditional metrics such as the L1 distance, L2 distance, or Wasserstein distance to measure the disagreement among multiple classifiers, but three disadvantages remain: (1) the disagreement among the classifiers is optimized only by maximizing the distance metric; (2) the category confusion (prediction uncertainty) of each classifier cannot be optimized when the metric is minimized; (3) the prediction uncertainty of the classifiers is excessively amplified when the metric is maximized. Therefore, this paper proposes a novel uncertainty correlation (UC) metric within a gradual inference multi-classifier structure to improve domain adaptation for cross-location fault diagnosis. The UC metric is constructed from class correlation and an uncertainty weight, and is applied to the proposed gradual inference multi-classifier structure for minimax training (UC-GI). First, the difference in recognized information from the bottom to the top of the gradual inference structure constitutes the original disagreement among the multiple classifiers. Second, when the UC metric is maximized, each classifier performs multi-category information trade-offs. Third, when the metric is minimized to unify the predictions of the classifiers, UC-GI also optimizes the prediction uncertainty of each classifier, thereby reducing category confusion. Finally, building on UC-GI, a novel class-level distribution alignment method (UC-GIDA) is constructed by incorporating classic domain adversarial learning into the gradual inference structure. Extensive comparisons and evaluations verify the effectiveness and advantages of the proposed method.
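The abstract does not give the exact form of the UC metric, but the idea of combining a class-correlation term with an uncertainty weight can be illustrated with a minimal sketch. The sketch below is an assumption, not the authors' formula: it takes the logits of two classifiers, measures their per-sample agreement as the inner product of their softmax probability vectors, and weights each sample's disagreement by its normalized prediction entropy, so that maximizing the metric emphasizes uncertain (confused) samples while minimizing it both unifies the predictions and suppresses their uncertainty. All function names here are hypothetical.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def entropy_weight(p, eps=1e-12):
    # Uncertainty weight: per-sample entropy, normalized to [0, 1]
    # by dividing by log(num_classes).
    h = -(p * np.log(p + eps)).sum(axis=1)
    return h / np.log(p.shape[1])

def uc_metric(logits_a, logits_b):
    """Illustrative uncertainty-correlation discrepancy between two
    classifiers (an assumption; the paper's UC metric may differ)."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    # Class correlation: inner product of the two probability vectors
    # per sample; close to 1 when both agree confidently on one class.
    corr = (pa * pb).sum(axis=1)
    # Average the two classifiers' uncertainty weights so confused
    # samples contribute more to the disagreement term.
    w = 0.5 * (entropy_weight(pa) + entropy_weight(pb))
    return float((w * (1.0 - corr)).mean())
```

In a minimax scheme of the kind the abstract describes, the classifiers would be trained to maximize such a metric on target-domain data while the feature extractor minimizes it; when two classifiers agree confidently the value is near zero, and it grows as their predictions diverge or become uncertain.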