The dual oxygen channels of the FY3D satellite offer a novel approach to cloud detection. However, the inherent relationship between the brightness temperature observations of these channels is not necessarily linear. Machine learning algorithms, fortunately, have demonstrated strong capability in modeling nonlinear relationships. This paper examines the principles of Gaussian process regression (GPR), the Extreme Learning Machine (ELM), and the BP artificial neural network (BP-ANN). First, we establish a new brightness-temperature relationship function by examining the channel-matching relationship between the MWTS-2 temperature channels and the MWHS-2 dual oxygen channels. We then investigate the application potential of the three nonlinear intelligent algorithms. The results show that the mean error of the GPR algorithm is only 25% of that of the ELM algorithm and is comparable to that of the BP-ANN algorithm, while its RMSE is only 2% of the BP-ANN algorithm's. The GPR algorithm therefore performs particularly well in simulating the nonlinear brightness-temperature relationship function. This study offers a new perspective for microwave cloud detection with China's domestic satellites.
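To make the GPR idea concrete, the sketch below fits a minimal Gaussian process regression (RBF kernel, closed-form posterior) to synthetic data standing in for a nonlinear brightness-temperature relationship. The data, kernel hyperparameters, and function names are illustrative assumptions, not the paper's actual channel data or implementation.

```python
import numpy as np

def gpr_predict(x_train, y_train, x_test,
                length_scale=10.0, sigma_f=2.0, sigma_n=0.3):
    """Minimal GPR: RBF kernel, exact posterior mean and std (1-D inputs)."""
    def rbf(a, b):
        # Squared-exponential kernel k(a, b) = sigma_f^2 exp(-|a-b|^2 / 2l^2)
        d2 = (a[:, None] - b[None, :]) ** 2
        return sigma_f ** 2 * np.exp(-0.5 * d2 / length_scale ** 2)

    # Training covariance with observation-noise term on the diagonal.
    K = rbf(x_train, x_train) + sigma_n ** 2 * np.eye(x_train.size)
    K_s = rbf(x_test, x_train)

    # Posterior mean: k(x*, X) K^-1 y
    mean = K_s @ np.linalg.solve(K, y_train)
    # Posterior variance (diagonal): k(x*, x*) - k(x*, X) K^-1 k(X, x*)
    v = np.linalg.solve(K, K_s.T)
    var = sigma_f ** 2 - np.sum(K_s * v.T, axis=1)
    return mean, np.sqrt(np.maximum(var, 0.0))

# Synthetic stand-in: a smooth nonlinear mapping between two
# brightness-temperature quantities (in kelvin), plus observation noise.
rng = np.random.default_rng(0)
x = np.linspace(220.0, 280.0, 60)
y = 250.0 + 0.008 * (x - 250.0) ** 2 + rng.normal(0.0, 0.3, x.size)

mean, std = gpr_predict(x, y, np.array([230.0, 250.0, 270.0]))
```

A practical advantage of GPR over ELM and BP-ANN hinted at here is that the posterior standard deviation comes for free, so each predicted brightness temperature carries its own uncertainty estimate.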