• Based on the spatial relationships among signals, a graph neural network denoising method for RDTS is proposed.
• A significant advantage of this method is that it reduces the model size by segmenting the input data while improving model performance.
• The method greatly reduces the temperature measurement deviation and improves the smoothness of the temperature curve.

To address the increased temperature measurement deviation caused by random noise in Raman-based distributed temperature sensors (RDTS), a new denoising method based on a three-layer GraphSAGE graph neural network (3L-GraphSAGE) is proposed: the spatial relationships among the signals are first constructed as a graph, and effective denoised results are then obtained from the developed 3L-GraphSAGE model. First, an experimental setup is built to collect fiber signals. Then, the datasets are fed into the 3L-GraphSAGE model for training. Finally, the test datasets are input into the well-trained model to obtain the denoised signals. To evaluate the performance of 3L-GraphSAGE, three evaluation indexes are calculated: maximum deviation (MD), root mean square error (RMSE), and smoothness. The experimental results show that, compared with direct demodulation of the raw data, the method efficiently suppresses random noise and reduces the temperature measurement deviation of the RDTS, and that it significantly improves curve smoothness compared with the wavelet transform with a soft threshold function (WT-soft) and the fast wavelet transform (FWT). Therefore, the 3L-GraphSAGE model provides a viable approach to improving the performance of RDTS.
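The abstract does not give implementation details, so the following is only a minimal sketch of what a three-layer GraphSAGE denoiser for per-position fiber signals might look like, assuming PyTorch Geometric. The layer widths, activations, and the chain-graph construction connecting neighboring fiber positions are illustrative assumptions, not the authors' configuration.

```python
# Sketch of a 3-layer GraphSAGE denoiser (assumed architecture, not the
# authors' exact model). Requires torch and torch_geometric.
import torch
import torch.nn.functional as F
from torch_geometric.nn import SAGEConv


class ThreeLayerGraphSAGE(torch.nn.Module):
    """Maps noisy per-position signal features to a denoised value per node."""

    def __init__(self, in_dim: int, hidden_dim: int = 64, out_dim: int = 1):
        super().__init__()
        self.conv1 = SAGEConv(in_dim, hidden_dim)
        self.conv2 = SAGEConv(hidden_dim, hidden_dim)
        self.conv3 = SAGEConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        return self.conv3(x, edge_index)  # denoised signal per fiber position


def chain_edges(num_nodes: int) -> torch.Tensor:
    """Assumed graph construction: link each fiber position to its
    immediate spatial neighbors, in both directions."""
    src = torch.arange(num_nodes - 1)
    dst = src + 1
    return torch.stack([torch.cat([src, dst]), torch.cat([dst, src])])


# Usage on one segment of the trace (segmenting the input is how the
# highlights say the model size is kept small):
model = ThreeLayerGraphSAGE(in_dim=1)
x = torch.randn(1000, 1)             # noisy per-position features (placeholder)
out = model(x, chain_edges(1000))    # denoised signal, shape (1000, 1)
```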
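The three evaluation indexes can be sketched as follows. MD and RMSE follow their standard definitions; the smoothness measure used here (first-difference energy of the denoised curve relative to the raw curve, smaller being smoother) is an assumed common definition, since the abstract does not define it.

```python
import numpy as np


def max_deviation(measured: np.ndarray, reference: np.ndarray) -> float:
    """MD: largest absolute temperature error along the fiber."""
    return float(np.max(np.abs(measured - reference)))


def rmse(measured: np.ndarray, reference: np.ndarray) -> float:
    """Root mean square error of the demodulated temperature."""
    return float(np.sqrt(np.mean((measured - reference) ** 2)))


def smoothness(denoised: np.ndarray, raw: np.ndarray) -> float:
    """Assumed definition: ratio of first-difference energy after vs.
    before denoising; values below 1 indicate a smoother curve."""
    return float(np.linalg.norm(np.diff(denoised)) / np.linalg.norm(np.diff(raw)))
```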