Motion distortion of LiDAR point clouds is caused by the movement of the LiDAR carrier during a scan. This distortion makes the point cloud unreliable and seriously degrades tasks that rely on LiDAR, such as 3D object perception, localization, and mapping. In this work, a novel network named FlowDSnet is proposed to compensate for the distortion of LiDAR point clouds in an end-to-end way. The proposed pipeline takes two adjacent frames of raw LiDAR point clouds as input and recovers the undistorted coordinates of each point by predicting the scene flow in 3D space. Unlike traditional distortion-removal algorithms, our method does not rely on information from other sensors, such as an inertial measurement unit (IMU), but instead applies a novel flow embedding algorithm combined with spatiotemporal interpolation to perform distortion correction. In addition, both qualitative and quantitative experiments on autonomous driving datasets are conducted to demonstrate the effectiveness of the proposed pipeline. The experimental results show that the network accomplishes the LiDAR de-skewing task well, especially in scenes with many dynamic objects.
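The abstract does not detail FlowDSnet's architecture, but the correction step it describes can be sketched. As a rough illustration only (the function name, the assumption of a per-point scene-flow vector spanning the scan interval, and the normalized per-point timestamps are all assumptions, not the paper's actual interface), de-skewing by linear temporal interpolation of predicted flow might look like:

```python
import numpy as np

def deskew_with_flow(points, flow, timestamps):
    """Sketch of flow-based de-skewing (hypothetical interface).

    points:     (N, 3) distorted point coordinates
    flow:       (N, 3) predicted per-point displacement over the scan interval
    timestamps: (N,)   normalized capture time of each point in [0, 1]

    Each point is shifted back along its predicted flow in proportion to how
    late in the sweep it was captured, approximating the scene at time t = 0.
    """
    t = timestamps[:, None]          # broadcast per-point time to (N, 1)
    return points - t * flow         # linear spatiotemporal interpolation

# Toy example: two points with identical predicted flow but different
# capture times; only the later point is corrected by the full flow.
points = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
flow = np.array([[0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
timestamps = np.array([0.0, 1.0])
corrected = deskew_with_flow(points, flow, timestamps)
print(corrected)
```

In practice the per-point flow would come from the learned flow embedding between the two input frames; the linear interpolation above is the simplest way to distribute that displacement across the sweep.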