In recent years, Convolutional Neural Networks (CNNs) have achieved excellent results in single-image super-resolution. However, super-resolution algorithms based on CNNs still face serious challenges, such as poor detail reconstruction, large parameter counts, and training difficulty. This paper proposes a Residual Dense Information Distillation Network (RD-IDN), which uses dense skip connections and a residual structure to address the difficult training and low feature utilization of the Information Distillation Network (IDN). Experimental results show that the proposed method outperforms many existing super-resolution algorithms in terms of both reconstruction performance and computational consumption.
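The combination the abstract describes — dense skip connections that feed each layer the concatenated outputs of all preceding layers, wrapped in a residual connection around the block — can be sketched as follows. This is an illustrative NumPy sketch only: it uses pointwise (1x1) transforms in place of real convolutions, and the layer count and channel widths are assumptions, not the paper's actual RD-IDN configuration.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class ResidualDenseBlock:
    """Sketch of a residual dense block: each layer sees the
    concatenation of the block input and all earlier layer outputs
    (dense skip connections); a final 1x1 fusion is added back to
    the block input (residual structure)."""

    def __init__(self, channels=16, growth=8, num_layers=4, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = []
        in_ch = channels
        for _ in range(num_layers):
            # pointwise transform standing in for a 3x3 convolution
            self.weights.append(rng.standard_normal((in_ch, growth)) * 0.1)
            in_ch += growth  # dense connectivity grows the input width
        # 1x1 fusion back to the original channel count
        self.fuse = rng.standard_normal((in_ch, channels)) * 0.1

    def forward(self, x):
        # x: (pixels, channels) -- spatial dims flattened for simplicity
        feats = [x]
        for w in self.weights:
            inp = np.concatenate(feats, axis=1)  # dense skip connections
            feats.append(relu(inp @ w))
        fused = np.concatenate(feats, axis=1) @ self.fuse
        return x + fused  # residual connection around the whole block

block = ResidualDenseBlock()
out = block.forward(np.zeros((10, 16)))
print(out.shape)  # (10, 16)
```

Because every layer reuses all earlier features, the block extracts more from a fixed parameter budget, while the outer residual connection keeps gradients flowing and eases training — the two problems of IDN the abstract highlights.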