Text generation, a core task in Natural Language Processing (NLP), requires producing text that is cohesive, fluent, and semantically coherent. Advances in Deep Learning (DL) have yielded a range of architectures for this task, including Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), and Transformers. This study presents a comprehensive survey of DL methods for text generation in NLP. After a general overview of text generation and its inherent challenges, the various deep learning models and their adaptations are examined in depth. The strengths and limitations of these models are assessed, and their performance is compared with that of more traditional approaches. Finally, current trends are highlighted and open questions in the field are identified. Beyond pointing out areas ripe for further investigation, this review aims to equip both researchers and practitioners with a thorough understanding of the latest developments in DL-based text generation.