Biomedical image analysis plays a vital role in medical diagnosis and treatment. However, training deep learning models in this domain is difficult due to the scarcity of labelled medical images. Generative Adversarial Networks (GANs) have shown considerable promise for creating synthetic biomedical images to supplement training data. In this paper, we focus on the application of GANs to biomedical image augmentation. Specifically, we investigate the performance of a prominent GAN architecture, the Deep Convolutional GAN (DCGAN), a variant of the GAN designed specifically for image generation tasks. We assess the generated images for quality, diversity, and preservation of biomedical features using several evaluation metrics, and we compare a classifier trained on real data against one trained on synthetically augmented data. Our experimental findings show that DCGAN can produce realistic synthetic biomedical images. The results of this work advance the understanding of GANs for biomedical image augmentation and offer guidance on selecting DCGAN designs for medical image analysis tasks. By utilizing GAN-based augmentation strategies, researchers and practitioners can increase the diversity and quantity of training data, enhancing the performance and generalization of deep learning models in biomedical applications.

The primary objective of this study is to evaluate the effectiveness of DCGAN in generating synthetic microscopic biomedical images of cervical cancer that can augment the training data for deep learning classification models. We aim to assess the quality, diversity, and preservation of biomedical features in the generated images.

The remainder of this study is structured as follows. We begin with a brief introduction to cervical cancer, GANs in general, and DCGAN. Section 2 reviews the related literature. Section 3 presents the methodology of the study. Section 4 presents the results and discussion. Section 5 summarizes the findings and concludes with prospective future applications.
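As context for the image-quality evaluation mentioned above, the following is a minimal sketch of one widely used metric for comparing real and generated image sets, the Fréchet Inception Distance (FID). This is an illustrative implementation, not necessarily the exact metric used in this study; it assumes feature vectors have already been extracted (e.g., by a pretrained network) for both the real and the synthetic images.

```python
import numpy as np
from scipy import linalg


def fid(feats_real: np.ndarray, feats_fake: np.ndarray) -> float:
    """Frechet Inception Distance between two sets of feature vectors.

    Each input is an (n_samples, n_features) array. Lower is better;
    identical distributions give a score near zero.
    """
    # Mean and covariance of each feature distribution.
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    sigma1 = np.cov(feats_real, rowvar=False)
    sigma2 = np.cov(feats_fake, rowvar=False)

    # Matrix square root of the covariance product; small numerical
    # errors can introduce a negligible imaginary part, which we drop.
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real

    # FID = ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 * sqrt(sigma1 sigma2))
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

In practice, a low FID between DCGAN outputs and real microscopy images would indicate that the generated distribution matches the real one in both quality and diversity, which is the property the augmentation strategy relies on.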