Grid management becomes more challenging as solar penetration rates continue to increase, owing to the unpredictable nature of the solar resource. The intermittent and variable power output of renewable energy sources makes grid integration one of the biggest hurdles. Accurate solar power forecasting is therefore crucial to maintain grid stability, enable optimal unit commitment, and support cost-effective dispatch. Every year, new methods and strategies are developed worldwide to improve model accuracy and reduce forecast uncertainty. However, explaining the decision-making process of artificial intelligence (AI) is difficult because of its "black box" nature, especially in complicated models such as neural networks. This opacity raises concerns about the accuracy and dependability of AI-generated outcomes. To address this issue, which is critical for ensuring confidence, accountability, and the ethical use of AI, the development of explainable AI techniques that clarify the decision-making process behind AI-generated predictions is strongly encouraged. Accordingly, this study applies several machine learning techniques, namely decision trees (DT), random forest (RF), extra trees (ET), and extreme gradient boosting (XGB), to forecast global horizontal irradiance (GHI), and employs an explainable AI model to identify the most significant factors in the estimation and to derive a basis for the estimation outcomes.
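As a minimal sketch of such a pipeline, the example below trains the four named tree-based regressors on synthetic data and applies SHAP, one common explainable-AI technique (the specific XAI method used in this study is an assumption here), to rank the features driving the GHI forecast. The feature names and the synthetic data generator are illustrative placeholders, not the study's actual dataset or model configuration.

```python
# Sketch: tree-based GHI forecasting with a SHAP-based feature ranking.
# All feature names and data below are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor
from xgboost import XGBRegressor
import shap

rng = np.random.default_rng(0)
n = 1000
X = pd.DataFrame({
    "temperature": rng.normal(25, 5, n),
    "humidity": rng.uniform(20, 90, n),
    "cloud_cover": rng.uniform(0, 1, n),
    "solar_zenith_angle": rng.uniform(10, 80, n),
})
# Hypothetical GHI target (W/m^2) driven mainly by zenith angle and cloud cover
y = (1000 * np.cos(np.radians(X["solar_zenith_angle"]))
     * (1 - 0.75 * X["cloud_cover"]) + rng.normal(0, 20, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The four model families named in the study
models = {
    "DT": DecisionTreeRegressor(random_state=0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "ET": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "XGB": XGBRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name} R^2 on held-out data: {model.score(X_test, y_test):.3f}")

# Explain one model: mean |SHAP value| per feature gives a global ranking
# of which inputs most influence the GHI forecast.
explainer = shap.TreeExplainer(models["XGB"])
shap_values = explainer.shap_values(X_test)
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))
```

Under these assumptions, the SHAP ranking would surface the solar zenith angle and cloud cover as the dominant drivers, which is the kind of basis for the estimation outcomes that the explainable AI model is meant to provide.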