Next Generation Radio Access Network (NG-RAN) is expected to support extremely high data rates, low-latency applications, and massive machine-type communication. The growing network complexity and highly dynamic service demands drive NG-RAN to incorporate Artificial Intelligence and Machine Learning (AI/ML) algorithms, owing to their ability to handle complex network architectures and make intelligent decisions. However, AI/ML model performance degradation (i.e., drift) is prevalent in NG-RAN because of these highly dynamic service demands. This paper proposes a novel drift handling mechanism based on the average root mean square error (RMSE) over a defined window. The proposed mechanism is compared with the one-class drift detector (OCDD) and evaluated on a channel quality indicator (CQI) prediction use case. The results show that the proposed drift handling mechanism can outperform OCDD whenever model performance degrades.
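The windowed average-RMSE idea can be sketched as follows. This is a minimal illustration, not the paper's implementation; the class name, `window_size`, and `threshold` are hypothetical choices for exposition.

```python
from collections import deque
import math

class WindowedRMSEDriftDetector:
    """Sketch of a drift signal based on average RMSE over a sliding window.

    Drift is flagged when the RMSE computed over the most recent
    `window_size` prediction errors exceeds `threshold`. Both parameters
    are illustrative assumptions, not values from the paper.
    """

    def __init__(self, window_size=50, threshold=1.5):
        self.window = deque(maxlen=window_size)  # recent squared errors
        self.threshold = threshold

    def update(self, y_true, y_pred):
        # Record the squared error of the latest prediction
        self.window.append((y_true - y_pred) ** 2)
        # RMSE over the current window of errors
        rmse = math.sqrt(sum(self.window) / len(self.window))
        # True signals that drift handling (e.g., retraining) should trigger
        return rmse > self.threshold

# Usage: feed (true, predicted) pairs as they arrive
detector = WindowedRMSEDriftDetector(window_size=5, threshold=1.0)
for y in range(5):
    detector.update(y, y + 0.1)   # small errors: no drift flagged
drifted = any(detector.update(y, y + 3.0) for y in range(5))  # large errors
```

In this sketch, a sustained rise in windowed RMSE (rather than a single bad prediction) triggers the drift signal, which is what makes a window-averaged metric less noisy than per-sample error checks.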