In the modern era, the classification of Imbalanced Datasets (IDs) has received sustained interest, in both practical and theoretical respects. One widely used Oversampling (OS) approach for handling Imbalanced Classification (IC) is the Synthetic Minority Over-sampling Technique (SMOTE). Over the past decade, researchers have proposed various SMOTE-based approaches, such as Borderline SMOTE (B-SMOTE), GSMOTE, and SVM SMOTE, among others. In this paper, we propose Class-Balancing by SMOTE and Filtering combined with XGBoost (BSF-XGBoost) to address highly imbalanced datasets. The approach simultaneously tackles feature selection, through different filtering mechanisms, and the class imbalance problem. XGBoost offers several hyperparameters that place greater emphasis on Minority Class (Minc) misclassification during the training stage. To evaluate the performance of the BSF-XGBoost algorithm, we conducted experiments on 10 highly imbalanced binary-class datasets, applying existing classification models and comparing the results with various OS approaches. The findings indicate that the proposed algorithm performs slightly better than existing classification approaches and SMOTE-based variants.
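To make the two ingredients concrete, the sketch below shows the core SMOTE interpolation step and the kind of class-weighting hyperparameter XGBoost exposes for minority-class emphasis (`scale_pos_weight`). This is a minimal illustration in NumPy under assumed toy data; the `smote` function and the dataset are hypothetical and not the authors' BSF-XGBoost implementation.

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: create n_new synthetic minority samples by
    interpolating each seed toward one of its k nearest minority neighbours."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-neighbours
    nn = np.argsort(d, axis=1)[:, :min(k, n - 1)]
    seeds = rng.integers(0, n, size=n_new)
    neigh = nn[seeds, rng.integers(0, nn.shape[1], size=n_new)]
    gap = rng.random((n_new, 1))         # interpolation factor in [0, 1)
    return X_min[seeds] + gap * (X_min[neigh] - X_min[seeds])

# toy imbalanced data (illustrative): 20 majority vs 5 minority samples
rng = np.random.default_rng(0)
X_maj = rng.normal(0.0, 1.0, size=(20, 3))
X_min = rng.normal(3.0, 1.0, size=(5, 3))

# oversample the minority class up to the majority size
X_syn = smote(X_min, n_new=len(X_maj) - len(X_min), k=3, rng=1)
X_bal_min = np.vstack([X_min, X_syn])
print(X_bal_min.shape)  # (20, 3): minority now matches majority size

# XGBoost-style minority emphasis: scale_pos_weight = n_negative / n_positive
scale_pos_weight = len(X_maj) / len(X_min)
```

Because each synthetic point is a convex combination of two minority samples, it stays inside the minority class's per-feature range; a filtering mechanism (e.g. a simple variance or correlation filter) would be applied to the features before this balancing step.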