Classification on Unbalanced Datasets using Boosting Techniques (AdaBoost.M2, SMOTEBoost, RUSBoost, ...)
Below are the detailed results:
Average Classifier Precision for AdaBoost: 0.77
Average Classifier Precision for RUSBoost: 0.82
Average Classifier Precision for SMOTEBoost: 0.66
Average Classifier Precision for RandomBalanceBoost: 0.6
Average Classifier Precision for RandomForest: 0.95
Average Classifier Precision for SVM: 1.0
-
Best-performing method based on average classifier precision: "SVM"
-
Best-performing ensemble classifier: "Random Forest"; runner-up (second best): RUSBoost
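
A minimal sketch of how such a comparison could be reproduced is shown below. It assumes scikit-learn and imbalanced-learn are installed; the synthetic dataset, the macro-averaged precision scorer, and the 5-fold cross-validation setup are assumptions (the original metric and data are not specified here), SMOTEBoost and RandomBalanceBoost are omitted because they are not shipped with these libraries, and sklearn's AdaBoostClassifier stands in for AdaBoost.M2.

    # Sketch only: placeholder data and metric choices, not the original experiment.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC
    from imblearn.ensemble import RUSBoostClassifier

    # Placeholder imbalanced dataset (90% / 10% classes); replace with the project's data.
    X, y = make_classification(n_samples=2000, n_features=20,
                               weights=[0.9, 0.1], random_state=0)

    classifiers = {
        "AdaBoost": AdaBoostClassifier(random_state=0),      # stand-in for AdaBoost.M2
        "RUSBoost": RUSBoostClassifier(random_state=0),
        "RandomForest": RandomForestClassifier(random_state=0),
        "SVM": SVC(random_state=0),
    }

    # Macro-averaged precision over 5-fold cross-validation, so the minority
    # class counts as much as the majority class in the reported score.
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=5, scoring="precision_macro")
        print(f"Average Classifier Precision for {name}: {scores.mean():.2f}")
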
Taha Samavati - Analysis of final results