IEEE Access (Jan 2019)
SSEM: A Novel Self-Adaptive Stacking Ensemble Model for Classification
Abstract
In the past decades, ensemble systems have been shown to be an effective way to increase the accuracy and stability of classification algorithms. However, how to construct an effective combination of multiple base-classifiers remains an open question. In this paper, a new self-adaptive stacking ensemble model (called SSEM), based on the genetic algorithm, is proposed. Unlike other ensemble learning classification algorithms, SSEM selectively integrates different base-classifiers, using the genetic algorithm to automatically select the optimal base-classifier combination together with the hyper-parameters of each base-classifier. Note that any machine learning method can serve as a component of SSEM. In this work, guided by two base-classifier selection principles (low complexity of each base-classifier and high diversity among base-classifiers), we select five state-of-the-art classifiers, namely Naïve Bayes (NB), Extremely Randomized Trees (ERT), Logistic, Random Forest (RF), and Classification and Regression Tree (CART), as the base-classifiers of SSEM. To demonstrate the efficiency of SSEM, we apply it to nine different datasets. Compared with 11 state-of-the-art classifiers (NB, ERT, Logistic, RF, CART, Back Propagation Network (BPN), Support Vector Machine (SVM), AdaBoost, Bagging, Convolutional Neural Network (CNN), and Deep Neural Network (DNN)), SSEM consistently performs best on five evaluation metrics (Accuracy, Recall, AUC, F1-score, and Matthews correlation coefficient (MCC)). Moreover, the significance test shows that SSEM achieves highly competitive performance against the other 11 classifiers. Altogether, it is evident that SSEM can serve as a useful framework for classification.
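The genetic search described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the chromosome is a binary mask over the five candidate base-classifiers, and the fitness function below is a toy stand-in for what SSEM actually optimizes (the cross-validated performance of the stacked ensemble, jointly with each base-classifier's hyper-parameters).

```python
import random

# Candidate base-classifiers, as named in the paper. In SSEM the chromosome
# would also encode each classifier's hyper-parameters.
CLASSIFIERS = ["NB", "ERT", "Logistic", "RF", "CART"]

def fitness(mask):
    # Toy stand-in objective: in SSEM this would be the cross-validated
    # score of the stacking ensemble built from the selected classifiers.
    # Here we reward diversity (more members) but penalize complexity.
    if not any(mask):
        return 0.0
    k = sum(mask)
    return (k / len(mask)) * (1.0 - 0.15 * k)

def evolve(pop_size=20, generations=30, p_mut=0.1, seed=0):
    rng = random.Random(seed)
    n = len(CLASSIFIERS)
    # Random initial population of binary masks.
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)             # one-point crossover
            child = a[:cut] + b[cut:]
            # Bit-flip mutation with probability p_mut per gene.
            child = [g ^ (rng.random() < p_mut) for g in child]
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return [name for name, gene in zip(CLASSIFIERS, best) if gene]

print(evolve())
```

Under the real objective, each fitness evaluation would train the selected base-classifiers, stack their out-of-fold predictions into a meta-learner, and return the cross-validated score; the GA loop itself is unchanged.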
Keywords