Jisuanji kexue (Nov 2022)

Universal Multi-class Ensemble Method with Self-adaptive Weights

  • WEI Jun-sheng, LIU Yan, CHEN Jing, DUAN Shun-ran

DOI
https://doi.org/10.11896/jsjkx.210900054
Journal volume & issue
Vol. 49, no. 11
pp. 212 – 220

Abstract


Ensemble learning has long been a strategy for building powerful and stable predictive models: by fusing multiple models, it can improve both the accuracy and the stability of the results. However, existing ensemble methods still have shortcomings in how they compute weights; when faced with a variety of classification problems, they cannot select ensemble weights adaptively and are therefore not universal. To address these problems, a universal multi-class ensemble method with self-adaptive weights (UMEAW) is proposed. Unlike the usual ensemble classification methods that target only one kind of classification task, when facing a different classification problem UMEAW first calculates a weight allocation coefficient according to the number of classes, then automatically computes the weights of the base classifiers from the model evaluation index and the weight allocation coefficient, exploiting the distribution characteristics of the exponential function. Finally, the weights are adjusted adaptively through continuous iteration to realize model ensembling under different classification tasks. Experimental results show that UMEAW can achieve model ensembling on nine datasets with different numbers of classes, different domains, and different scales, and that UMEAW outperforms the baselines on most tasks. Compared with a single model, the F1 value increases by 3%~25% after UMEAW fusion; compared with other ensemble methods, the F1 value improves by 1%~2%. This demonstrates that UMEAW is both universal and effective.
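The abstract outlines the mechanism only at a high level, so the following is a minimal sketch of how such a weighting scheme could look, not the paper's actual formulas. It assumes the allocation coefficient grows with the number of classes and that base-classifier weights are obtained by passing evaluation scores (e.g., validation F1) through an exponential mapping and normalizing; the iterative adjustment step described in the paper is omitted. All function names and the exact forms of `alpha` and the exponential mapping are hypothetical.

```python
import numpy as np

def umeaw_style_weights(metrics, n_classes):
    """Hypothetical UMEAW-style weighting (illustrative, not the paper's exact formula).

    metrics   : per-classifier evaluation scores, e.g. validation F1 values
    n_classes : number of classes in the task, used to set the allocation coefficient
    """
    metrics = np.asarray(metrics, dtype=float)
    # Assumed form: allocation coefficient increases with the number of classes.
    alpha = np.log(n_classes) + 1.0
    # Exponential mapping from evaluation scores to unnormalized weights,
    # so better-scoring classifiers receive disproportionately larger weights.
    weights = np.exp(alpha * metrics)
    return weights / weights.sum()

def ensemble_predict(probas, weights):
    """Weighted soft voting over base-classifier class-probability outputs."""
    # probas: list of (n_samples, n_classes) arrays, one per base classifier
    stacked = np.stack(probas, axis=0)              # (n_models, n_samples, n_classes)
    fused = np.tensordot(weights, stacked, axes=1)  # (n_samples, n_classes)
    return fused.argmax(axis=1)

# Example: three base classifiers with validation F1 scores on a 4-class task.
w = umeaw_style_weights([0.72, 0.80, 0.65], n_classes=4)
print(w)  # higher-F1 classifiers get larger fusion weights
```

The exponential mapping is what lets a single formula cover tasks with different numbers of classes: scaling the exponent by a class-dependent coefficient changes how sharply the weights concentrate on the best base classifiers.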

Keywords