IEEE Access (Jan 2020)
Building Deep Random Ferns Without Backpropagation
Abstract
Randomized sampling-based ensemble learning is emerging as an alternative to deep neural networks (DNNs) because it supports diversity and locality and does not require backpropagation during training. By connecting randomized weak classifiers layer by layer, it is possible to create models whose performance is similar to that of DNNs but with less overfitting and better generalizability. In this paper, we therefore propose a deep random ferns (d-RFs) model, in which extremely randomized ferns are stacked in multiple layers, yielding high classification performance in a lightweight, fast structure. The input vector is first encoded as a transformed feature vector in the feature encoder layer and then fed to the cascade layers. The feature encoding process is analogous to convolution in a DNN and helps improve the performance of the final output layer. Unlike in the backpropagation paradigm, the cascade layers adaptively adjust the number of ferns and layers required for the d-RFs using only a small amount of data. The RF ensemble approach has considerably fewer hyperparameters than a DNN or a deep forest model, and its complexity can be determined automatically in a data-dependent manner. In addition, experimental results show that an ensemble of multiple weak classifiers reduces the bias between models by averaging weakly correlated classifiers, resulting in a better overall model. The proposed lightweight d-RFs model was successfully applied to benchmark datasets, achieving similar or better accuracy with fewer parameters and operations than state-of-the-art methods.
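The following is a minimal sketch of the fern-ensemble idea summarized above, written in Python with NumPy only. All names (RandomFern, fit_cascade, depth, n_ferns) and the validation-based stopping rule are illustrative assumptions, not the authors' implementation; the sketch only shows how binary-test ferns can be stacked into a cascade whose depth is grown adaptively without backpropagation.

```python
# Minimal sketch, assuming NumPy only. Names and the stopping rule are
# illustrative, not the paper's implementation.
import numpy as np

class RandomFern:
    """One fern: `depth` random (feature, threshold) binary tests whose
    outcomes index a table of 2**depth per-leaf class distributions."""
    def __init__(self, depth, n_classes, rng):
        self.depth, self.n_classes, self.rng = depth, n_classes, rng

    def fit(self, X, y):
        # Draw random feature indices and thresholds (no backpropagation).
        self.feat = self.rng.integers(0, X.shape[1], self.depth)
        lo = X[:, self.feat].min(axis=0)
        hi = X[:, self.feat].max(axis=0)
        self.thresh = self.rng.uniform(lo, hi)
        # Laplace-smoothed class histogram per leaf.
        self.table = np.ones((2 ** self.depth, self.n_classes))
        np.add.at(self.table, (self._leaf(X), y), 1.0)
        self.table /= self.table.sum(axis=1, keepdims=True)
        return self

    def _leaf(self, X):
        bits = (X[:, self.feat] > self.thresh).astype(int)
        return bits @ (1 << np.arange(self.depth))  # binary code -> leaf id

    def predict_proba(self, X):
        return self.table[self._leaf(X)]

def fit_cascade(X, y, X_val, y_val, n_ferns=20, depth=8, max_layers=10, seed=0):
    """Grow cascade layers while validation accuracy improves -- a simple
    stand-in for the paper's adaptive control of ferns and layers."""
    rng = np.random.default_rng(seed)
    n_classes = int(y.max()) + 1
    layers, best_acc, aug, aug_val = [], 0.0, X, X_val
    for _ in range(max_layers):
        ferns = [RandomFern(depth, n_classes, rng).fit(aug, y)
                 for _ in range(n_ferns)]
        proba_val = np.mean([f.predict_proba(aug_val) for f in ferns], axis=0)
        acc = (proba_val.argmax(axis=1) == y_val).mean()
        if acc <= best_acc:  # stop when the cascade stops improving
            break
        best_acc = acc
        layers.append(ferns)
        proba_tr = np.mean([f.predict_proba(aug) for f in ferns], axis=0)
        # Concatenate class probabilities with the raw input before the
        # next layer, mirroring the feature augmentation of deep forests.
        aug = np.hstack([X, proba_tr])
        aug_val = np.hstack([X_val, proba_val])
    return layers, best_acc
```

In this sketch, each layer's averaged class-probability vector is concatenated with the original input before being passed to the next layer, so the cascade depth is determined by the data rather than fixed in advance.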
Keywords