IEEE Access (Jan 2020)

Extreme Learning Machine Under Minimum Information Divergence Criterion

  • Chengtian Song,
  • Lizhi Pan,
  • Qiang Liu,
  • Zhihong Jiang,
  • Jianguang Jia

DOI
https://doi.org/10.1109/ACCESS.2020.3007522
Journal volume & issue
Vol. 8
pp. 122026–122035

Abstract


In recent years, the extreme learning machine (ELM) and its improved variants have been successfully applied to a wide range of classification and regression tasks. In these algorithms, the mean square error (MSE) criterion is commonly used to control the training error. However, the MSE criterion is poorly suited to handling outliers, which can occur in general regression and classification tasks. In this paper, a novel extreme learning machine under the minimum information divergence criterion (ELM-MinID) is proposed to handle training sets contaminated by noise. In the minimum information divergence criterion, the Gaussian kernel function and Euclidean information divergence are used in place of the MSE criterion to enhance the anti-noise ability of ELM. Experimental results on two synthetic datasets and eleven benchmark datasets show that the proposed method is superior to traditional ELMs.
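For orientation, the sketch below shows a standard ELM in Python (random hidden layer, output weights solved in closed form under the MSE criterion), plus an illustrative robust refit that down-weights large residuals with a Gaussian kernel. The robust refit is only a stand-in for the idea of replacing MSE with a divergence-based, outlier-resistant criterion; it is not the paper's ELM-MinID algorithm, and the names (elm_fit, elm_fit_robust) and the bandwidth parameter sigma are hypothetical.

```python
import numpy as np

def elm_fit(X, t, n_hidden=50, seed=0):
    """Basic ELM (MSE criterion): random hidden layer, least-squares output weights.
    X: (n_samples, n_features), t: (n_samples,) single-output targets."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (fixed, not trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ t                  # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def elm_fit_robust(X, t, n_hidden=50, sigma=1.0, n_iter=10, seed=0):
    """Illustrative robust variant (not the paper's method): iteratively reweighted
    least squares where sample weights come from a Gaussian kernel on the residual,
    so outliers with large errors contribute little to the fit."""
    W, b, beta = elm_fit(X, t, n_hidden, seed)
    H = np.tanh(X @ W + b)
    for _ in range(n_iter):
        e = t - H @ beta                          # residuals under current output weights
        w = np.exp(-e**2 / (2.0 * sigma**2))      # Gaussian weights: large |e| -> small weight
        Hw = H * w[:, None]
        # weighted least squares: beta = (H^T diag(w) H)^{-1} H^T diag(w) t
        beta = np.linalg.pinv(Hw.T @ H) @ (Hw.T @ t)
    return W, b, beta

if __name__ == "__main__":
    # toy regression with a few injected outliers
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(200, 1))
    t = np.sinc(X[:, 0]) + 0.05 * rng.normal(size=200)
    t[:10] += 3.0                                 # outliers
    W, b, beta = elm_fit_robust(X, t, n_hidden=40, sigma=0.5)
    print(elm_predict(X[:5], W, b, beta))
```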
