IEEE Access (Jan 2019)

MPCE: A Maximum Probability Based Cross Entropy Loss Function for Neural Network Classification

  • Yangfan Zhou,
  • Xin Wang,
  • Mingchuan Zhang,
  • Junlong Zhu,
  • Ruijuan Zheng,
  • Qingtao Wu

DOI
https://doi.org/10.1109/ACCESS.2019.2946264
Journal volume & issue
Vol. 7
pp. 146331 – 146341

Abstract


In recent years, multi-classifier learning has attracted significant interest in industrial and economic fields, and neural networks are a popular approach to it. However, the accuracy of a neural network is often limited by its loss function. For this reason, we design a novel cross entropy loss function, named MPCE, which is based on the maximum probability in the predictive results. In this paper, we first analyze the difference between the gradients of MPCE and the standard cross entropy loss function. Then, we propose a gradient update algorithm based on MPCE. In the experimental part of this paper, we conduct four groups of experiments to verify the performance of the proposed algorithm on six public datasets. The results of the first group show that the proposed algorithm converges faster than algorithms based on other loss functions. The results of the second group show that the proposed algorithm obtains the highest training and test accuracy on the six datasets, and that it performs better than the others as the number of classes changes on the sensor dataset. Furthermore, in the fourth group of experiments, we implement the compared methods with a convolutional neural network model on the MNIST dataset. The results show that the proposed algorithm achieves the highest accuracy among all evaluated methods.
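The abstract does not state the exact form of MPCE, so the sketch below is only illustrative: it contrasts the standard multi-class cross entropy with a hypothetical variant that reweights each sample's log-loss by the maximum predicted probability. The function names (cross_entropy, mpce_like_loss) and the reweighting scheme are assumptions for illustration, not the formula from the paper.

# Illustrative sketch only: the exact MPCE definition is not given in this abstract.
# The "mpce_like_loss" below ASSUMES a reweighting of the per-sample cross entropy
# by the maximum predicted probability; the paper's actual formula may differ.
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels, eps=1e-12):
    # Standard multi-class cross entropy, averaged over the batch.
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels] + eps).mean()

def mpce_like_loss(probs, labels, eps=1e-12):
    # Hypothetical max-probability-weighted cross entropy (assumption):
    # each sample's log-loss is scaled by its maximum predicted probability.
    n = probs.shape[0]
    per_sample = -np.log(probs[np.arange(n), labels] + eps)
    max_prob = probs.max(axis=1)
    return (max_prob * per_sample).mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(4, 3))   # 4 samples, 3 classes
    labels = np.array([0, 2, 1, 1])
    p = softmax(logits)
    print("cross entropy :", cross_entropy(p, labels))
    print("mpce-like loss:", mpce_like_loss(p, labels))

The sketch is meant only to make the comparison in the abstract concrete; the paper itself should be consulted for the actual MPCE loss and its gradient update algorithm.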

Keywords