International Journal of Computational Intelligence Systems (Nov 2018)

AEkNN: An AutoEncoder kNN-Based Classifier With Built-in Dimensionality Reduction

  • Francisco J. Pulgar,
  • Francisco Charte,
  • Antonio J. Rivera,
  • María J. del Jesus

DOI: https://doi.org/10.2991/ijcis.2018.125905686
Journal volume & issue: Vol. 12, No. 1

Abstract

High dimensionality tends to be a challenge for most machine learning tasks, including classification. Among the different classification methodologies, instance-based learning is one of the best known, and a prominent member of this family is the k-nearest neighbors (kNN) algorithm. Its strategy relies on searching for a set of nearest instances. In high-dimensional spaces, however, the distances between examples lose significance; therefore kNN, like many other classifiers, tends to worsen its performance as the number of input variables grows. In this study, AEkNN, a new kNN-based algorithm with built-in dimensionality reduction, is presented. Aiming to obtain a new representation of the data with lower dimensionality but more informative features, AEkNN internally uses autoencoders. From this new feature vector the computed distances should be more significant, thus providing a way to choose better neighbors. An experimental evaluation of the new proposal is conducted, analyzing several configurations and comparing them against the original kNN algorithm and classical dimensionality reduction methods. The obtained conclusions demonstrate that AEkNN offers better results in predictive and runtime performance.
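The two-stage idea described in the abstract can be sketched in a minimal way: fit an autoencoder to compress the inputs, then run kNN on the learned codes rather than the raw features. The sketch below is a hypothetical illustration of that pipeline, not the authors' implementation; the paper's actual architecture, training procedure, and hyperparameters may differ (here a tiny linear autoencoder is trained by plain stochastic gradient descent, purely for demonstration).

```python
import random

random.seed(0)

def encode(W, x):
    """Project x into the code space with encoder weights W (code_dim x n_features)."""
    return [sum(wj[i] * x[i] for i in range(len(x))) for wj in W]

def train_autoencoder(X, code_dim, epochs=200, lr=0.01):
    """Fit encoder W and decoder V by SGD on the squared reconstruction error
    ||V(Wx) - x||^2. A deliberately minimal, linear stand-in for a real
    autoencoder, used only to illustrate the AEkNN pipeline."""
    n = len(X[0])
    W = [[random.uniform(-0.1, 0.1) for _ in range(n)] for _ in range(code_dim)]
    V = [[random.uniform(-0.1, 0.1) for _ in range(code_dim)] for _ in range(n)]
    for _ in range(epochs):
        for x in X:
            c = encode(W, x)
            recon = [sum(V[i][j] * c[j] for j in range(code_dim)) for i in range(n)]
            err = [recon[i] - x[i] for i in range(n)]
            # Encoder gradient uses the pre-update decoder weights.
            g = [sum(err[i] * V[i][j] for i in range(n)) for j in range(code_dim)]
            for i in range(n):
                for j in range(code_dim):
                    V[i][j] -= lr * 2 * err[i] * c[j]
            for j in range(code_dim):
                for i in range(n):
                    W[j][i] -= lr * 2 * g[j] * x[i]
    return W

def knn_predict(codes, labels, query, k=3):
    """Classic kNN majority vote, using squared Euclidean distance in code space."""
    order = sorted(range(len(codes)),
                   key=lambda i: sum((codes[i][d] - query[d]) ** 2
                                     for d in range(len(query))))
    votes = {}
    for i in order[:k]:
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)

# Two noisy synthetic clusters in a 6-dimensional space, compressed to 2 dimensions.
X = [[c + random.gauss(0, 0.2) for _ in range(6)]
     for c in (1.0, -1.0) for _ in range(10)]
y = [0] * 10 + [1] * 10

W = train_autoencoder(X, code_dim=2)
codes = [encode(W, x) for x in X]           # low-dimensional representation
pred = knn_predict(codes, y, encode(W, [1.0] * 6))
print(pred)
```

The key design point the abstract emphasizes is that the neighbor search operates entirely in the compressed code space, where distances are expected to be more meaningful than in the original high-dimensional input space.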

Keywords