IEEE Access (Jan 2024)

On the Attractive and Repulsive Forces of Generalized Stochastic Neighbor Embedding With Alpha-Divergence

  • Hsin-Yi Lin,
  • Huan-Hsin Tseng,
  • Jen-Tzung Chien

DOI: https://doi.org/10.1109/ACCESS.2024.3420425
Journal volume & issue: Vol. 12, pp. 90380–90394

Abstract

Stochastic neighbor embedding (SNE) performs a nonlinear transformation from a high-dimensional observation space to a low-dimensional latent space that preserves neighbor affinities. Data pairs in the latent space tend to become crowded as a result of the dimensionality reduction. Certain design characteristics of the SNE setting are favorable for mitigating this crowding problem. This study presents a fundamental analysis of SNE that not only generalizes previous SNE variants but also provides a systematic way to understand their intrinsic properties. From this theoretical connection, we conceive a new generalized SNE (g-SNE) by introducing a regularized power-law distribution with the $\alpha$-divergence for manifold learning. The proposed method generalizes previous formulations and incorporates various features favorable to the clustering process. It also provides high flexibility, admitting tailored realizations that properly reflect the similarity between original and dimension-reduced samples. Experiments are performed to analyze the proposed method, and its effectiveness is demonstrated with several learning tasks.
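To make the abstract's setup concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of an SNE-style embedding in which the usual KL divergence is replaced by the $\alpha$-divergence between high-dimensional affinities P and low-dimensional affinities Q, with a heavy-tailed power-law kernel in the latent space. The Student-t kernel, the Gaussian bandwidth sigma, and the choice alpha = 0.5 are assumptions for illustration; the paper's regularized power-law distribution is more general.

```python
# Toy alpha-divergence SNE sketch: embed high-dimensional points into 2-D by
# minimizing D_alpha(P || Q) over the latent coordinates.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

def high_dim_affinities(X, sigma=1.0):
    """Symmetric, globally normalized Gaussian affinities P (assumed kernel)."""
    d2 = squareform(pdist(X, "sqeuclidean"))
    P = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(P, 0.0)
    return P / P.sum()

def low_dim_affinities(Y):
    """Heavy-tailed power-law affinities Q (Student-t kernel, as in t-SNE)."""
    d2 = squareform(pdist(Y, "sqeuclidean"))
    W = 1.0 / (1.0 + d2)
    np.fill_diagonal(W, 0.0)
    return W / W.sum()

def alpha_divergence(P, Q, alpha=0.5, eps=1e-12):
    """Amari alpha-divergence for normalized affinities (alpha not in {0, 1})."""
    return (1.0 - np.sum(P ** alpha * (Q + eps) ** (1.0 - alpha))) / (alpha * (1.0 - alpha))

def embed(X, dim=2, alpha=0.5, seed=0):
    """Optimize latent coordinates Y to minimize D_alpha(P || Q(Y))."""
    P = high_dim_affinities(X)
    rng = np.random.default_rng(seed)
    y0 = 1e-2 * rng.standard_normal(X.shape[0] * dim)
    loss = lambda y: alpha_divergence(P, low_dim_affinities(y.reshape(-1, dim)), alpha)
    # Finite-difference gradients via L-BFGS-B: adequate for a toy-sized problem.
    res = minimize(loss, y0, method="L-BFGS-B")
    return res.x.reshape(-1, dim)

if __name__ == "__main__":
    # Two well-separated Gaussian blobs in 10-D, embedded into 2-D.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (20, 10)), rng.normal(5, 1, (20, 10))])
    Y = embed(X, alpha=0.5)
    print(Y.shape)  # (40, 2)
```

In this sketch, alpha interpolates between divergences that penalize different mismatches between P and Q, which is one way to read the paper's claim that the generalized formulation offers flexibility in how attractive and repulsive forces balance during embedding.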

Keywords