Algorithms (Jan 2022)

<i>k</i>-Center Clustering with Outliers in Sliding Windows

  • Paolo Pellizzoni,
  • Andrea Pietracaprina,
  • Geppino Pucci

DOI
https://doi.org/10.3390/a15020052
Journal volume & issue
Vol. 15, no. 2
p. 52

Abstract

Metric k-center clustering is a fundamental unsupervised learning primitive. Although widely used, this primitive is heavily affected by noise in the data, so a more sensible variant seeks the best solution that disregards a given number z of points of the dataset, which are called outliers. We provide efficient algorithms for this important variant in the streaming model under the sliding window setting, where, at each time step, the dataset to be clustered is the window W of the most recent data items. For general metric spaces, our algorithms achieve O(1) approximation and, remarkably, require working memory linear in k+z and only logarithmic in |W|. For spaces of bounded doubling dimension, the approximation can be made arbitrarily close to 3. For these latter spaces, we show, as a by-product, how to estimate the effective diameter of the window W, which is a measure of the spread of the window points disregarding a given fraction of noisy distances. We also provide experimental evidence of the practical viability of the improved clustering and diameter estimation algorithms.
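To make the objective concrete, the following minimal sketch (not the paper's sliding-window algorithm) illustrates plain k-center on 1-D points via the classic Gonzalez farthest-first heuristic, together with the outlier-aware radius: the clustering cost when the z farthest points are disregarded. All function names are illustrative, and the 1-D absolute-difference metric is chosen only for simplicity.

```python
def farthest_first(points, k):
    """Gonzalez farthest-first traversal: a classic 2-approximation
    for plain k-center (without outliers)."""
    centers = [points[0]]
    while len(centers) < k:
        # Pick the point farthest from its nearest current center.
        centers.append(max(points, key=lambda p: min(abs(p - c) for c in centers)))
    return centers

def radius_with_outliers(points, centers, z):
    """Clustering radius when the z points farthest from their
    nearest center are disregarded as outliers."""
    dists = sorted(min(abs(p - c) for c in centers) for p in points)
    return dists[-(z + 1)] if z < len(dists) else 0.0
```

For example, on the dataset [0, 1, 2, 100] with k = 1, the single center covers everything within radius 100, but disregarding z = 1 outlier shrinks the radius to 2, which is the effect the outlier-aware variant formalizes.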

Keywords