IEEE Access (Jan 2024)

ASPDD: An Adaptive Knowledge Distillation Framework for TSP Generalization Problems

  • Sisi Zheng,
  • Rongye Ye

DOI: https://doi.org/10.1109/ACCESS.2024.3387851
Journal volume & issue: Vol. 12, pp. 52902–52910

Abstract

The traveling salesman problem (TSP) is a classic non-deterministic polynomial-time hard (NP-hard) problem. Currently, almost all research that applies Transformers to the TSP relies on supervised learning. However, obtaining accurate solution labels for large-scale instances is extremely challenging, which leaves these models with a severe lack of scale generalization capability. Recent work combines knowledge distillation with the Transformer to effectively address the distribution generalization issue; nonetheless, when that framework is applied directly to the problem of scale generalization, the resulting solutions are unsatisfactory. To address these issues, we propose an adaptive soft probability distributed distillation (ASPDD) framework that improves the Transformer's scale generalization capability. The ASPDD framework uses a soft probability distributed distillation (SPDD) method to improve the knowledge interaction between the student and teacher models. In particular, ASPDD introduces an adaptive selection strategy so that the student model can identify its weak points and target them for improvement in each round of training. We use this framework to train a model on small-scale instances (TSP20) and deploy it on large-scale instances. Extensive benchmarking (10,000 instances) demonstrates that our ASPDD framework achieves competitive results compared with other Transformer baseline models and knowledge distillation frameworks. In addition, we apply the ASPDD framework to eleven publicly accessible benchmark datasets (TSPLIB); on six of these datasets, the experimental results demonstrate that ASPDD outperforms previous knowledge distillation models.
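As a rough illustration of the two ingredients the abstract describes, the sketch below pairs a generic temperature-scaled soft-probability distillation loss (KL divergence between the teacher's and student's next-city distributions) with a simple hardest-instance selection step. The function names, the temperature-scaled KL form, and the top-k selection heuristic are illustrative assumptions, not the paper's exact SPDD or adaptive-selection formulation.

```python
import torch
import torch.nn.functional as F

def spdd_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-probability distillation sketch: KL divergence between the
    teacher's and student's softened next-city distributions (a standard
    temperature-scaled KD loss; the paper's SPDD details may differ)."""
    t_probs = F.softmax(teacher_logits / temperature, dim=-1)
    s_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(s_log_probs, t_probs, reduction="batchmean") * temperature ** 2

def adaptive_select(per_instance_losses, keep_ratio=0.5):
    """Adaptive selection sketch: keep the instances on which the student
    is weakest (highest distillation loss) for further training."""
    k = max(1, int(keep_ratio * per_instance_losses.numel()))
    return torch.topk(per_instance_losses, k).indices

# Toy usage: a batch of 8 decoding states, 20 candidate next cities each.
teacher_logits = torch.randn(8, 20)
student_logits = torch.randn(8, 20)
batch_loss = spdd_loss(student_logits, teacher_logits)

# Per-instance losses drive the adaptive selection of "weak points".
per_instance = F.kl_div(
    F.log_softmax(student_logits / 2.0, dim=-1),
    F.softmax(teacher_logits / 2.0, dim=-1),
    reduction="none",
).sum(dim=-1)
hard_idx = adaptive_select(per_instance, keep_ratio=0.5)
```

Under these assumptions, the selected indices would form the subset of instances replayed in the next training round, which is one plausible way to realize the "find weak points for improvement" behavior the abstract attributes to the adaptive selection strategy.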

Keywords