Jisuanji kexue yu tansuo (Mar 2025)

GCN-Pointransformer Model for Solving Traveling Salesman Problem

  • QIU Yunfei, LIU Yifei, YU Zhilong, JIN Haibo

DOI
https://doi.org/10.3778/j.issn.1673-9418.2404030
Journal volume & issue
Vol. 19, no. 3
pp. 657–666

Abstract

Because the Transformer model relies on a fully connected attention mechanism, it suffers from high computational complexity and excessive GPU memory usage when solving the classic traveling salesman problem (TSP). To address this problem, a GCN-Pointransformer model based on a graph convolutional embedding layer and a multi-head local self-attention mechanism is proposed. Firstly, a graph convolutional embedding method is used to learn spatial features from the input data; the graph convolutional embedding layer contains multiple convolution kernels that extract local features of the input. Secondly, multi-head local self-attention (MHLSA) is used to remove redundant information and extract useful features. In addition, a reversible residual network is used in the encoder so that only input and output embedding feature pairs need to be stored during backpropagation. Finally, a Pointer layer is added in the decoder, which uses attention weights as a probability distribution to determine the next node to be visited. Comparative experiments on random TSP datasets show that, relative to the standard Transformer model, the optimization gap is reduced by 12%, GPU memory usage is reduced by about 11%, and inference time is reduced by about 25%, demonstrating that the proposed method is superior to the standard Transformer model for solving TSP.
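To illustrate the decoding step described above, the following is a minimal PyTorch sketch of how a pointer-style layer can use attention weights directly as a probability distribution over unvisited cities. The function name `pointer_step`, the scaled dot-product compatibility scores, and the tanh clipping are assumptions for the sake of a runnable example; the abstract does not specify these details of the authors' Pointer layer.

```python
import torch
import torch.nn.functional as F

def pointer_step(query, node_emb, visited_mask, clip=10.0):
    """One decoding step of a pointer-style layer (hypothetical sketch).

    query:        (batch, d)      current decoder context embedding
    node_emb:     (batch, n, d)   encoder embeddings of the n cities
    visited_mask: (batch, n)      True where a city has already been visited
    Returns a probability distribution over the next city to visit.
    """
    d = query.size(-1)
    # Scaled dot-product compatibility between the context and each city.
    scores = torch.einsum("bd,bnd->bn", query, node_emb) / d ** 0.5
    # Tanh clipping (common in pointer-style TSP decoders) bounds the logits.
    scores = clip * torch.tanh(scores)
    # Already-visited cities must never be selected again.
    scores = scores.masked_fill(visited_mask, float("-inf"))
    # The attention weights double as the next-node probability distribution.
    return F.softmax(scores, dim=-1)

# Toy usage: batch of 2 instances with 5 cities and 16-dim embeddings.
torch.manual_seed(0)
q = torch.randn(2, 16)
emb = torch.randn(2, 5, 16)
mask = torch.zeros(2, 5, dtype=torch.bool)
mask[:, 0] = True  # the start city is already visited
probs = pointer_step(q, emb, mask)
next_city = probs.argmax(dim=-1)  # greedy decoding; sampling is also common
```

Masking visited cities with negative infinity before the softmax guarantees the decoder never revisits a node; greedy argmax decoding is shown here, though sampling from `probs` is equally common in learned TSP solvers.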

Keywords