International Journal of Applied Earth Observation and Geoinformation (May 2024)

Hierarchical local global transformer for point clouds analysis

  • Dilong Li,
  • Shenghong Zheng,
  • Ziyi Chen,
  • Xiang Li,
  • Lanying Wang,
  • Jixiang Du

Journal volume & issue
Vol. 129
p. 103813

Abstract


Transformer networks have demonstrated remarkable performance in point cloud analysis. However, achieving a balance between local regional context and global long-range context learning remains a significant challenge. In this paper, we propose a Hierarchical Local Global Transformer Network (LGTNet), designed to capture local and global contexts in a hierarchical manner. Specifically, we employ serial local and global Transformers to learn inner-group and cross-group self-attention, respectively. In addition, we propose a geometric moment-based position encoding for the local Transformer, enabling the embedding of comprehensive local geometric relationships. We also introduce a global feature pooling module that extracts global features from each encoder layer. Extensive experimental results demonstrate that LGTNet achieves state-of-the-art performance on the ShapeNetPart and ScanObjectNN datasets. This approach effectively enhances the understanding of point cloud scenes, thereby facilitating the use of point cloud data in remote sensing applications.
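To make the serial local-then-global design concrete, the sketch below shows one such stage in PyTorch under my own assumptions, not the authors' implementation: grouping is done with plain k-nearest neighbours, the positional code uses first- and second-order moments of relative coordinates, and all names (LocalGlobalBlock, knn_group, geometric_moment_encoding, k) are illustrative.

import torch
import torch.nn as nn


def knn_group(xyz, k):
    """Index the k nearest neighbours of each point; returns (B, N, k) indices."""
    dist = torch.cdist(xyz, xyz)                       # (B, N, N) pairwise distances
    return dist.topk(k, dim=-1, largest=False).indices


def geometric_moment_encoding(xyz, idx, mlp):
    """Encode each local group by low-order geometric moments of relative coordinates."""
    B, N, _ = xyz.shape
    k = idx.shape[-1]
    grouped = torch.gather(
        xyz.unsqueeze(1).expand(B, N, N, 3), 2,
        idx.unsqueeze(-1).expand(B, N, k, 3))          # (B, N, k, 3) neighbour coords
    rel = grouped - xyz.unsqueeze(2)                   # coordinates relative to centre point
    first = rel.mean(dim=2)                            # first-order moment, (B, N, 3)
    second = (rel.unsqueeze(-1) * rel.unsqueeze(-2)).mean(dim=2).flatten(2)  # (B, N, 9)
    return mlp(torch.cat([first, second], dim=-1))     # projected positional code


class LocalGlobalBlock(nn.Module):
    """Serial inner-group (local) then cross-group (global) self-attention."""

    def __init__(self, dim, heads=4, k=16):
        super().__init__()
        self.k = k
        self.pos_mlp = nn.Sequential(nn.Linear(12, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, xyz, feat):
        B, N, C = feat.shape
        idx = knn_group(xyz, self.k)                                   # (B, N, k)
        feat = feat + geometric_moment_encoding(xyz, idx, self.pos_mlp)

        # Local (inner-group) attention: each point attends to its k neighbours.
        grouped = torch.gather(
            feat.unsqueeze(1).expand(B, N, N, C), 2,
            idx.unsqueeze(-1).expand(B, N, self.k, C))                 # (B, N, k, C)
        q = feat.reshape(B * N, 1, C)
        kv = grouped.reshape(B * N, self.k, C)
        local_out, _ = self.local_attn(q, kv, kv)
        feat = self.norm1(feat + local_out.reshape(B, N, C))

        # Global (cross-group) attention over all points for long-range context.
        global_out, _ = self.global_attn(feat, feat, feat)
        return self.norm2(feat + global_out)


if __name__ == "__main__":
    xyz = torch.rand(2, 256, 3)                        # toy point cloud coordinates
    feat = torch.rand(2, 256, 64)                      # toy per-point features
    block = LocalGlobalBlock(dim=64)
    print(block(xyz, feat).shape)                      # torch.Size([2, 256, 64])

Because a stage of this kind keeps the per-point feature dimension unchanged, several of them can be stacked with downsampling in between, which is the hierarchical arrangement the abstract refers to.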

Keywords