IEEE Access (Jan 2021)
Pixel-Level Prediction for Ocean Remote Sensing Image Features Fusion Based on Global and Local Semantic Relations
Abstract
With the rapid development of remote-sensing imaging technology, remote-sensing images have become increasingly diverse, and ocean remote-sensing research is attracting growing attention. Because ocean remote-sensing data are complex and the ocean environment is highly variable, detection results for the same target in the same scene can differ across acquisition times. To obtain richer semantic features and better pixel-level prediction, this paper proposes a pixel-level prediction algorithm for ocean remote-sensing images (GLPO-Net) that combines local and global features. First, texture, color, and spatial-relationship features are extracted. Second, the algorithm constructs a multiscale local cross-attention strategy that obtains feature weights in different directions to fully mine the local features of ocean remote-sensing images. Concurrently, the algorithm constructs a multiscale global cross-attention strategy to obtain global features. Then, global and local features are fused in each submodule to obtain more representative deep features. Finally, pixel-level prediction is performed on small-sample ocean remote-sensing images. The proposed algorithm was evaluated on three public ocean remote-sensing datasets. The experimental results show that GLPO-Net can learn features from small samples of ocean remote-sensing images and achieves better prediction performance than other remote-sensing image algorithms.
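The abstract does not give implementation details, so the following is only a minimal sketch of the global–local fusion idea it describes. It assumes PyTorch, a single attention scale rather than the paper's multiscale strategies, and stand-in directional (local) and pooled channel (global) attention modules; all module names, kernel sizes, and tensor shapes are illustrative assumptions, not the authors' design.

```python
# A minimal sketch of the global-local attention fusion described in the abstract.
# Assumptions (not from the paper): PyTorch, single-scale attention, and
# hypothetical module names and shapes.
import torch
import torch.nn as nn


class LocalCrossAttention(nn.Module):
    """Directional (horizontal/vertical) re-weighting of local features."""

    def __init__(self, channels: int):
        super().__init__()
        # Strip convolutions approximate attention weights along two directions.
        self.horizontal = nn.Conv2d(channels, channels, kernel_size=(1, 7), padding=(0, 3))
        self.vertical = nn.Conv2d(channels, channels, kernel_size=(7, 1), padding=(3, 0))
        self.gate = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.gate(self.horizontal(x) + self.vertical(x))
        return x * weights  # re-weight local responses


class GlobalCrossAttention(nn.Module):
    """Global context via pooled channel attention."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.mlp(self.pool(x))
        return x * weights  # re-weight with global channel statistics


class GlobalLocalFusion(nn.Module):
    """Fuse local and global branches and predict per-pixel class scores."""

    def __init__(self, channels: int, num_classes: int):
        super().__init__()
        self.local_branch = LocalCrossAttention(channels)
        self.global_branch = GlobalCrossAttention(channels)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.classifier = nn.Conv2d(channels, num_classes, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        fused = self.fuse(torch.cat([self.local_branch(x), self.global_branch(x)], dim=1))
        return self.classifier(fused)  # pixel-level prediction map


if __name__ == "__main__":
    features = torch.randn(1, 64, 128, 128)  # backbone features (batch, C, H, W)
    logits = GlobalLocalFusion(channels=64, num_classes=5)(features)
    print(logits.shape)  # torch.Size([1, 5, 128, 128])
```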
Keywords