Remote Sensing (Jul 2024)
Semantic Space Analysis for Zero-Shot Learning on SAR Images
Abstract
Semantic feature space plays a bridging role from ‘seen’ to ‘unseen’ classes in zero-shot learning (ZSL). However, because SAR range-based imaging differs drastically from optical imaging, how to construct an appropriate semantic space for SAR ZSL remains a challenging and largely unaddressed issue. In this work, three different semantic feature spaces are explored, constructed from natural language, remote sensing optical images, and web optical images, respectively. Furthermore, three factors in semantic feature learning are investigated: model capacity, dataset scale, and pre-training. In addition, three datasets are introduced for the evaluation of SAR ZSL. Experimental results show that the semantic space constructed from remote sensing images outperforms the other two, and that the quality of the semantic space is significantly affected by factors such as model capacity, dataset scale, and pre-training scheme.
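The bridging role the abstract describes can be illustrated with a minimal sketch: a linear map from image features into a shared semantic space is fit on seen classes only, and unseen classes are then recognized by nearest-prototype matching in that space. Everything below is synthetic and hypothetical (the class prototypes `sem`, the feature generator `make_samples`, and the ridge-regression projection `P` are stand-ins for the paper's learned embeddings, not its actual method):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cls, n_seen, sem_dim, feat_dim = 6, 4, 4, 16

# Hypothetical semantic prototypes (e.g. text or optical-image embeddings),
# one unit vector per class; classes 0-3 are "seen", 4-5 are "unseen".
sem = rng.normal(size=(n_cls, sem_dim))
sem /= np.linalg.norm(sem, axis=1, keepdims=True)

# Synthetic "SAR image features": each sample is a noisy linear image of
# its class prototype, standing in for CNN features of real SAR chips.
W_true = rng.normal(size=(sem_dim, feat_dim))
def make_samples(cls, n):
    return sem[cls] @ W_true + 0.05 * rng.normal(size=(n, feat_dim))

X_seen = np.vstack([make_samples(c, 20) for c in range(n_seen)])
y_seen = np.repeat(np.arange(n_seen), 20)

# Fit a linear projection from feature space to semantic space on seen
# classes only (closed-form ridge regression).
S = sem[y_seen]                    # target semantic vector per sample
lam = 1e-3
P = np.linalg.solve(X_seen.T @ X_seen + lam * np.eye(feat_dim),
                    X_seen.T @ S)

def predict_unseen(x, unseen=(4, 5)):
    """Zero-shot inference: project into semantic space, then pick the
    nearest unseen prototype by cosine similarity."""
    z = x @ P
    z = z / np.linalg.norm(z)
    sims = sem[list(unseen)] @ z
    return unseen[int(np.argmax(sims))]

x_test = make_samples(5, 1)[0]     # sample from a class never trained on
print(predict_unseen(x_test))
```

The quality of the prototypes `sem` is exactly what the paper varies: swapping word vectors for remote sensing image embeddings changes only that matrix, while the projection and inference steps stay fixed.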
Keywords