IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2023)
Multitask GANs for Oil Spill Classification and Semantic Segmentation Based on SAR Images
Abstract
Increasingly frequent marine oil spill disasters cause great harm to the marine ecosystem. As an essential means of remote sensing monitoring, synthetic aperture radar (SAR) imagery enables timely detection of oil spills and thus helps reduce marine pollution. However, many look-alike regions are difficult to distinguish from genuine oil spills in SAR images, and the scarcity of real oil spill data makes it difficult to train deep learning networks effectively. To address these problems, this article designs a multitask generative adversarial networks (MTGANs) oil spill detection model that distinguishes oil spills from look-alikes and segments oil spill areas within a single framework. The discriminator of the first generative adversarial network (GAN) is transformed into a classifier, which can effectively distinguish between real oil spills and look-alikes. The generator of the second GAN integrates a fully convolutional symmetric structure with multiple convolution blocks: the convolution blocks extract shallow oil spill information, while the fully convolutional symmetric structure extracts deeper features. The algorithm requires only a small number of oil spill images as the training set, which mitigates the limitation posed by scarce oil spill data. Validation evaluations are conducted on three datasets from the Sentinel-1, ERS-1/2, and GF-3 satellites, and the experimental results demonstrate that the proposed MTGANs oil spill detection framework outperforms other models in oil spill classification and semantic segmentation: the classification accuracy for oil spills and look-alikes reaches 97.22%, while semantic segmentation of oil spill areas achieves an average overall accuracy (OA) of 97.47% and an average precision of 86.69%.
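The abstract gives no implementation details, but the two core ideas can be illustrated in a minimal NumPy sketch: a discriminator repurposed as a multi-class classifier (outputting probabilities over {oil spill, look-alike, generated} rather than a single real/fake score), and a segmentation generator mapping a SAR patch to a per-pixel oil-spill probability map. All shapes, layer choices, and function names here are hypothetical assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical discriminator head: instead of one real/fake score, it
# outputs probabilities over three classes {oil spill, look-alike, generated}.
def discriminator_head(features, W, b):
    return softmax(features @ W + b, axis=-1)

# Hypothetical segmentation generator: a toy encoder-decoder that maps a
# SAR patch to a per-pixel oil-spill probability map (sigmoid output).
# Real fully convolutional layers are replaced by dense projections here.
def segmentation_generator(patch, W_enc, W_dec):
    h, w = patch.shape
    z = np.tanh(patch.reshape(-1) @ W_enc)   # encoder: flatten and project
    logits = (z @ W_dec).reshape(h, w)       # decoder: project back to patch
    return 1.0 / (1.0 + np.exp(-logits))     # per-pixel probabilities

# Toy shapes: an 8x8 SAR patch, 16-dim features, 3 discriminator classes.
patch = rng.standard_normal((8, 8))
feat = rng.standard_normal(16)
probs = discriminator_head(feat, rng.standard_normal((16, 3)), np.zeros(3))
mask = segmentation_generator(patch, rng.standard_normal((64, 16)),
                              rng.standard_normal((16, 64)))
print(probs.shape, mask.shape)   # (3,) (8, 8)
```

The sketch only shows the interface of the two tasks: one network yields a class distribution (so a cross-entropy loss can separate real spills from look-alikes), and the other yields a dense mask the same size as the input patch.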
Keywords