IEEE Access (Jan 2025)
Can Knowledge Distillation Enable Seismic Interpolation, Super-Resolution, and Denoising Simultaneously?
Abstract
In seismic data processing, interpolation, super-resolution, and denoising are three key tasks for improving data quality. While deep learning methods have successfully addressed one or two of these tasks, resolving all three simultaneously within a single network remains difficult because of conflicting objectives and complex network design. This paper proposes a heterogeneous knowledge distillation architecture comprising a teacher network (TNet) and a student network (SNet). TNet performs the super-resolution and denoising tasks, while SNet performs the interpolation task; this separation avoids goal conflicts between the networks. SNet retains TNet’s structure but halves the number of channels, simplifying both architectures. SNet is trained with a combination of a data consistency loss and a distillation loss. The data consistency loss integrates $\ell _{1}$ and multi-scale structural similarity terms, and the distillation loss employs synchronous positive-negative learning. Consequently, SNet inherits TNet’s capabilities and further refines them during optimization. For comparison, we combine three interpolation algorithms with three super-resolution algorithms and select the combination with the best performance. Experimental results on four field datasets show that our method outperforms this counterpart in data quality at a lower computational cost.
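To make the stated loss composition concrete, a minimal sketch of how the two terms could be combined is shown below; the weights $\alpha$ and $\lambda$ and the exact form of the distillation term are illustrative assumptions rather than the paper's stated formulation.
$$
\mathcal{L}_{\mathrm{SNet}} \;=\; \underbrace{\alpha\,\bigl\lVert \hat{y} - y \bigr\rVert_{1} \;+\; (1-\alpha)\bigl(1 - \mathrm{MS\text{-}SSIM}(\hat{y}, y)\bigr)}_{\text{data consistency}} \;+\; \lambda\,\mathcal{L}_{\mathrm{distill}}\bigl(\hat{y}, \hat{y}_{\mathrm{TNet}}\bigr),
$$
where $\hat{y}$ denotes SNet's output, $y$ the reference data, and $\hat{y}_{\mathrm{TNet}}$ the teacher network's output used as the distillation target.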
Keywords