IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)
A NeRF-Based Color Consistency Correction Method for Remote Sensing Images
Abstract
Remote sensing images are prone to significant photometric variations caused by changing seasons, illumination, and atmospheric conditions. These variations often produce visible stitching seams at the edges of mosaic images, degrading the visual quality and interpretability of the data. To address color inconsistencies in remote sensing images, conventional methods rely on absolute radiometric correction and relative radiometric normalization. However, these approaches may not handle complex variations effectively or yield visually pleasing results. This article introduces a novel approach based on neural radiance fields (NeRFs) for correcting color inconsistencies in multiview images. Our method leverages implicit representations and re-illumination in the feature space to capture the intrinsic radiance and reflectance properties of the scene. By weaving image features together, we generate a fusion image that seamlessly integrates color information from multiple views, resulting in improved color consistency and reduced stitching seams. To evaluate the effectiveness of our approach, we conducted experiments on satellite and unmanned aerial vehicle images with significant variations in coverage and acquisition time. The experimental results demonstrate that our NeRF-based method produces synthesized images with excellent visual quality and smooth color transitions at the edges. The fusion images exhibit enhanced color consistency, effectively reducing visible stitching seams and improving overall image quality.
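The core idea described above, an implicit scene representation queried per view whose outputs are blended into a single color-consistent image, can be illustrated with a minimal sketch. The code below is a conceptual illustration only, not the authors' implementation; the ImplicitRadianceField module, its layer sizes, and the softmax blending of per-view colors are assumptions introduced purely for demonstration.

# Minimal conceptual sketch (not the authors' implementation): a NeRF-style MLP
# maps a 3-D point and a viewing direction to a shared "intrinsic" feature, from
# which per-view colors are decoded and blended into a color-consistent fusion.
# All layer sizes and the blending scheme are illustrative assumptions.

import torch
import torch.nn as nn


class ImplicitRadianceField(nn.Module):
    def __init__(self, feat_dim: int = 64, n_views: int = 2):
        super().__init__()
        # Shared implicit representation of the scene (position -> feature).
        self.backbone = nn.Sequential(
            nn.Linear(3, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, feat_dim),
        )
        # View-conditioned "re-illumination" head: feature + view direction -> RGB.
        self.color_head = nn.Sequential(
            nn.Linear(feat_dim + 3, 64), nn.ReLU(),
            nn.Linear(64, 3), nn.Sigmoid(),
        )
        # Learnable per-view blending weights for the fused output.
        self.view_logits = nn.Parameter(torch.zeros(n_views))

    def forward(self, xyz: torch.Tensor, view_dirs: torch.Tensor) -> torch.Tensor:
        # xyz: (N, 3) sample points; view_dirs: (V, N, 3), one direction set per view.
        feat = self.backbone(xyz)                                  # (N, F)
        colors = torch.stack(
            [self.color_head(torch.cat([feat, d], dim=-1)) for d in view_dirs]
        )                                                          # (V, N, 3)
        weights = torch.softmax(self.view_logits, dim=0)           # (V,)
        return torch.einsum("v,vnc->nc", weights, colors)          # fused RGB (N, 3)


if __name__ == "__main__":
    model = ImplicitRadianceField(n_views=2)
    pts = torch.rand(1024, 3)
    dirs = torch.rand(2, 1024, 3)
    fused_rgb = model(pts, dirs)
    print(fused_rgb.shape)  # torch.Size([1024, 3])

In this toy setup, the shared backbone plays the role of the scene's intrinsic (view-independent) properties, while the view-conditioned head and the learned blending weights stand in for the re-illumination and multi-view fusion steps; the paper's actual network and fusion strategy are detailed in the following sections.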
Keywords