GIScience & Remote Sensing (Dec 2025)
Multi-sensor near-realtime burnt area monitoring using a superpixel-based graph convolutional network approach
Abstract
Recent disastrous wildfire seasons highlight the urgent need for timely and accurate wildfire data to support relief efforts, monitor environmental impacts, and inform the public. While satellite-based thermal anomaly data are available in near real-time (NRT), deriving the actual fire-affected areas from NRT imagery remains challenging. The proposed methodology combines a superpixel segmentation algorithm with rule-based and deep learning classification techniques to accurately derive burnt areas (BA) in NRT. The approach supports a range of mid- to high-resolution optical sensors and fuses data from diverse sources to continuously refine the burnt area while active fires are being monitored. The NRT (DLRBAv2NRT) and the refined non-time-critical (DLRBAv2NTC) BA products based on mid-resolution Sentinel-3 imagery were produced and tested against established global BA products for the 2023 wildfire seasons in Greece and British Columbia (Canada) and the 2023/2024 season in Central Chile. DLRBAv2NTC classified BA with the highest accuracies across all study regions (avg. IoU: 0.71; avg. F1-score: 0.83). Despite its NRT processing capability, DLRBAv2NRT achieved comparable accuracies (avg. IoU: 0.69; avg. F1-score: 0.81) and outperformed the well-established and widely used global NASA burnt area product MCD64A1v061 by +2% (IoU) and +1% (F1-score). Furthermore, the multi-sensor and data-fusion capability of the methodology was successfully demonstrated for the 2024 Valparaiso fire in Chile. The proposed mapping procedure provides a fully automated and flexible approach to deriving burnt area delineations from satellite data in NRT with high accuracy, enabling high-frequency NRT burnt area monitoring on a global scale.
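To make the superpixel-plus-graph-convolution idea concrete, the sketch below is a minimal conceptual illustration, not the authors' pipeline: it segments a placeholder image into superpixels with SLIC, derives a region adjacency graph, and applies one untrained GCN-style propagation step over mean superpixel features. The random input image, the mean-reflectance features, the two-class output, and the weight matrix are all illustrative assumptions; no training or rule-based refinement is shown.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# superpixels -> region adjacency graph -> one GCN-style propagation step.
import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(0)
image = rng.random((128, 128, 3))            # placeholder for a satellite scene

# 1) Superpixel segmentation (SLIC); labels run 0..n-1
labels = slic(image, n_segments=200, compactness=10, start_label=0)
n = labels.max() + 1

# 2) Node features: mean value per superpixel and per band
counts = np.bincount(labels.ravel(), minlength=n)
feats = np.stack(
    [np.bincount(labels.ravel(), weights=image[..., b].ravel(), minlength=n) / counts
     for b in range(image.shape[2])],
    axis=1,
)

# 3) Region adjacency: superpixels sharing a pixel edge are connected
adj = np.zeros((n, n), dtype=bool)
adj[labels[:, :-1].ravel(), labels[:, 1:].ravel()] = True   # horizontal neighbours
adj[labels[:-1, :].ravel(), labels[1:, :].ravel()] = True   # vertical neighbours
adj = adj | adj.T
np.fill_diagonal(adj, True)                                  # self-loops (A + I)

# 4) One GCN-style step: H = ReLU(D^-1/2 (A+I) D^-1/2 X W), W untrained here
a = adj.astype(float)
d_inv_sqrt = np.diag(1.0 / np.sqrt(a.sum(axis=1)))
a_norm = d_inv_sqrt @ a @ d_inv_sqrt
W = rng.standard_normal((feats.shape[1], 2))                 # 2 classes: burnt / unburnt
hidden = np.maximum(a_norm @ feats @ W, 0.0)

print(hidden.shape)   # (n_superpixels, 2) per-superpixel class scores
```

In the described workflow, the graph nodes would presumably carry multi-sensor spectral features and the network would be trained to separate burnt from unburnt superpixels; that training and the rule-based fusion steps are outside the scope of this sketch.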
Keywords