Scientific Reports (Jul 2024)
Uncertainty quantification for probabilistic machine learning in earth observation using conformal prediction
Abstract
Machine learning is increasingly applied to Earth Observation (EO) data to obtain datasets that contribute towards international accords. However, these datasets contain inherent uncertainty that needs to be quantified reliably to avoid negative consequences. In response to the increased need to report uncertainty, we bring attention to the promise of conformal prediction within the domain of EO. Unlike previous uncertainty quantification methods, conformal prediction offers statistically valid prediction regions while concurrently supporting any machine learning model and data distribution. To support the need for conformal prediction, we reviewed EO datasets and found that only 22.5% of the datasets incorporated a degree of uncertainty information, with unreliable methods prevalent. Current open implementations require moving large amounts of EO data to the algorithms. We introduced Google Earth Engine native modules that bring conformal prediction to the data and compute, facilitating the integration of uncertainty quantification into existing traditional and deep learning modelling workflows. To demonstrate the versatility and scalability of these tools we apply them to valued EO applications spanning local to global extents, regression, and classification tasks. Subsequently, we discuss the opportunities arising from the use of conformal prediction in EO. We anticipate that accessible and easy-to-use tools, such as those provided here, will drive wider adoption of rigorous uncertainty quantification in EO, thereby enhancing the reliability of downstream uses such as operational monitoring and decision-making.
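The split conformal procedure the abstract refers to can be sketched in a few lines. The example below is a minimal, generic illustration of split conformal prediction for regression using NumPy; the synthetic data and the simple least-squares predictor are assumptions for illustration only, not the authors' Google Earth Engine modules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data: y = 2x + Gaussian noise (illustrative only).
x = rng.uniform(0, 10, size=2000)
y = 2 * x + rng.normal(0, 1, size=2000)

# Split into a proper training set and a held-out calibration set.
x_train, y_train = x[:1000], y[:1000]
x_cal, y_cal = x[1000:], y[1000:]

# Conformal prediction wraps any point predictor; a least-squares line
# stands in here for an arbitrary machine learning model.
slope, intercept = np.polyfit(x_train, y_train, deg=1)

def predict(x):
    return slope * x + intercept

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for a 90% target coverage level (alpha = 0.1).
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Statistically valid prediction interval for a new point: [f(x) - q, f(x) + q].
x_new = 5.0
lo, hi = predict(x_new) - q, predict(x_new) + q
print(f"90% prediction interval at x={x_new}: [{lo:.2f}, {hi:.2f}]")
```

The appeal noted in the abstract is visible here: the interval's coverage guarantee requires no assumption about the model or the data distribution beyond exchangeability of the calibration and test points.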
Keywords