SOIL (Aug 2020)

Game theory interpretation of digital soil mapping convolutional neural networks

  • J. Padarian,
  • A. B. McBratney,
  • B. Minasny

DOI
https://doi.org/10.5194/soil-6-389-2020
Journal volume & issue
Vol. 6
pp. 389–397

Abstract

The use of complex models such as deep neural networks has yielded large improvements in predictive tasks in many fields, including digital soil mapping. One concern about these models is that they are perceived as black boxes with low interpretability. In this paper we introduce the use of game theory, specifically Shapley additive explanations (SHAP) values, to interpret a digital soil mapping model. SHAP values represent the contribution of a covariate to the final model predictions. We applied this method to a multi-task convolutional neural network trained to predict soil organic carbon in Chile. The results show the contribution of each covariate to the model predictions in three different contexts: (a) at a local level, showing the contribution of the covariates to a single prediction; (b) a global understanding of the covariate contributions; and (c) a spatial interpretation of those contributions. The latter constitutes a novel application of SHAP values and the first detailed analysis of a model in a spatial context. The analysis of the SOC (soil organic carbon) model for Chile corroborated that the model captures sensible relationships between SOC and rainfall, temperature, elevation, slope, and topographic wetness index. The results agree with commonly reported relationships and highlight environmental thresholds that coincide with significant areas within the study region. This contribution addresses the limitations of current model interpretation in digital soil mapping, especially in a spatial context. We believe that SHAP values are a valuable tool that should be included within the DSM (digital soil mapping) framework, since they address important concerns about the interpretability of more complex models. Model interpretation is a crucial step that could generate new knowledge and improve our understanding of soils.
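
To make the SHAP workflow concrete, the sketch below shows how covariate contributions could be computed for a trained convolutional neural network with the Python shap library. This is a minimal illustration under stated assumptions, not the authors' implementation: the model file, covariate list, array shapes, and file names are hypothetical, and shap.DeepExplainer is only one of several applicable explainers (shap.GradientExplainer is a common alternative for deep networks).

```python
# Minimal sketch: SHAP attributions for a CNN that predicts soil organic
# carbon (SOC) from environmental covariate windows. All names, shapes,
# and files below are illustrative assumptions, not the paper's code.
import numpy as np
import shap
from tensorflow import keras

# Hypothetical trained CNN; inputs are small raster windows with one
# channel per covariate (e.g. rainfall, temperature, elevation, slope, TWI).
model = keras.models.load_model("soc_cnn.h5")             # assumed file
X_train = np.load("covariate_windows_train.npy")          # shape (n, w, w, c)
X_explain = np.load("covariate_windows_explain.npy")      # shape (m, w, w, c)

# A small background sample defines the baseline expectation that the
# SHAP contributions are measured against.
rng = np.random.default_rng(0)
background = X_train[rng.choice(len(X_train), size=100, replace=False)]

explainer = shap.DeepExplainer(model, background)
shap_values = explainer.shap_values(X_explain)

# Multi-output (multi-task) models return one array per output; take the
# first task here purely for illustration.
sv = shap_values[0] if isinstance(shap_values, list) else shap_values

# (b) global view: mean absolute contribution per covariate, averaged
# over samples and over the spatial window.
covariates = ["rainfall", "temperature", "elevation", "slope", "twi"]
mean_abs = np.abs(sv).mean(axis=(0, 1, 2))
for name, value in zip(covariates, mean_abs):
    print(f"{name}: mean |SHAP| = {value:.4f}")
```

The per-covariate mean absolute SHAP values give the global view (b); keeping the attributions for each individual prediction supports the local view (a), and mapping those per-location values back onto the raster grid would produce the spatial interpretation (c) described in the abstract.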