Remote Sensing (Sep 2020)

Evaluation of GEOS-Simulated L-Band Microwave Brightness Temperature Using Aquarius Observations over Non-Frozen Land across North America

  • Jongmin Park,
  • Barton A. Forman,
  • Rolf H. Reichle,
  • Gabrielle De Lannoy,
  • Saad B. Tarik

DOI: https://doi.org/10.3390/rs12183098
Journal volume & issue: Vol. 12, no. 18, p. 3098

Abstract

L-band brightness temperature (Tb) is one of the key remotely sensed variables that provides information about surface soil moisture conditions. To harness the information in Tb observations, a radiative transfer model (RTM) is investigated for eventual inclusion into a data assimilation framework. In this study, Tb estimates from the RTM implemented in the NASA Goddard Earth Observing System (GEOS) were evaluated against the nearly four-year record of daily Tb observations collected by L-band radiometers onboard the Aquarius satellite. Statistics between the modeled and observed Tb were computed over North America as a function of soil hydraulic properties and vegetation types. Overall, statistics showed good agreement between the modeled and observed Tb with a relatively low, domain-average bias (0.79 K (ascending) and −2.79 K (descending)), root mean squared error (11.0 K (ascending) and 11.7 K (descending)), and unbiased root mean squared error (8.14 K (ascending) and 8.28 K (descending)). In terms of soil hydraulic parameters, large porosity and large wilting point both lead to high uncertainty in modeled Tb due to the large variability in dielectric constant and surface roughness used by the RTM. The performance of the RTM as a function of vegetation type suggests better agreement in regions with broadleaf deciduous and needleleaf forests, while grassland regions exhibited the worst accuracy amongst the five different vegetation types.
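The three evaluation metrics cited above (bias, RMSE, and unbiased RMSE) follow standard definitions. As a minimal sketch, assuming paired arrays of collocated modeled and observed Tb (the function name, array shapes, and synthetic values below are illustrative assumptions, not the paper's data or code), they can be computed as:

```python
import numpy as np

def tb_evaluation_stats(tb_model, tb_obs):
    """Compute bias, RMSE, and unbiased RMSE (ubRMSE) between
    modeled and observed brightness temperature [K]."""
    tb_model = np.asarray(tb_model, dtype=float)
    tb_obs = np.asarray(tb_obs, dtype=float)

    diff = tb_model - tb_obs
    bias = diff.mean()                        # mean(model - obs)
    rmse = np.sqrt(np.mean(diff ** 2))        # root mean squared error
    ubrmse = np.sqrt(rmse ** 2 - bias ** 2)   # RMSE after removing the mean bias

    return {"bias": bias, "rmse": rmse, "ubrmse": ubrmse}

# Illustrative example with synthetic Tb values [K] (not paper results)
rng = np.random.default_rng(0)
tb_obs = 260.0 + 10.0 * rng.standard_normal(1000)
tb_model = tb_obs + 1.0 + 8.0 * rng.standard_normal(1000)
print(tb_evaluation_stats(tb_model, tb_obs))
```

In this decomposition, ubRMSE = sqrt(RMSE² − bias²) isolates the random-error component of the mismatch, which is why it is reported alongside the bias and total RMSE in the abstract.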

Keywords