Environmental Research Letters (Jan 2019)
Embracing the complexity of extreme weather events when quantifying their likelihood of recurrence in a warming world
Abstract
Global-average temperatures are a powerful metric, both for long-term climate change policy and for measuring the aggregate fluctuations in weather experienced around the world. However, here we show that considering anomalies in annual temperatures at the global land-average scale, particularly during extremely hot years, tends to overestimate the severity of extreme heat actually felt by local communities during these events. Consequently, when global-mean temperatures are used as a proxy to infer the role of climate change in the likelihood of witnessing hot years, the component of extreme event risk attributed to human influence can also be overstated. This study suggests multiple alternative approaches to characterising extreme weather events with complex spatial signatures, each of which improves the representation of experiences of the event when compared with the default approach of using area-averaged time series. However, as the definition of an extreme event becomes more specific to the observed characteristics witnessed, changes are needed in the way researchers discuss the likelihood of witnessing ‘similar events’ under future climate change. Using the example of the record-breaking hot year of 2016, we propose an alternative framework, termed the ‘Time of Maximum Similarity’, to show that events like the annual temperatures of 2016 are most likely to be witnessed between 2010 and 2037, with hot years thereafter becoming significantly more severe than the heat of 2016.
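The ‘Time of Maximum Similarity’ idea can be sketched in toy form: under a projected warming trajectory with interannual noise, ask in which year an anomaly resembling the 2016 event is most probable. Note this is an illustrative caricature, not the paper's actual method; the trend rate, noise level, event magnitude, and tolerance below are all assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all numbers assumed, not from the paper):
# annual temperature anomaly = linear warming trend + interannual noise.
years = np.arange(2000, 2101)
trend = 0.03 * (years - 2000)               # assumed 0.03 degC/yr warming
noise_sd = 0.15                             # assumed interannual variability (degC)
event_2016 = trend[years == 2016][0] + 0.3  # a record-hot, 2016-like anomaly

# "Similarity" of a given year to the 2016 event: the probability that
# the year's anomaly falls within a small tolerance of the 2016 value,
# estimated by Monte Carlo over the interannual noise.
tol = 0.1
n_draws = 100_000
draws = trend[:, None] + rng.normal(0.0, noise_sd, size=(len(years), n_draws))
p_similar = np.mean(np.abs(draws - event_2016) < tol, axis=1)

# The year at which this probability peaks is the toy analogue of the
# "Time of Maximum Similarity"; later years overshoot the 2016 anomaly.
t_max_similarity = years[np.argmax(p_similar)]
print(t_max_similarity)
```

In this caricature the probability of a 2016-like anomaly first rises as the warming trend approaches the event magnitude, then falls as typical years become hotter than 2016, which mirrors the abstract's point that hot years after the similarity window become more severe than the 2016 event rather than merely comparable to it.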
Keywords