Exceeding the 1.5°C warming threshold seems inevitable

The Paris Agreement of 2015 legally obliges the international community to keep global anthropogenic warming well below 2.0 °C above pre-industrial levels, and to pursue efforts to limit it to 1.5 °C. Despite recent efforts to limit carbon emissions, current trends suggest that the 1.5 °C threshold is likely to be exceeded in the coming decades. The 2018 IPCC special report stated that warming is likely (66 % probability) to reach 1.5 °C between 2030 and 2052 if it continues at the current rate.

Traditional approaches and their challenges

Complex Earth System Models (ESMs) are commonly used to predict the timing of such exceedances. However, these models are computationally intensive, and their internal variability is model-dependent, which can significantly bias their projections. Issues such as the "hot model" problem and inter-model spread further increase the uncertainty of ESM projections. In addition, the 20-year averaging that the IPCC uses to filter out natural cycles may not be sufficient given the multidecadal variability of the climate system.

Innovative data-driven approach

As an alternative or complement to ESMs, a purely data-driven stochastic approach can be used, based solely on observed global temperatures. This approach rests on the insight that the global mean temperature is composed of an anthropogenic warming trend (caused by human factors such as greenhouse gases, aerosols, and land-use changes) and stochastic natural fluctuations (arising from natural variability, e.g. the solar cycle and volcanism). Natural trends are superimposed on the anthropogenic trend and can either amplify or dampen it. Correctly accounting for natural variability is therefore crucial for estimating the magnitude of current and future anthropogenic trends.
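To make the decomposition concrete, here is a minimal sketch, assuming synthetic monthly anomalies and a simple linear fit as a stand-in for the anthropogenic trend (the study estimates the trend more carefully; all numbers here are illustrative):

```python
# Decomposition idea: observed anomaly T(t) = anthropogenic trend A(t)
# + natural fluctuations N(t). A linear fit stands in for A(t);
# the residuals stand in for N(t). Data are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(12 * 54)                  # monthly steps, 1970-2023
anomaly = 0.0015 * months + 0.1 * rng.standard_normal(months.size)

# Estimate the anthropogenic trend by ordinary least squares ...
slope, intercept = np.polyfit(months, anomaly, 1)
anthropogenic = slope * months + intercept

# ... and take the residuals as an estimate of the natural fluctuations.
natural = anomaly - anthropogenic
print(f"estimated warming rate: {12 * slope:.4f} °C per year")
```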

The basis of this approach is the use of the persistence properties of observed global temperatures, specifically their long-term persistence with a Hurst exponent close to 1. This property makes it possible to generate simulated data (so-called "surrogate data") that reflect the natural variability of the global mean temperature.
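As an illustration, one common way to generate long-term persistent surrogates is the Fourier filtering method, where the power spectrum follows S(f) ~ f^(-beta) with beta = 2H − 1 for fractional Gaussian noise. This is a sketch of that standard construction, not necessarily the exact generator used in the study:

```python
# Surrogate data with a prescribed Hurst exponent H via Fourier filtering:
# shape white-noise Fourier amplitudes to S(f) ~ f^(-beta), beta = 2H - 1.
import numpy as np

def surrogate(n, hurst, rng):
    """Length-n long-term persistent series with Hurst exponent `hurst`."""
    beta = 2 * hurst - 1                     # spectral exponent for fGn
    freqs = np.fft.rfftfreq(n)
    amplitudes = np.zeros_like(freqs)
    amplitudes[1:] = freqs[1:] ** (-beta / 2)
    phases = rng.uniform(0, 2 * np.pi, freqs.size)
    series = np.fft.irfft(amplitudes * np.exp(1j * phases), n)
    return (series - series.mean()) / series.std()

rng = np.random.default_rng(1)
ensemble = np.array([surrogate(648, 0.95, rng) for _ in range(1000)])
print(ensemble.shape)                        # 1000 surrogates x 648 months
```

An ensemble like this provides the distribution of natural trends against which the observed trend can be compared.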

Methodology

The process involves three main steps, applied to the Berkeley Earth and HadCRUT data:

  • Long-term persistence analysis: Detrended Fluctuation Analysis of order 3 (DFA3) is used to determine the Hurst exponent, which quantifies how the natural fluctuations of the record change with the length of the time window (see the sketch after this list). A higher Hurst exponent indicates stronger long-term persistence and a more pronounced "hill-valley" structure.
  • Generating simulated time series: A large number of simulated time series with the same Hurst exponent as the observed data are created. These simulated records serve as a surrogate ensemble for the monthly data between 1970 and 2023.
  • Taking ENSO into account: For more accurate estimates, the influence of the El Niño–Southern Oscillation (ENSO) is removed, leading to a refined set of surrogate data. This approach allows the q-quantiles of the natural trends to be determined, which in turn quantify the uncertainty in the anthropogenic trend.
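For readers curious how DFA works, here is a minimal sketch of DFA of order 3 (DFA3): the cumulative profile of the series is split into windows, a cubic polynomial is removed from each window, and the fluctuation function F(s) ~ s^alpha is fitted, with alpha estimating the Hurst exponent. This is illustrative code, not the study's implementation:

```python
# DFA3: detrend the cumulative profile window-by-window with cubic fits,
# then read the Hurst exponent off the log-log slope of F(s) vs s.
import numpy as np

def dfa(series, scales, order=3):
    profile = np.cumsum(series - series.mean())
    fluctuations = []
    for s in scales:
        n_windows = profile.size // s
        windows = profile[: n_windows * s].reshape(n_windows, s)
        x = np.arange(s)
        residuals = [
            np.mean((w - np.polyval(np.polyfit(x, w, order), x)) ** 2)
            for w in windows                 # squared residuals per window
        ]
        fluctuations.append(np.sqrt(np.mean(residuals)))
    # Slope of log F(s) versus log s gives the fluctuation exponent alpha.
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

rng = np.random.default_rng(2)
noise = rng.standard_normal(4096)
print(f"white noise, expect ~0.5: {dfa(noise, [16, 32, 64, 128, 256]):.2f}")
```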

Key findings and predictions

Assuming that anthropogenic warming continues at the current rate, the study estimates the times at which the 1.5 °C and 2.0 °C thresholds will be exceeded (a sketch of the underlying logic follows the list):

  • Threshold 1.5 °C:
    • Best estimate: 2031 (Berkeley Earth) and 2035 (HadCRUT).
    • Very likely range (90 %): 2019–2052 (Berkeley Earth) and 2022–2057 (HadCRUT).
    • Likely range (66 %): 2024–2042 (Berkeley Earth) and 2027–2046 (HadCRUT).
  • Threshold 2.0 °C:
    • Best estimate: 2056 (Berkeley Earth) and 2060 (HadCRUT).
    • Very likely range (90 %): 2039–2086 (Berkeley Earth) and 2042–2091 (HadCRUT).
    • Likely range (66 %): 2046–2071 (Berkeley Earth) and 2049–2075 (HadCRUT).
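A minimal sketch of the logic behind such ranges: extrapolate a linear anthropogenic trend and shift it by quantiles of the natural-trend distribution. The current level, warming rate, and spread below are illustrative assumptions, not values from the study (which derives its quantiles from the surrogate ensemble):

```python
# Crossing-time ranges: a threshold is reached earlier or later depending
# on whether natural variability adds to or subtracts from the trend.
# All numbers are illustrative assumptions, not the study's values.
current_level = 1.3          # assumed °C above pre-industrial today
reference_year = 2023
rate = 0.02                  # assumed anthropogenic warming, °C per year
natural_sigma = 0.08         # assumed spread of natural trends, °C

def crossing_year(threshold, natural_offset):
    """Year at which the trend plus a natural offset reaches the threshold."""
    return reference_year + (threshold - current_level - natural_offset) / rate

for threshold in (1.5, 2.0):
    # +/- 1.645 sigma offsets span a 90 % ("very likely") range.
    early = crossing_year(threshold, 1.645 * natural_sigma)
    best = crossing_year(threshold, 0.0)
    late = crossing_year(threshold, -1.645 * natural_sigma)
    print(f"{threshold} °C: best {best:.0f}, 90 % range {early:.0f}-{late:.0f}")
```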

Comparison with IPCC models

The results of the data-driven approach are surprisingly consistent with IPCC projections based on the SSP2-4.5 (medium emissions) and SSP3-7.0 (high emissions) scenarios, especially for the near term (around 2031) and medium term (around 2051). This agreement is significant because the two approaches are methodologically completely independent. While the uncertainty in ESM projections is related to internal variability and the representation of physical processes, the uncertainty in the data-driven approach stems from the statistical estimation of the past natural trend. The fact that these independent methods yield similar results strongly supports both of them.

The study indicates that exceeding the 1.5 °C warming threshold seems inevitable. Nevertheless, there is hope that exceeding 2.0 °C could be avoided if global emissions are eliminated before 2050. However, as this may not be feasible or sufficient, it is necessary to consider overshoot management, i.e. returning below 2.0 °C through negative emissions.

Imagine trying to navigate a ship through unpredictable oceans where the weather is constantly changing. You have two types of forecasts available: a comprehensive satellite system (representing Earth System Models) and a simpler but robust system based on long-term observations of winds and currents (representing the data-driven approach). Although these systems are independent and each has its own strengths and sources of uncertainty, they show surprisingly similar paths and risk zones for the approaching storm. This gives you stronger confidence in the inevitability of the approaching weather and in the need to prepare for its consequences, including a short-term overshoot of stormy conditions, if you want to eventually return to calmer waters.
