A comparison of two methods for detecting abrupt changes in the variance of climatic time series
Abstract. Two methods for detecting abrupt shifts in variance, the Integrated Cumulative Sum of Squares (ICSS) and the Sequential Regime Shift Detector (SRSD), have been compared on both synthetic and observed time series. In Monte Carlo experiments, SRSD outperformed ICSS in the overwhelming majority of the modeled scenarios with different sequences of variance regimes. The SRSD advantage was particularly apparent when the series contained outliers. On the other hand, SRSD has more parameters to adjust than ICSS, so selecting them properly requires more experience from the user. ICSS can therefore serve as a good starting point for a regime shift analysis. When tested on climatic time series, both methods detected the same change points in most of the longer series (252–787 monthly values). The only exception was the Arctic Ocean sea surface temperature (SST) series, for which ICSS found one extra change point that appeared to be spurious. For the shorter time series (66–136 yearly values), ICSS failed to detect any change points even when the variance doubled or tripled from one regime to another; for these series, SRSD is recommended. Interestingly, all the climatic time series tested, from the Arctic to the tropics, had one thing in common: the last shift detected in each series was toward a high-variance regime. This is consistent with other findings of increased climate variability in recent decades.
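The abstract does not reproduce either algorithm, but the core of ICSS, in the Inclán–Tiao formulation on which the method is based, is a centered cumulative sum of squares whose absolute maximum flags a candidate variance change point. The following is a minimal Python sketch of that single-change-point test on a synthetic series whose variance doubles partway through, assuming a demeaned series and the asymptotic 95% critical value of 1.358 for sqrt(T/2)·max|D_k|; the full ICSS procedure applies this test iteratively to sub-segments, and SRSD (a sequential scheme with additional tuning parameters) is not shown.

```python
import numpy as np

def icss_single_shift(x, crit=1.358):
    """Locate a single variance change point via the centered cumulative
    sum of squares D_k = C_k/C_T - k/T (Inclan-Tiao statistic).
    crit = 1.358 is the asymptotic 95% critical value for
    sqrt(T/2) * max|D_k| under i.i.d. normal data."""
    x = np.asarray(x, dtype=float) - np.mean(x)  # work with deviations from the mean
    T = len(x)
    C = np.cumsum(x**2)                  # C_k, k = 1..T
    k = np.arange(1, T + 1)
    D = C / C[-1] - k / T                # centered statistic D_k
    kstar = int(np.argmax(np.abs(D)))    # candidate change point (0-based)
    stat = np.sqrt(T / 2.0) * np.abs(D[kstar])
    return (kstar + 1, stat) if stat > crit else (None, stat)

# Illustrative synthetic series: variance doubles (1 -> 2) at t = 300
rng = np.random.default_rng(42)
x = np.concatenate([rng.normal(0, 1.0, 300),
                    rng.normal(0, np.sqrt(2.0), 200)])
cp, stat = icss_single_shift(x)
print(f"change point near t={cp}, statistic={stat:.2f}")
```

On a series like this the statistic comfortably exceeds the critical value and the estimated change point lands near t = 300. One caveat, echoing the paper's motivation for a Monte Carlo comparison: real climatic series (autocorrelated, with outliers) violate the i.i.d. normal assumption behind the 1.358 threshold, which is where the relative behavior of ICSS and SRSD becomes the question of interest.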