A growing body of evidence indicates that anthropogenic greenhouse gases are changing Earth's climate, and that those changes may involve not only shifts in climatic means but also changes in climatic variability. Climate models may be informative about these future changes, but their use is complicated by the fact that they do not capture variability in the current climate well. Many methods have therefore been developed to combine models and data in simulations of future climate, but current methods generally account only for changes in marginal variation and do not capture projected changes in correlation (spatial, temporal, spatiotemporal). We develop here a procedure to simulate future daily mean temperature that modifies climate observations based on changes in the mean and spectral density suggested by climate model output, and illustrate our methodology with projections from CCSM3 (the Community Climate System Model version 3). We are able to simulate a future climate with changing temporal covariance while largely retaining non-Gaussian features of the observations. Our results suggest that in CCSM3, at most locations and most timescales, variability in daily mean temperature decreases under anthropogenic warming. The methodology presented here applies only to fully equilibrated future climate states, but may be extended to simulating transient states as well.

With mounting evidence indicating that Earth's climate is changing

Comparison of modeled and observed global mean temperatures.

Changes in variability in both temperature and precipitation are physically
plausible. For precipitation, standard physics would suggest increases in
both spatial and temporal variability, with dry areas drier, wet areas
wetter, and rainfall occurring in more intense events

One complication to analyses of potential future changes in climate
variability is that while the deterministic climate models used for long-term
climate forecasts appear to capture trends, they do not accurately reproduce
observed current climate. These models, known as atmosphere–ocean general
circulation models (AOGCMs), are physically based numerical simulations of
transport of energy and moisture in the atmosphere and ocean, typically with
separate submodels for the atmosphere, ocean, sea ice, and vegetation. Many
AOGCMs successfully reproduce observed large-scale circulation, atmospheric
structure, latitudinal temperature gradients, storm tracks, and
quasi-periodic interdecadal phenomena such as the El Niño–Southern
Oscillation. When driven with historical records of CO

The comparisons above suggest that climate models may be informative about
changes in climate, even while failing to capture certain current
characteristics. This is well-demonstrated for means (again, see
Fig.

(Left) Three locations (individual model pixels) used as examples
throughout the manuscript, chosen to represent different combinations of
seasonality, variability, and expected future changes: Illinois,
mid-continental with a strong seasonal component (green, 38.97

Marginal densities (by season) of daily mean temperature
(

Many methods for combining observations with model output in climate
projections have been developed for use in impacts studies, especially those
involving hydrology and agriculture

The other approach to downscaling, dynamic downscaling, involves running a regional climate model (RCM) at higher spatial resolution over a much smaller spatial domain, with boundary conditions supplied by an AOGCM.

. We provide a brief summary of existing approaches, along with what we consider the primary shortcomings of each.

All approaches that combine observations and model output in simulating
future climates correct in some way for model–observation discrepancies. One
approach is a simple “bias correction” in which any offsets between current
observed and modeled present-day climate are assumed to be systematic model
errors. Model simulations of future climate are then “corrected” by adding
the present-day bias (determined by comparing observations to a baseline
run). Bias corrections can be made on annual mean temperatures or, more
commonly, on monthly mean temperatures or annual harmonics, since models may
not perfectly capture observed seasonal variation. One drawback of this
approach is that all higher-order moments of the marginal and joint
probability distributions (variability, skewness, stationarity, etc.) are
provided by the future model output. As we have seen in
Fig.
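As a rough sketch (in Python, with hypothetical array names; the source does not prescribe an implementation), this additive bias correction amounts to:

```python
import numpy as np

def bias_correct(obs, model_base, model_future):
    """Additive bias correction: shift future model output by the mean
    present-day model-observation offset, treated as a systematic error."""
    bias = np.mean(obs) - np.mean(model_base)
    return model_future + bias

# toy check: a model running uniformly 2 degrees cold
obs = np.array([10.0, 11.0, 9.0])
model_base = obs - 2.0           # biased baseline run
model_future = model_base + 3.0  # modeled warming of 3 degrees
corrected = bias_correct(obs, model_base, model_future)
```

Note that all higher-order moments of `corrected` come from the model's future run, which is exactly the drawback discussed above.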

A variant on this approach, typically termed “bias correction/spatial
disaggregation” (BCSD), attempts to provide a better approximation of
observed climate distributions by separately bias-correcting the different
quantiles of model output
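One standard way to bias-correct quantile by quantile is empirical quantile mapping; the sketch below (hypothetical and much simpler than operational BCSD, which also involves spatial disaggregation) maps each future model value through the baseline model's empirical CDF into the observed quantiles:

```python
import numpy as np

def quantile_map(obs, model_base, model_future):
    """Empirical quantile mapping: find each future value's probability
    position within the baseline model run, then read off the observed
    quantile at that probability."""
    base_sorted = np.sort(model_base)
    probs = np.searchsorted(base_sorted, model_future, side="right") / len(model_base)
    probs = np.clip(probs, 0.0, 1.0)
    return np.quantile(obs, probs)
```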

While the previous two approaches are model-based, i.e., they quantify
present-day model–observation discrepancy and apply it to future model
output, the “change-factor” or “delta” method

Because the current climate is transient and the changes resulting from increased GHG emissions are not yet fully realized, assuming preindustrial GHG forcings for the baseline may be reasonable in these problems.

, then adding this difference onto some observation set. As a result, higher-order moments (of the marginal and joint PDFs) are derived from the observations.

In this work, we adopt the observation-based approach of the delta method
(modifying observations based on changes suggested by model output) but
extend the method to account for possible changes in variability and temporal
correlations. While recent work has extended the delta-method approach to
accommodate some aspects of changing variability
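For reference, the classical delta method that we extend can be sketched as follows (hypothetical names; monthly or harmonic deltas are common in practice, but an annual-mean delta shows the idea):

```python
import numpy as np

def delta_method(obs, model_base, model_future):
    """Delta method: add the modeled present-to-future mean change to the
    observations; all higher-order structure is inherited from the obs."""
    delta = np.mean(model_future) - np.mean(model_base)
    return obs + delta
```

Because the output is just a shifted copy of the observations, its variance and temporal correlation are unchanged, which is precisely the limitation addressed here.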

A delta- or change-factor approach that involves modifying covariance
structures poses substantial challenges. The approach requires modifying a
vector of random variables with a given joint dependence structure to produce
a new vector of random variables with a different dependence structure. To
achieve this goal, it helps to think about modifying quantities that are
independent (or close to independent) under both present and future climates.
In this regard, spectral-based approaches provide a natural framework. We
propose an approach that modifies the discrete Fourier transform (DFT) of
observations based on an estimated ratio of spectral densities of model
output. Under a large class of stationary processes, the DFT is a
transformation to approximate independence
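The core frequency-domain operation can be sketched in a few lines (a minimal Python illustration assuming a deseasonalized series and a known spectral ratio on the `rfft` frequencies; hypothetical names):

```python
import numpy as np

def modify_spectrum(obs, ratio):
    """Rescale the DFT of a (deseasonalized) observation series by the
    square root of the future-to-present spectral density ratio at each
    Fourier frequency, then invert back to the time domain.
    `ratio` has length len(obs)//2 + 1, matching np.fft.rfft."""
    dft = np.fft.rfft(obs)
    return np.fft.irfft(dft * np.sqrt(ratio), n=len(obs))

# sanity checks: a unit ratio returns the observations unchanged, and a
# constant ratio of 2 scales the whole series by sqrt(2)
rng = np.random.default_rng(0)
obs = rng.standard_normal(256)
unchanged = modify_spectrum(obs, np.ones(129))
scaled = modify_spectrum(obs, 2.0 * np.ones(129))
```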

One caveat is that the procedure is designed to transform model simulations of an assumed equilibrium climate into another equilibrium climate while, on foreseeable human timescales, the climate will remain in a transient state. This approach does not directly address the important problem of simulating transient climate behavior in the covariance structure.
However, it is likely that the method would remain an improvement over the
delta method even in predicting future transient climate states, with certain
extensions related to nonstationary time series. We do not explore the issue
in this paper, but point out a potential approach in
Sect.

In the remainder of the paper, Sect.

Our method produces data-driven simulations of future climate that combine observed climate with model predictions of changes in climate means, variability, and temporal correlation. To do this, we must account for changes in the variability of model output across all temporal scales.

In the sections below, we first demonstrate the principle of our approach for
an idealized situation: we assume an infinite length observational time
series with known changes in the spectral process. We then develop the method
for the more practical setting in which

the time series of both observations and model output are finite

we do not know the explicit form of the spectral process

we do not know the explicit form of changes to the spectral process

climate exhibits a strong seasonal cycle in both first and second-order moments.

We demonstrate here that given an infinite length Gaussian time series
representing present-day climate with a known spectral process and known future
changes in the spectral process, we can modify the continuous Fourier
transform separately at each frequency to produce output that has the correct
joint distribution for the future process. Let

We are interested in modifying

The spectral densities

When working with real time series of climate observations and model output,
the spectral densities in the past and future,

Carrying out the simulation on real data then requires the following steps,
starting with

Preprocess the observations and model output to produce

Estimate the ratio of spectral densities of

Reverse preprocessing to produce simulations

In the following subsections we describe in detail these primary steps: estimating the spectral ratio and modifying the discrete Fourier transform; removing the seasonal cycle; and modulating the deseasonalized time series.
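These steps can be sketched end to end in Python (a toy version with no seasonal cycle, using crude periodogram smoothing in place of our penalized estimator; all names are hypothetical and the three series are assumed to have equal length):

```python
import numpy as np

def simulate_future(obs, model_base, model_future):
    """Toy version of the primary steps: preprocess (here, just remove
    means), estimate the spectral ratio from smoothed periodograms,
    modify the DFT of the observations, and reverse the preprocessing
    with the modeled mean change added back."""
    obs0 = obs - obs.mean()
    base0 = model_base - model_base.mean()
    fut0 = model_future - model_future.mean()

    def smoothed_pgram(x):
        # periodogram smoothed with a short moving average
        p = np.abs(np.fft.rfft(x)) ** 2 / len(x)
        return np.convolve(p, np.ones(11) / 11.0, mode="same")

    ratio = smoothed_pgram(fut0) / smoothed_pgram(base0)
    sim0 = np.fft.irfft(np.fft.rfft(obs0) * np.sqrt(ratio), n=len(obs))
    return sim0 + obs.mean() + (model_future.mean() - model_base.mean())
```

When the base and scenario runs are identical, the estimated ratio is one and the simulation reduces to the observations themselves.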

Let

In the previous section

We propose a penalized likelihood approach for estimation of

Let

Let

We further linearize the log likelihood and carry out estimation using an
iterative, weighted least squares approach

Although penalties could be placed on the individual spectral densities
themselves, for our analysis we only need an estimate of the ratio; therefore, we
place the penalty on the log ratio of the spectral densities

The objective function that we minimize can then be written as

We do not develop an automated method for choosing the smoothing parameter
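As a simplified stand-in for this penalized scheme (not the likelihood-based estimator itself), one can penalize the roughness of the fitted log ratio directly through squared second differences, which admits a closed-form solution:

```python
import numpy as np

def smooth_log_ratio(pgram_future, pgram_base, lam=100.0):
    """Penalized least-squares estimate of the log spectral-density
    ratio: minimize ||r - y||^2 + lam * ||D r||^2, where y is the raw
    log periodogram ratio and D is the second-difference operator.
    The minimizer solves (I + lam * D'D) r = y."""
    y = np.log(pgram_future) - np.log(pgram_base)
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second differences
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

Larger `lam` yields a smoother estimated ratio; as in our approach, the choice of smoothing parameter is left to the analyst.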

The previous section assumed that the process of interest was stationary with a constant mean. Daily mean temperature, however, has a strong seasonal component. Before estimating the spectral ratio and modifying the DFT of the observations, we therefore remove the seasonal cycle from both the observations and the AOGCM output. The empirical mean of the observations and the present–future difference in the AOGCM output are added back at the end of the algorithm. This part of our approach is analogous to the delta method and in fact reduces to the delta method when the present and future spectral densities are equal.
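A standard way to remove the seasonal cycle is least-squares regression on annual harmonics; a minimal sketch (with hypothetical parameter choices) is:

```python
import numpy as np

def remove_seasonal_cycle(x, period=365.0, n_harmonics=2):
    """Fit a mean plus annual harmonics by least squares and subtract;
    returns the deseasonalized residuals and the fitted cycle."""
    t = np.arange(len(x))
    cols = [np.ones(len(x))]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2.0 * np.pi * k * t / period))
        cols.append(np.sin(2.0 * np.pi * k * t / period))
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, x, rcond=None)
    cycle = X @ beta
    return x - cycle, cycle
```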

(Top) Log (base 10) of averaged periodograms for the Illinois location, by season, for the reanalysis (left), model baseline period (middle), and model scenario period (right). Note strongest variability in winter, weakest in summer. (Middle) Identical to top but now for the demodulated time series. Seasonal differences in variability are effectively removed, suggesting we can treat these time series as stationary in time. (Bottom) Modulation constants used for the reanalysis (left), model baseline period (middle) and model scenario period (right), showing smallest values in summer, as expected. See Figs. S1 and S2 for similar plots for other locations used as examples; results are similar.

As mentioned previously, the delta method takes changes in first-order characteristics (e.g., overall mean and seasonal cycle) from the model output. It typically involves adding the difference in the overall mean (usually including the seasonal cycle) between the base and scenario time slices of the AOGCM to the observations. Let

In our example, our AOGCMs have been run far past the point of CO

The trend in the observations may be affected by volcanic eruptions (e.g., Pinatubo), which produce a temporary reduction in global mean temperature. Not removing these trends implicitly assumes that intermittent volcanic eruptions will continue in the future. Another potential concern is that the aerosol forcings affecting observed climate will not continue to evolve indefinitely as they have in the past.

. Thus far we have assumed that the deseasonalized observations and model
output,

Following Priestley, we consider
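In this spirit, a crude demodulation (a stand-in for the modulation estimate, with hypothetical smoothing choices) divides the deseasonalized series by a smoothed calendar-day standard deviation:

```python
import numpy as np

def demodulate(x, day_of_year, period=365):
    """Treat x as a uniformly modulated process x_t = sigma(t) * z_t:
    estimate sigma by the standard deviation on each calendar day,
    smooth it circularly, and divide it out."""
    sigma = np.array([np.std(x[day_of_year == d]) for d in range(period)])
    # circular moving-average smoothing of the daily standard deviations
    padded = np.concatenate([sigma[-15:], sigma, sigma[:15]])
    sigma = np.convolve(padded, np.ones(31) / 31.0, mode="valid")
    return x / sigma[day_of_year], sigma
```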

In this section, we continue to illustrate our methodology using NCEP Climate
Forecast System Reanalysis observations

Although we are only modifying the temporal covariance structure, we can
produce maps that show how variability is changing at different locations and
different frequencies (e.g., see Fig.

Variability clearly changes differently at different locations
(Figs.

(Top) Estimated log (base 10) ratio of spectral densities for model scenario vs. baseline at low and high frequencies. The low-frequency results are the estimated log ratios at 1168 days and the high-frequency results at 2 days; however, due to the large degree of smoothing, it is best to think of them as representing low- and high-frequency behavior. Both long- (left column) and short-term (right column) variability decreases in nearly all locations. Remainder of rows: estimated log-spectral densities at these frequencies for reanalysis (second row), model baseline period (third row) and model scenario period (bottom row), using the demodulated and deseasonalized time series. The pattern of enhanced variability over land vs. ocean and high vs. low latitudes is as expected.

(Top) Logarithm (base 10) of the estimated spectral density ratios in the Southern Ocean (blue), Illinois (green), and Gulf of Guinea (red). (Bottom) Logarithm of the estimated spectral densities of the reanalysis data (solid line), base period (dashed line), and scenario period (dashed and dotted line). The spectral density estimation was performed on the deseasonalized and demodulated time series.

In contrast to those locations, however, are pixels such as the Southern
Ocean, where the change in variability remains relatively constant across all
frequencies (with approximately a 60 % decrease in overall variability).
For locations that exhibit this type of change in the spectral ratio, a
simple scaling of the observations may be acceptable. However,
Fig.

All three locations used as examples show evidence of a mean shift (see
Fig.

Time series of daily mean temperature (

An important aspect of our approach is that it does not significantly alter
the non-Gaussian aspects (e.g., tail behavior) of observed climate. In fact,
in our method, when the model does not show changes in mean or covariance,
the simulations are simply the observations and, thus, non-Gaussian features
of the data are retained exactly. When the observations are not significantly
changed, the non-Gaussian features of the data are largely retained. For
instance, JJA in the Gulf of Guinea shows a marginal distribution that is
positively skewed. In this case, the simulation shows a slight decrease in
marginal variability, as well as an increase in mean temperatures, but we
maintain the positive skewness of the observations (see Fig.

Preserving the shapes of distribution of the observations (e.g., skewness,
kurtosis) would be a problem if the actual shapes of distributions changed
from present to future. For locations in or near bodies of water, changes in
temperature means can alter climate variability distributions because those
distributions are sensitive to the freezing point of water. As long as water
is liquid, temperature variability is constrained because air temperatures
cannot drop significantly below freezing. This property is evident in the
marginal distributions for both observations and model output in
Fig.

Marginal densities (by season) of daily temperature (

Detailed characterization of the ways in which climate is changing (mean shifts, tail behavior, spatial and temporal covariance structures) is still a relatively open area of inquiry. One of the best ways to study how future climate might change is to investigate how the statistical properties of AOGCM output change from the present to possible future scenarios. We have provided a method for quantifying how temporal covariance changes in these AOGCMs at different temporal scales. Our results show that variability changes differently at different locations. At a given location, the changes in variability may also differ (in magnitude, direction, or both) across frequencies. We used this estimate of changing variability to produce simulations that modify the temporal covariance structure of the observations. In this way, we extend the delta method to account for changes in both the mean and the covariance structure.

Our method for producing simulations relies on modifying the discrete Fourier transform of the observations, and as such the length of the simulations is currently limited by the length of the available observational record. It may, however, be possible to generate longer simulations by recycling observations. Another possibility is to modify the observations by phase scrambling
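A phase-scrambling surrogate can be sketched as follows (a generic surrogate-data construction for a real-valued series, not tied to our exact procedure):

```python
import numpy as np

def phase_scramble(x, rng):
    """Return a surrogate series with the same periodogram as x but
    randomized Fourier phases (same second-order structure, but
    Gaussianized higher-order behavior)."""
    dft = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=len(dft))
    new = np.abs(dft) * np.exp(1j * phases)
    new[0] = dft[0]  # preserve the mean exactly
    if len(x) % 2 == 0:
        new[-1] = dft[-1]  # the Nyquist coefficient must stay real
    return np.fft.irfft(new, n=len(x))

rng = np.random.default_rng(3)
x = rng.standard_normal(512)
surrogate = phase_scramble(x, rng)
```

Note that phase scrambling destroys non-Gaussian features of the series, so it trades away one of the advantages of working directly with the observations.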

We point out that we have not accounted for changes in spatial or spatiotemporal covariance structures. Accounting for changes in spatial covariance is complicated by the nonstationarity present in the observations (due to geography, land–ocean contrasts, etc.) and remains a subject of future research. We do note, however, that because the simulations are built from the observations, they inherit the spatial structure of the present climate.

Next, while we have provided a method for producing simulations of daily mean
temperature, most impacts assessments also rely on simulations of daily
precipitation. The methodology presented here is not suited to handling the highly non-Gaussian, nonlinear nature of daily precipitation directly; however, a
popular approach in the statistics literature is to model precipitation using
a latent Gaussian process

Perhaps most importantly, the methodology presented here is based on the
assumption of stationarity in the model output and the data. While we did
incorporate the concept of a uniformly modulated process to deal with
seasonal nonstationarity, this methodology is still limited to simulating
equilibrium climate. Because for the foreseeable future our climate will be
in a transient state, we must consider ways of extending this methodology to
be able to simulate transient climate. We point out that there is the
potential for this methodology to be extended by considering an evolutionary
spectral approach

Lastly, our methodology is limited to generating simulations for those GHG
scenarios for which an AOGCM has been run. We cannot produce simulations for
arbitrary GHG emissions scenarios without first running the AOGCM to obtain
the necessary output. However, we note the potential to consider
“emulating” higher-order characteristics in the general circulation models
in order to generate simulations for arbitrary emissions scenarios. For
transient climates, it may be possible to relate changes in the covariance
structure to the past trajectory of CO

We believe that our approach provides a general framework for high-resolution
future climate simulation. Two critical features of our approach are that (1) it
is observation-driven, using the model output to suggest how to modify the
existing observations, and (2) it considers changes in both mean and
covariance structures; and this modification of covariance structure, by
taking place in the frequency domain, involves modifying quantities that are
at least approximately independent. We anticipate many opportunities to
extend our framework to generate more realistic simulations for use in
impacts assessment, and suggest that any extensions should seek to maintain
these features when feasible. Society's obvious need for better impacts
assessment begins with a better understanding of

All model output shown here is from the Community Climate System Model
Version 3 (CCSM3), a fully coupled general circulation model developed at the
National Center for Atmospheric Research (NCAR), with the full representation
of the atmosphere, land, sea ice, and ocean (the modules CAM3, CLM3, CSIM5,
and POP 1.4.3, respectively)

We use two data sets for observational comparisons. In
Fig.

The authors thank Michael Glotter and Shanshan Sun for helpful comments regarding the manuscript. This research was funded by the NSF Statistical Methods in the Atmospheric Sciences Network (awards no. 1106862, 1106974, and 1107046) and the NSF Decision Making Under Uncertainty Program (award no. 0951576). Edited by: C. Paciorek Reviewed by: two anonymous referees