The study of climate change and its impacts depends on generating projections of future temperature and other climate variables. For detailed studies, these projections usually require some combination of numerical simulation and observations, given that simulations of even the current climate do not perfectly reproduce local conditions. We present a methodology for generating future climate projections that takes advantage of the emergence of climate model ensembles, whose large amounts of data allow for detailed modeling of the probability distribution of temperature or other climate variables. The procedure gives us estimated changes in model distributions that are then applied to observations to yield projections that preserve the spatiotemporal dependence in the observations. We use quantile regression to estimate a discrete set of quantiles of daily temperature as a function of seasonality and long-term change, with smooth spline functions of season, long-term trends, and their interactions used as basis functions for the quantile regression. A particular innovation is that more extreme quantiles are modeled as exceedances above less extreme quantiles in a nested fashion, so that the complexity of the model for exceedances decreases the further out into the tail of the distribution one goes. We apply this method to two large ensembles of model runs using the same forcing scenario, both based on versions of the Community Earth System Model (CESM), run at different resolutions. The approach generates observation-based future simulations with no processing or modeling of the observed climate needed other than a simple linear rescaling. The resulting quantile maps illuminate substantial differences between the climate model ensembles, including differences in warming in the Pacific Northwest that are particularly large in the lower quantiles during winter. 
We show how the availability of two ensembles allows the efficacy of the method to be tested with a “perfect model” approach, in which we estimate transformations using the lower-resolution ensemble and then apply the estimated transformations to single runs from the high-resolution ensemble. Finally, we describe and implement a simple method for adjusting a transformation estimated from a large ensemble of one climate model using only a single run of a second, but hopefully more realistic, climate model.
While numerical simulations of the Earth's climate capture many observed
features of daily temperature, they cannot capture all details of observed
temperature distributions with perfect fidelity. Studies of climate impacts
that require detailed simulation of future climate on fine temporal and
spatial scales therefore often make use of a combination of model output and
observations. There are two fundamental approaches to combining model output
and observations to simulate future climate:
Any method for simulating future climate under forcing levels for which no
observational data are available (e.g., elevated
The emergence of large initial-condition ensembles opens up many possibilities for the use of more complex statistical analyses to describe the climate in greater detail. The availability of large ensembles has important, but different, impacts on bias correction and delta change approaches. We can think of both methods as having two steps: estimation of a transformation and then application of that estimated transformation to some set of data to obtain a simulated future. The estimation step for bias correction does not benefit much from the large ensemble because it requires estimation of the distribution of the observations as a function of season and year from the limited observational record, which will remain the weak link in the estimated transformation no matter the size of the ensemble. The simulation step does benefit from the large ensemble, because now we can trivially simulate many futures by applying the estimated bias correction transformation to each member of the ensemble. The situation is reversed for delta change. Now, estimation of the transformation is greatly enhanced by the large ensemble because the transformation is from the model present to the model future and we have both for every ensemble member. On the other hand, for the simulation step, we have to apply the transformation to the observational record, so we have no direct way of producing multiple simulated futures. Which method will be more useful may depend on the available information, the quantity being simulated, and the use to which the simulation will be put. When considering far tails of climate variable distributions, there is some value in being able to make many simulations. Nevertheless, we maintain that it will often be more important to have fewer projections with a well-estimated transformation than more projections with a poorly estimated transformation.
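The two-step structure of both approaches (estimate a transformation, then apply it) can be illustrated with a toy sketch using empirical quantile maps on synthetic data. The `quantile_map` helper, the distributions, and all numbers below are invented stand-ins, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantile_map(x, src, dst, n_q=99):
    """Map each value in x through the empirical quantile
    transformation that sends the src distribution to dst."""
    qs = np.linspace(0.01, 0.99, n_q)
    return np.interp(x, np.quantile(src, qs), np.quantile(dst, qs))

# Synthetic stand-ins: observations, and a 40-member model
# ensemble for the present and the future periods.
obs = rng.normal(10.0, 3.0, 1000)
model_present = rng.normal(12.0, 4.0, (40, 1000))  # biased model
model_future = model_present + 2.5                 # warmed future

# Delta change: estimate the present->future map from the
# ensemble (pooling members), then apply it to the observations,
# yielding a single simulated future.
delta_proj = quantile_map(obs, model_present.ravel(),
                          model_future.ravel())

# Bias correction: estimate the model->obs map from the present
# period, then apply it to each future ensemble member, yielding
# one simulated future per member.
bc_proj = np.stack([quantile_map(m, model_present.ravel(), obs)
                    for m in model_future])

print(delta_proj.shape, bc_proj.shape)
```

Note how the large ensemble enters differently in each case: delta change pools all members when estimating the map but produces one future, while bias correction produces 40 futures but its map still rests on the short observational record.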
The availability of more than one large ensemble of the same forcing scenario
under different models or versions of a model is particularly valuable for
multiple reasons. Obviously, but importantly, we can look at how estimated
transformations differ across ensembles as a measure of potential model bias.
We can also provide a more direct evaluation of the delta change method by
using one of the ensembles to estimate the transformations and the other as
repeated realizations of a set of pseudo-observations. Specifically, we can
apply the transformation estimated from one model to the present-day output of
a second model (i.e., pseudo-observations) and then directly compare the
statistical characteristics of the transformed pseudo-observations to the
second model's actual future output. Here, we have two ensembles for the same forcing scenario,
both using the Community Earth System Model (CESM,
One common approach for producing projections of future climate is to use
transformations of quantiles
Using this new approach to quantile estimation, we then develop an algorithm
for transforming available observations to produce a future simulation.
Importantly, given the limited observational record, the algorithm requires
only minimal processing of the observational data, just a simple linear
rescaling to make the ranges of the observational data comparable to that of
the model output. Although the primary algorithm we show is a delta method
approach, we describe how the quantile estimation method could be modified to
estimate the transformations needed in a bias correction approach. We
describe this modification in Sect.
In the present study, the two ensembles are of comparable sizes, but in
practice lower-resolution models will often have larger ensembles.
This situation is reminiscent of the problem of multi-fidelity surrogate
modeling in the computer modeling literature
In the following sections, we describe our data, methods, and results. Section
We consider surface temperature from two climate model ensembles, both
generated with the CESM
For the observational record, we use the ERA reanalysis data product
Comparison of marginal distributions of daily mean temperatures in LENS and SFK15 ensembles and ERA-Interim reanalysis. We show data for the whole year during the time period 1979–2016.
In this work, we show results for eight cities in the United States with
widely varying climate characteristics. To give some idea of how results for
the two models compare to each other and to the reanalysis data, Fig.
We present a method that projects observed climate variables into the future by building a quantile map from the ensemble data and avoids complex modeling of the observations due to their relative sparsity compared with the ensemble data. The only processing of the observational data is a linear normalization to put it on approximately the same scale as the model output. The climate model output is also first normalized, but is then nonlinearly transformed by using estimated quantiles from a statistical model of the quantiles as functions of two temporal dimensions, seasonal and long-term, including interactions between them to allow for slowly changing seasonal patterns. Finally, the initial normalizations are inverted to obtain the simulated future temperatures. In the end, every observed temperature is subject to a monotonic transformation into a future temperature so that the transformed temperature time series will largely retain the temporal and spatial dependencies of the observational data.
We first describe the quantile mapping procedure through which we project a set of observations into the future. For now, assume that all relevant quantile functions and normalizing functions are known; see the next subsection for details on how we estimate these functions.
The basic goal of the transformation is to shift each quantile of the present
time series to its corresponding future quantile value. Let
We now describe this procedure mathematically. Write
This procedure implicitly makes the assumption that the observational
quantile map from the present to the future is the same as the climate
model's map from the present to the future, which can be broken down into the
following three components:
A pictorial representation of the steps of the transformation for the LENS
pixel including Chicago is given in Fig.
Our procedure for carrying out this transformation is novel in several ways. First, for any given day of the year, it naturally allows a separate transformation from any one year during which both model output and observations are available to any year in which model output is available. Second, it separately accounts for linear transformations in both the observations and model output, with the consequence that if the results from the model output only show linear transformations in temperature between the present and the future, then the transformation applied to the observations will also be linear. Third, it transforms even highly extreme observations in a natural way without making strong assumptions about the tails of temperature distributions.
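As a minimal illustration of why a monotonic transformation preserves dependence structure, the sketch below composes a hypothetical linear normalization with a hypothetical monotone present-to-future map and checks that the ranks of the observed series survive. The functional forms are invented stand-ins for the estimated normalizing and quantile functions:

```python
import numpy as np

rng = np.random.default_rng(1)

# One year of synthetic daily observations.
obs = rng.gamma(5.0, 2.0, 500)
loc, scale = np.median(obs), obs.std()

def normalize(x):            # observations -> standardized scale
    return (x - loc) / scale

def denormalize(z):          # inverse of the normalization
    return z * scale + loc

def model_map(z):            # invented monotone present->future
    return z + 0.4 + 0.1 * np.tanh(z)  # map on the normalized scale

projected = denormalize(model_map(normalize(obs)))

# The composition of strictly increasing maps is strictly
# increasing, so the ranks of the series (and hence the temporal
# ordering of warm vs. cold days) are preserved exactly.
assert np.array_equal(np.argsort(obs), np.argsort(projected))
```

Because each day's value is pushed through the same monotone map for its position in the distribution, autocorrelation in rank and co-occurrence of extremes across nearby days carry over to the projected series.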
Although outside the scope of this paper, our technique can be modified to
swap the assumptions in Eq. (
A large body of work has been dedicated to fitting quantiles
The methods for estimating quantiles used here expand on the approach used in
We model the CDF of temperature by assuming each quantile is smoothly varying
in both temporal dimensions, day of the year
Write
We are particularly interested in estimating fairly extreme quantiles. The
further out in the tail of the distribution one goes, the less effective
information there is in the data about the corresponding quantile. Thus, we
choose to use simpler models for exceedances beyond two well-estimated
quantiles, here the
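A stripped-down version of this nesting idea can be sketched as follows: fit a smooth median by quantile (pinball) regression on a seasonal basis, then model a more extreme quantile as the median plus an exceedance with fewer degrees of freedom (here a single constant). The harmonic basis stands in for the paper's spline basis, and the data are synthetic:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Ten years of synthetic daily temperatures: seasonal cycle + noise.
day = np.tile(np.arange(365), 10)
y = 10 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 3, day.size)

# Harmonic basis in day of year (a stand-in for spline functions
# of season used in the paper).
X = np.column_stack([np.ones(day.size),
                     np.sin(2 * np.pi * day / 365),
                     np.cos(2 * np.pi * day / 365)])

def pinball(beta, X, y, tau):
    """Quantile-regression (check) loss at quantile level tau."""
    r = y - X @ beta
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

# Fit the median with the full seasonal basis.
b50 = minimize(pinball, np.zeros(3), args=(X, y, 0.5),
               method="Powell").x
med = X @ b50

# Nested step: model the 0.95 quantile as the median plus a
# nonnegative exceedance with a *simpler* (constant) model,
# i.e., fewer degrees of freedom further into the tail.
exc = y - med
c95 = np.quantile(exc, 0.95)          # constant exceedance model
q95 = med + max(c95, 0.0)

frac_below = np.mean(y <= q95)
print(round(frac_below, 3))           # should be close to 0.95
```

The payoff of the nesting is stability: the 0.95 level borrows the full seasonal shape from the well-estimated median and only needs a single additional parameter of its own.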
When projecting the observations, we consider the time-window 1979–2016, which we project 80 years into the future (years 2059–2096). As previously mentioned, we do not require a detailed modeling of the CDF of the observations, merely an estimate of the location and scale as a function of the two temporal dimensions. Because of the reduced time span of these observations we use fewer degrees of freedom in the splines, namely 10 seasonal and 1 temporal (i.e., linear) without any interaction terms. This normalization step is only included to put the observations and model output on the same scale, so it is not essential that the model used to normalize the observations captures finer details of the median and scale functions.
As already noted, we include zonal volcanic forcings in the statistical model
as they are included in the CESM climate model for the historical period and
cause substantial short-term changes in temperature. We also tried including
the
The availability of a large ensemble makes it possible to quantify the
uncertainty of our estimates based on treating the
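The text later reports a jackknife estimate over the 40 simulations; a delete-one jackknife for an ensemble-based statistic can be sketched as below, with synthetic data and an illustrative statistic (the change in a high quantile) standing in for the paper's quantities:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 40-member ensemble: present-period daily values and a
# warmed future with a small member-specific offset.
n = 40
present = rng.normal(0.0, 3.0, (n, 3650))
future = present + 2.0 + rng.normal(0, 0.3, (n, 1))

def stat(p, f):
    """Change in the 0.99 quantile, pooling the given members."""
    return np.quantile(f, 0.99) - np.quantile(p, 0.99)

full = stat(present, future)

# Delete-one jackknife, treating members as exchangeable.
loo = np.array([stat(np.delete(present, i, axis=0),
                     np.delete(future, i, axis=0))
                for i in range(n)])
jack_var = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
jack_sd = np.sqrt(jack_var)
print(round(full, 2), round(jack_sd, 3))
```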
There is, of course, no direct way to evaluate the accuracy of a future
climate projection using currently available data. However, if we are willing
to assume that the discrepancies in the statistical characteristics of some
quantity of interest between two models are comparable to the discrepancies
between the output from some model and reality, then we can at least get a
qualitative idea as to how well the procedure is working. Perhaps more
realistically, we may be able to obtain something like a lower bound on the
actual errors in the statistical characteristics of our climate projections.
Specifically, by treating the output from a second model as
pseudo-observations, we can apply our transformation method and then directly
compare the statistical characteristics of these transformed
pseudo-observations to the available future model output from this second
model. This basic idea has been used frequently for evaluating statistical
downscaling procedures, where it is called the perfect model approach
There are a number of ways one could summarize the results of such an
analysis. One method is to fix a particular quantile,
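A minimal sketch of this perfect-model check, with synthetic stand-ins for the two models: estimate the quantile map from model A, apply it to model B's present output (the pseudo-observations), and compare a fixed quantile of the result with model B's actual future output.

```python
import numpy as np

rng = np.random.default_rng(4)

def quantile_map(x, src, dst, qs=np.linspace(0.01, 0.99, 99)):
    return np.interp(x, np.quantile(src, qs), np.quantile(dst, qs))

# Model A (large ensemble): used to estimate the transformation.
a_present = rng.normal(10.0, 3.0, 20000)
a_future = rng.normal(13.0, 3.5, 20000)

# Model B: its present output acts as pseudo-observations and its
# actual future output is the evaluation target.
b_present = rng.normal(10.5, 3.2, 5000)
b_future = rng.normal(13.6, 3.6, 5000)

# Apply model A's present->future map to model B's present data.
projected = quantile_map(b_present, a_present, a_future)

# Evaluate at a fixed quantile level, e.g. tau = 0.9: the residual
# discrepancy reflects the difference between the two models' maps.
tau = 0.9
err = np.quantile(projected, tau) - np.quantile(b_future, tau)
print(round(err, 2))
```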
We report here results for eight cities in the United States representing a
broad range of geographical settings, where we build a detailed statistical
model using both the LENS and the SFK15 ensembles. These statistical models
define quantile maps that we use to project present observational data into
the future. We see in Fig.
Example of quantile estimates fit to ensemble data, using daily mean temperatures in the years 1920 and 2099, at a grid cell containing Seattle, from LENS (red) and SFK15 (black). Solid lines are median quantile fits as a function of day of the year. Dots are ensemble output; for clarity we show only 3 % of all data, sampled randomly.
Quantile maps of temperature changes between 1989 and 2099 for
multiple cities extracted from the SFK15 ensemble on the first and third rows
and the LENS ensemble on the second and fourth rows. Temperature change is shown as
a function of seasonality (days of the year) and non-exceedance probability
of the temperature estimate. Note that
Differences between the two quantile maps (LENS minus SFK15) near Seattle as a function of years elapsed since 1920 and day of the year. At year zero, the quantile maps both produce zero change since the initial and final time are the same. As the years increase, so does the discrepancy between the two projections. Each panel shows these temperature discrepancies for a given non-exceedance probability of the temperature estimate. A positive change indicates a larger value for LENS than for the SFK15 ensemble. Notice the complex patterns both in long-term and seasonal effects, justifying the need for a model that fits all seasons simultaneously.
For both ensembles, we estimate a quantile map as a function of day of the
year and long-term time for each of the eight locations using the methods in
Sect.
We next examine the evolution in time of quantiles implied by these quantile
surfaces. Consider, for each day of the year, the difference between a given
quantile in 1920 and the same quantile for every year
Another way to assess how well the transformation approach might work in
practice is the double transformation given by Fig. (
To show explicitly how our procedure benefits from the large ensemble, we
split up the 40-run LENS ensemble into 8 ensembles of size 5 and estimate the
transformation for each of them. Variations between these mini-ensemble
temperature projections are on the order of 1
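The splitting experiment can be sketched schematically; the statistic and numbers below are invented stand-ins, but the mechanics (8 disjoint mini-ensembles of 5 runs, each compared with the full-ensemble estimate) follow the text:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical 40-member ensemble of daily warming values at one
# location; the "transformation" is summarized here by a single
# pooled quantile for illustration.
n_members, n_days = 40, 3650
runs = 2.5 + rng.normal(0, 0.8, (n_members, n_days))

def estimate(members):
    """Stand-in for the estimated transformation: the pooled 0.9
    quantile of the warming signal."""
    return np.quantile(members, 0.9)

full = estimate(runs)

# Split the 40 runs into 8 disjoint mini-ensembles of 5 runs and
# re-estimate from each.
mini = np.array([estimate(runs[i * 5:(i + 1) * 5])
                 for i in range(8)])
spread = mini.max() - mini.min()
print(round(full, 2), round(spread, 3))
```

The spread across mini-ensemble estimates, relative to the full-ensemble estimate, quantifies how much precision the additional members buy.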
A double quantile map calculated for eight locations, first applying the map estimated from the LENS ensemble from year 1990 to 2090 followed by the inverse map estimated from the SFK15 ensemble from 2090 to 1990. Each map shows only the difference between the mapping and the original temperature. Positive values indicate that the LENS ensemble will transform to warmer temperatures in the future than SFK15 does. If the LENS and the SFK15 quantile maps were the same, these changes would be identically zero.
Standard deviation of the temperature change from 1990 to 2090,
using a jackknife estimate on the 40 simulations (see Fig.
For 8 sets of 5 mutually exclusive simulations, panels show
differences between quantile map estimates based on 5 simulations and the
quantile map estimate using the full LENS 40-simulation ensemble
(see Fig.
We next employ the perfect model approach described in Sect.
An example comparison between the two models is shown in Fig.
Comparisons between actual and projected LENS data for winter (DJF)
using a model fit from the LENS data (red) and using a model fit from the
SFK15 ensemble (green) near Seattle. The projected data are obtained by using
40 simulations of the time period 1979–2016 (viz. “Initial” shown in blue)
projected 80 years into the future, 2059–2096. Both projection estimates are
compared against the actual LENS model output during the same period,
2059–2096 (viz. “Target” shown in black). Corresponding qq-plots are
juxtaposed using the same horizontal axis. The dashed vertical line
represents the
Similar to Fig.
In the present setting we have roughly equal numbers of runs for both
resolutions of CESM. In many instances, one might have many more runs at
lower resolution because of the lower cost of those runs. This situation
bears some resemblance to what is known as multi-fidelity surrogate modeling
in the computer modeling literature; see for example
To fit the SFK15 quantile map followed by a LENS re-calibration, we first fit
the detailed statistical model (or quantile map) used in previous sections to
the 50 SFK15 simulations; this model comprises an intercept, 14 seasonal terms,
6 long-term temporal terms, 18 interaction terms, and a volcanic forcing term. We
then reduce these 40 predictors to 4 components: component 1, an intercept;
component 2, temporal (including the volcanic term); component 3, seasonal;
and component 4, interactions. Specifically, using the notation from
Sect.
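The recalibration idea, collapsing the full set of basis terms into a few component predictors and re-estimating only those coefficients from a single run, can be sketched with a toy design matrix (far fewer terms than the paper's 40, and with invented data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy design: intercept, 2 seasonal, 2 temporal, and 4 interaction
# columns (stand-ins for the paper's full basis).
n = 5000
doy = rng.uniform(0, 2 * np.pi, n)
t = rng.uniform(0, 1, n)
seas = np.column_stack([np.sin(doy), np.cos(doy)])
temp = np.column_stack([t, t ** 2])
inter = np.column_stack([seas[:, i] * temp[:, j]
                         for i in range(2) for j in range(2)])
X = np.column_stack([np.ones(n), seas, temp, inter])
groups = [slice(0, 1), slice(1, 3), slice(3, 5), slice(5, 9)]

# "Low-resolution" fit: full coefficient vector estimated from a
# large, cheap ensemble (here simply drawn at random).
beta_lo = rng.uniform(0.5, 1.5, X.shape[1])

# A single "high-resolution" run whose structure resembles the
# low-resolution model but with rescaled component magnitudes.
true_scale = np.array([1.0, 1.2, 0.8, 1.1])
y_hi = sum(s * (X[:, g] @ beta_lo[g])
           for s, g in zip(true_scale, groups))
y_hi += rng.normal(0, 0.1, n)

# Recalibration: collapse the 9 predictors into 4 component
# curves and re-estimate only 4 coefficients from the one run.
C = np.column_stack([X[:, g] @ beta_lo[g] for g in groups])
scale_hat, *_ = np.linalg.lstsq(C, y_hi, rcond=None)
print(np.round(scale_hat, 2))
```

Because only four coefficients are re-estimated, a single expensive run suffices to adjust the component magnitudes while the detailed shapes are borrowed from the large cheap ensemble.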
Using this quantile map, we can then project observations into the future
using the procedure described in Sect.
Comparisons between reanalysis data from the winter months (DJF) in
Seattle in the period 1979–2016 projected with three different methods.
(1) Using the 40-member LENS ensemble to construct a quantile map (shown in
red), (2) using the 50 simulations from SFK15 (in green), and (3) using
method (2) and then re-calibrating the quantile map using one LENS simulation
(in blue). For all three models, every year is projected into year 2099 and
compared using qq-plots juxtaposed with histograms. We project into a single
year here instead of projecting each year by a constant time shift to show
the flexibility of the method. The fact that the re-calibrated method (in
blue) is close to the LENS model shows that even one high-resolution
simulation can get close to the results based on all 40 LENS simulations when
aided by a set of low-resolution simulations (SFK15). The dashed vertical
line in each plot represents the
Taking advantage of two large ensembles of climate model output, we project
observations into the future based on their seasonality and long-term time
evolution from 1920 to 2099. By projecting the whole observational record
simultaneously, temporal and spatial dependencies in the observational record
that a climate model may not accurately capture are maintained in the future
projections. Our method, building on the quantile regression approach in
The ensemble of simulations makes it possible to estimate the uncertainty of the quantile projections based on the jackknife variance, which shows that the simulation standard deviation is approximately one order of magnitude lower than the larger model discrepancies as seen when comparing projections between the LENS and the SFK15 ensemble. Thus, efforts to realistically assess the full uncertainty in estimated transformations based on large ensembles should focus on the impact of model biases on these estimates.
To assess how well our method works, we use two ensembles, one treated as
pseudo-observations and the other to build our quantile map. Since the LENS
data have marginal distributions that resemble the ERA-Interim data more
closely (Fig.
The nested statistical model and the large ensemble used in building the
quantile map enable stable estimates of quantiles in the extreme tails,
e.g., the
Following
Having estimated how every year in the observational record would be
projected into any future year spanned by the climate model, we can produce
projections spanning different numbers of years than the original
observational record. For example, if we take the observational record to be
40 years long, which is the approximate length of the post-satellite
reanalysis era, we are not restricted to simulating a single 40-year period;
we can instead produce, for example, four projections of the same future
decade by separately projecting each of four 10-year periods in the
observational record into that decade. Similarly, we can project every
observed year into one future year, e.g., 2099 (see
Fig.
If it is expensive to run a high-quality or high-resolution model, an alternative we propose is to first build a quantile map using a less expensive model, then reduce the dimensionality of the prediction space and recalibrate the quantile map with the limited data from the expensive model by re-estimating the regression parameters in this lower-dimensional space. While the results show a clear improvement in the projections after the recalibration, there is clearly scope for further development using this approach.
In this work, we have only modeled marginal distributions of a single
quantity, average daily temperature. The general methodology can be applied
to other quantities, although some care would be needed in extending the
approach to daily precipitation, which will generally have a large fraction
of zeroes
Instructions to reproduce and extend the results from
this paper are available at
LENS and SFK15 quantile maps at a grid cell containing Seattle as a function of years elapsed since 1920 and day of the year. At year zero, the quantile maps both produce zero change since the initial and final time are the same. Each column of plots shows the changes in quantiles as a function of the non-exceedance probability given for that column.
Same as Fig.
As in Fig.
The authors declare that they have no conflict of interest.
This work was supported in part by STATMOS, the Research Network for Statistical Methods for Atmospheric and Oceanic Sciences (NSF-DMS awards 1106862, 1106974 and 1107046), and by RDCEP, the University of Chicago Center for Robust Decision-making in Climate and Energy Policy (NSF-SES awards 0951576 and 146364). We acknowledge the University of Chicago Research Computing Center, whose resources were used in the completion of this work. Ryan L. Sriver acknowledges support from the Department of Energy Program on Coupled Human and Earth Systems (PCHES) under DOE Cooperative Agreement no. DE-SC0016162.
This paper was edited by Seung-Ki Min and reviewed by two anonymous referees.