Temperature and Precipitation Projections for the Mobile Bay Region

Section 2: Data and Methods

Assessing the potential impacts of climate change on a given location and sector is a challenging task. It begins with integrating multiple datasets and model outputs that cover a range of spatial and temporal scales. Inputs and methods must be translated across disciplinary boundaries. Reasonable ways must be found to quantify the uncertainty inherent to future projections before synthesizing the results into a coherent picture of potential impacts.

Although challenging, it is important to assess climate impacts because the information generated can be valuable to long-term planning or policies. For example, projected changes in heating or cooling degree-days can be incorporated into new building codes or energy policy. Shifts in the timing and availability of streamflow can be used to redistribute water allocations or as incentive for conservation programs. Projected changes in growing season and pest ranges can inform crop genetics research and agricultural practices.

The primary challenge in climate impact analyses is the reliability of future information. A common axiom warns that the only aspect of the future that can be predicted with any certainty is the fact that it is impossible to do so. However, although it is not possible to predict the future, it is possible to project it. Projections can describe what would be likely to occur under a set of consistent and clearly articulated assumptions. For climate change impacts, these assumptions should encompass a broad variety of the ways in which energy, population, development and technology might change in the future.

By quantifying a range, future projections can be expressed in terms of risk. Risk is a concept that is already incorporated into decision-making at all levels: by individuals who routinely rely on a sense of risk to guide their purchases, from vitamins to motor vehicles; by businesses that use risk analyses as input to strategic planning; and by governments for whom risk assessment is an integral part of both domestic and foreign policy.

This section first describes a general approach to developing the projections needed to quantify the risks of climate impacts for any regional or sectoral analysis. This general framework is then applied to quantify potential effects on transportation infrastructure in the greater Mobile Bay region. The remainder of this section describes the specific datasets and methods selected for, and used in, this analysis. These include observational data, global climate models, future scenarios, and downscaling methods. Where appropriate, the extent to which these datasets and tools can be applied to other regions or sectors is also discussed.

A General Framework for Developing and Applying Climate Projections to Regional Impact Analyses

Each regional impact analysis has its own unique requirements. However, there are some common datasets, tools, and methods that can be combined in ways that are relevant across a broad range of applications (Figure 2.1).

Most analyses begin by identifying the parameters of the study. In this first step there are at least three key questions to consider that will determine the type of data, tools, and methods used in the analysis:

  1. What is the geographic extent and region of interest? For example: a watershed, a city, a state, or an eco-region.
  2. What is the system and the concern associated with it? For example: the long-term prospects for water supply from a certain reservoir; the cost of operating or maintaining city buildings; the public health response to deteriorating air quality; or the ecological impacts of an invasive species moving into the region.
  3. What existing information or tools can be used to quantify the potential impacts of climate change on this system? For example: a model already used for water management planning by the district; historical data that can be used to correlate building maintenance costs with temperature variability; a dynamical air quality model coupled with epidemiological response functions to certain pollutant levels; or a statistical climate envelope modeling package that, when combined with historical climate data, can calculate the implied limits on invasive species' ranges.
Figure 2.1. A general framework for designing and conducting regional or sectoral climate impact analyses, illustrating the three steps discussed in the text.

The second step is to assemble the scientific data and models needed to develop future projections. There are several different approaches to doing so, depending on the answer to question 3 above (what information can be used to quantify the sensitivity of the system to climate?).

When the answer to question 3 is "not much," a study might need to begin by quantifying the response of the system to a fixed perturbation in temperature or other climate conditions that is simply added to observed conditions (for example, the impact of a steady-state 2 or 4°C warming, or a sustained precipitation decrease of 25% or 50%). This type of approach has been used for many early-stage climate impact analyses to determine the sensitivity of the system to plausible levels of change. The magnitude of the perturbation need only lie within the reasonable range of expected changes, and can simply be read from existing plots such as those available in USGCRP (2009) or generated using the Climate Wizard tool (http://www.climatewizard.org).
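
As a concrete illustration, the sketch below applies fixed perturbations of this kind to a daily station record. It is a minimal sketch, not code from any study cited here: the file name and the column names ("tmax" in °C, "precip" in mm) are hypothetical, and the pandas library is assumed.

    import pandas as pd

    # Hypothetical daily station record with "tmax" (°C) and "precip" (mm).
    obs = pd.read_csv("station_daily.csv", parse_dates=["date"])

    # Steady-state warming offsets added directly to observed temperatures.
    for warming in (2.0, 4.0):
        obs[f"tmax_plus_{warming:g}C"] = obs["tmax"] + warming

    # Sustained precipitation decreases applied as fixed scalings.
    for cut in (0.25, 0.50):
        obs[f"precip_minus_{int(cut * 100)}pct"] = obs["precip"] * (1.0 - cut)

The perturbed series can then be run through whatever impact model or indicator calculation the study already uses.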

When the answer to question 3 includes long-term data and/or modeling tools that capture the effect of climate variability and/or change on the system of interest, however, then a different approach is possible. If the sensitivity of the system can already be determined, then it is possible to use time-dependent simulations that track the simultaneous evolution of changes in multiple aspects of climate.

In contrast to perturbation analyses, assessing the actual likelihood or risk of climate impacts requires global climate model simulations driven by future scenarios of human emissions. The spatial accuracy of global climate models is limited to the regional scale, so downscaling is commonly used to transform large-scale changes in climate into more localized conditions similar to those measured at long-term weather stations. The ability of the climate and downscaling models to simulate local conditions can be evaluated by validating historical simulations on a set of independent observed data.

Climate science can generate projections of basic climate variables, such as daily maximum temperature or 24h cumulative precipitation. For climate impact analyses, however, these projections must be translated into impact-relevant information. In some cases, the translation step consists of calculating projected changes in secondary climate indicators already used in planning or known to be relevant. Indicators are generally specific to each study: for example, the number of weeks per year with rainfall exceeding a local sewer overflow threshold; the average number of days per year with sufficient snow to require plowing and salting the roads; or the risk of summer temperatures exceeding a level that would affect crop yields.
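
For example, an indicator such as the number of days per year above a fixed temperature threshold reduces to a few lines of code. The sketch below is illustrative only; the input file and its "date" and "tmax" (°F) columns are hypothetical.

    import pandas as pd

    daily = pd.read_csv("downscaled_tmax.csv", parse_dates=["date"])

    # Number of days per year with maximum temperature above 95°F.
    days_over_95 = (daily["tmax"] > 95.0).groupby(daily["date"].dt.year).sum()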

In other cases, climate projections are used as input to an additional set of climate response, or impact, models. For example, daily temperature, humidity, rainfall and solar radiation can replace meteorological observations in ecological models to simulate the effects of climate change on a range of systems, from forest nutrient cycles to crop yields. Dynamic vegetation models driven by climate projections can simulate changes in wildfire frequency and area burned. Three-dimensional output fields from climate models can be used to drive air quality models, estimating the impact of warmer temperatures and changes in atmospheric circulation patterns on air quality and pollution. Empirical epidemiological models can combine climate projections with observed response functions to quantify the potential impacts of extreme heat on respiratory disease and even mortality.

The third step in Figure 2.1 is to synthesize the results of the analysis and quantify the uncertainty to assess future risk. As discussed in more detail in Section 5, near-term uncertainty is dominated by natural variability (Hawkins & Sutton, 2009, 2011). Natural variability is largely chaotic, a function of heat exchanges between the ocean and the atmosphere, volcanic eruptions, variations in solar energy, and other processes unrelated to human activities. Uncertainty due to natural variability can be addressed by using input from multiple climate model simulations, each with slightly different initial conditions. Future projections of climate should always be summarized over climatological time scales (typically 20 to 30 years) to determine the risk of a given event or magnitude of change over that period as a whole.
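
The sketch below illustrates this kind of summarization: annual values from several initial-condition ensemble members are pooled over a 30-year window, so the resulting statistics describe the period as a whole rather than any single simulated year. The data are synthetic stand-ins, not output from the models used in this study.

    import numpy as np
    import pandas as pd

    years = np.arange(2010, 2100)
    rng = np.random.default_rng(4)

    # One column per hypothetical initial-condition ensemble member.
    ens = pd.DataFrame(
        {f"member_{i}": 20.0 + 0.03 * (years - 2010) + rng.normal(0.0, 0.5, years.size)
         for i in range(4)},
        index=years,
    )

    # Pool all years and all members within a climatological window.
    window = ens.loc[2040:2069].to_numpy()
    climatological_mean, spread = window.mean(), window.std()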

By the mid-21st century, limits in scientific understanding of the climate system become the largest contributor to the range in projections. This model uncertainty can be addressed by using projections from multiple climate models. Each climate model represents the various components and processes that make up the Earth's climate system in slightly different ways; these differences affect how sensitive the simulated climate is to human emissions, as well as the regional distribution of climate change.

By the end of the century, the contribution of human emissions increases in importance. For temperature, the future scenario becomes the dominant factor determining the magnitude of future change (Hawkins & Sutton, 2009, 2011). Scenario uncertainty can be addressed by using a broad range of future scenarios, including both increases and decreases in future emissions of heat-trapping gases from human activities. For analyses after mid-century, projections should not be averaged across scenarios; rather, each scenario should be compared to quantify the role of human choices in determining the magnitude of future impacts.

Two Examples: Assessing Climate Impacts for Chicago and the U.S.

The general framework described above is similar to that used in previous assessments. Two examples illustrate how this process can be used to frame analyses with very different spatial scales, motivations, and purposes. The first example is a local study of climate impacts on the city of Chicago (Wuebbles et al., 2010; Hayhoe et al., 2010a,b,c). The second is a national study of the impacts of climate change on multiple U.S. regions and sectors (USGCRP, 2009).

CHICAGO

Step One. This analysis was initiated by the city of Chicago's Department of Environment and the Chicago Climate Action Plan advisory committee. They decided that the Action Plan should be based on scientific projections of how climate change would affect local conditions in Chicago, projections which would in turn inform a risk analysis of climate impacts on city operations under a higher and lower emissions future. As the Action Plan was specific to Chicago, the geographic extent of the analysis was limited to the area under the jurisdiction of the city itself.

The specific concern was how climate change would affect city operations, as represented by city departments and other organizations such as the Chicago Transit Authority and the Chicago School Board. No models or planning tools were being used to incorporate climate or weather-related variables into city planning, but the climate impacts team (consisting of scientists and risk management experts from the international management consulting firm Oliver Wyman) were given full access to city departments to collect information on how each department might be impacted by climate change.

Step Two. In order to capture the full range of projected future change, the SRES A1fi (higher) and B1 (lower) emission scenarios were used as the basis for future projections. Four global climate models had daily simulations available for these scenarios (CCSM3, GFDL CM2.1, HadCM3, and PCM). Long-term station data were obtained for the three stations within the city of Chicago (O'Hare, University, and Midway), as well as for additional suburban stations, in order to generate a broader base of future projections. Global model simulations were downscaled using a statistical quantile regression approach to provide daily projections of maximum and minimum temperature and precipitation.

Step Three. To quantify the impact of climate change on city departments, the city invited representatives from each department to a meeting where climate scientists presented sample indicators, including projected changes in the number of days per year over a given temperature threshold, the frequency of heavy rainfall events, and projected changes in winter snowfall. These examples were used to demonstrate the type of information that could potentially be provided to the city.

The risk assessment team then followed up on this initial step with individual meetings with each city department to identify actual indicators relevant to that department. These meetings resulted in a list of over 100 secondary indicators that affected operations costs in 14 departments. These included the frequency of freeze/thaw events affecting road conditions and pothole formation; the number of days per year over the specific temperature threshold at which rapid transit rails warp; accumulated heating and cooling degree-days, which are proportional to energy use; and precipitation exceedances that could lead to storm sewer overflow.

Figure 2.2. Projected changes in annual operating and maintenance costs to the city of Chicago under the SRES higher (A1fi) and lower (B1) future scenarios, as simulated by the average of four climate models for the Chicago O'Hare, Midway, and University weather stations. Under the higher scenario, projected near-term (2010-2039) changes are roughly -$5 million for maintenance and energy demand and +$1 million for labor, capital investment, and local government revenue; costs rise by about $30 million over the medium term (2040-2069) and by close to $60 million over the long term (2070-2099). Under the lower scenario, near-term changes are roughly -$10 million for maintenance and +$2 million for energy demand and for labor, capital investment, and local government revenue, rising by about $5 million over the medium term and close to $25 million over the long term. (Hayhoe et al., 2010b)

This list of indicators was then provided to the scientific team, which calculated projected changes in these indicators as simulated by each global model under each future scenario, and handed the results back to the risk assessment team. The team then calculated the costs to each city department for three future time periods (2010-2039, 2040-2069, and 2070-2099) under higher and lower emissions, based on the risk assessment model it had built using the data collected from each department. The resulting costs (shown in Fig. 2.2) were then provided to the city as input to decisions on adaptation and mitigation.

UNITED STATES

Step One. This analysis was initiated by the U.S. Global Change Research Program's mandate to provide a national assessment of climate change impacts for the United States. Previous assessments had divided the U.S. into 8 regions: Northwest, Southwest, Great Plains, Midwest, Northeast, Southeast, Islands, and Alaska. Each of these regions formed the basis for a chapter in the assessment.

The purpose of the assessment was to provide consistent analysis of historical trends, to generate projections of future changes in average and extreme temperature and precipitation, and to synthesize available information on the most important aspects of climate change for each region.

Figure 2.3. This "migrating climate" diagram illustrates what a typical summer in Michigan might feel like in the future under a higher (here, A2) and lower (B1) emissions scenario: under the higher scenario, similar to southern Iowa and Missouri by mid-century and western Oklahoma/north Texas by end of century; under the lower scenario, similar to today's Indiana and northern Kentucky by mid-century and southern Missouri/northern Arkansas by end of century. (USGCRP, 2009)

Step Two. This assessment also sought to ensure that its projections captured a broad range of projected future change, from the SRES A1fi (higher) to the B1 (lower) emission scenario. However, as A1fi simulations were available from only four global climate models, simulations for the A2 (mid-high) scenario from 12 additional global models were also used. As this was a regional assessment covering a very large geographic area, global climate model projections were downscaled to a gridded database of observed temperature and precipitation rather than to individual weather stations. Global model simulations were downscaled using a statistical quantile mapping approach that generated monthly projections of changes in maximum and minimum temperature and precipitation, with daily values then generated by sampling from the historical record.

Step Three. The purpose of the assessment was not to directly quantify climate impacts for each region, but rather to provide a consistent basis of regional climate projections within which the literature on climate impacts could be framed. For that reason, the translation process consisted of ensuring that the same type of projections was used as the basis for each regional chapter. A number of broadly relevant secondary indicators were identified and plotted as maps, including the number of days per year over 90 and 100°F, as well as new versions of the "migrating climate" diagram shown in Fig. 2.3.

STEP ONE: Identifying the parameters of the analysis

Here, the general framework described in Figure 2.1 is applied to climate impacts on transportation infrastructure in the Mobile Bay area. This report focuses primarily on the last two steps of the general framework illustrated in Figure 2.1. Hence, only a brief overview of Step 1 is provided here.

This study focuses on the greater Mobile Bay region (see Figure 2.6 below). As this is a coastal region, the geographic extent of the study was set to include enough long-term weather stations to be representative of the region (here, 5), but not so many as to take in more distant inland areas whose micro-climates differ from those of coastal Alabama.

The concern motivating this analysis of temperature and precipitation projections was the potential for changes in average and/or extreme temperature and precipitation to exceed the engineering design thresholds of transportation infrastructure in this region. For more information on these indicators, please see Gulf Coast Study, Phase 2: Task 2, Climate Variability and Change in Mobile, Alabama.1

Existing temperature and precipitation metrics used in current design standards already reflected the sensitivity of these standards to climate. A list of these metrics, summarized in Table 2.1, was used to define the secondary climate indicators calculated in Step 3. The precise definition used to calculate each metric is given in the Appendix.

TEMPERATURE
Average: Monthly mean; Seasonal mean; Annual mean; Annual 50th and 95th percentile
Extreme: Number of extreme heat days over 95, 100, 105, and 110; Maximum number of consecutive days over 95, 100, 105, and 110; Warmest and coldest consecutive four days, 5th through 95th percentile; Coldest historical day, probability; Hottest historical 7-day period, probability

PRECIPITATION
Average: Monthly mean; Seasonal mean; Annual mean
Extreme: 24h precipitation, 0.2nd to 50th percentile; 48h and 96h precipitation, mean and 0.2nd to 50th percentile; Greatest 3-day precipitation per season

Table 2.1. Secondary climate indicators used in this study, identified on the basis of engineering design standards.

STEP TWO: Developing and evaluating high-resolution climate projections

Scenarios: Past and Future

The Coupled Model Intercomparison Project's "20th Century Climate in Coupled Models" (20c3m) scenario is used to drive global climate model simulations from the late 1800s through 2000 (Meehl et al., 2007). This scenario reproduces climate conditions observed over the past century as closely as possible. It includes observed changes in solar radiation, volcanic eruptions, human emissions of greenhouse gases, and emissions of other gases and particles including aerosols and air pollutants.

Future scenarios are more difficult. They depend on a myriad of factors, including how human societies and economies will develop over the coming decades; what technological advances are expected; which energy sources will be used in the future to generate electricity, power transportation, and serve industry; and how all these choices will affect future emissions from human activities.

To address these questions, in 2000 the Intergovernmental Panel on Climate Change (IPCC) developed a set of future emissions scenarios known as SRES (Special Report on Emissions Scenarios; Nakicenovic et al., 2000). These scenarios encompass a range of plausible futures by estimating the emissions resulting from a range of projections for future population, demographics, technology, and energy use (Fig. 2.4a).

In this analysis, projected climate changes under the SRES higher A1fi or fossil-intensive scenario (red line) are compared to those expected under the mid-high A2 scenario (orange line) and the lower B1 scenario (blue line). Typically, just two scenarios (A1fi and B1) would suffice to encompass an adequate range of future change. However, the A2 mid-high scenario was included as simulations for the A1fi higher scenario were available from only four of the ten global climate models used in this analysis.

Figure 2.4. Projected future greenhouse gas emissions, in units of gigatons of carbon equivalent; non-CO2 gases were converted to CO2-equivalent units using a 100-year global warming potential value. Values are shown for (a) the 2000 SRES emission scenarios and (b) the 2010 Representative Concentration Pathways (RCPs), converted from concentrations to emissions using a carbon cycle model. Both families begin at about 12 gigatons of carbon equivalent per year in 2000; by 2100 the SRES scenarios span roughly 7 to 39 gigatons, and the RCPs roughly 3 to 37 gigatons. The SRES higher (A1fi, red), mid-high (A2, orange), and lower (B1, dark blue) scenarios are used in this analysis. SRES A1fi is similar to RCP 8.5, while SRES B1 is similar to RCP 4.5.

The A1fi higher emissions scenario represents a world with fossil fuel-intensive economic growth and a global population that peaks mid-century and then declines. New and more efficient technologies are introduced toward the end of the century. In this scenario, atmospheric carbon dioxide concentrations reach 940 parts per million (ppm) by 2100, more than triple pre-industrial levels (Nakicenovic et al., 2000).

The A2 mid-high emissions scenario imagines a more individualistic world, where each region develops relatively independently, with slow technological development. Emissions rise rapidly towards the end of the century, with carbon dioxide concentrations reaching 870 ppm by 2100 (Nakicenovic et al., 2000).

The B1 lower-emissions scenario also represents a world with high economic growth and a global population that peaks mid-century and then declines. However, this scenario includes a shift to less fossil fuel-intensive industries and the introduction of clean and resource-efficient technologies. Emissions of greenhouse gases peak around mid-century and then decline. Atmospheric carbon dioxide concentrations reach 550 ppm by 2100, approximately double pre-industrial levels (Nakicenovic et al., 2000).

As diverse as they are, the SRES scenarios still do not cover the entire range of possible futures. Since 2000, CO2 emissions have been increasing at an average rate of 3% per year; if they continue at this rate, emissions will eventually outpace even the highest of the SRES scenarios (Raupach et al., 2007; Myhre et al., 2009). On the other hand, significant reductions in emissions (on the order of 80% by 2050, as already mandated by the state of California) could reduce CO2 levels below the lower B1 emission scenario within a few decades (Meinshausen et al., 2006). Nonetheless, the substantial difference between the SRES higher and lower emissions scenarios used here provides a good illustration of the potential range of changes that could be expected, and of how much these depend on future emissions and human choices. The relative importance of scenario uncertainty, as compared to uncertainty due to natural variability and the uncertainty inherent in modeling the physical climate system, is discussed in Section 5.

As of 2011, the SRES emission scenarios are in the process of being replaced by a new series of scenarios based on atmospheric carbon dioxide equivalent concentrations. These scenarios, known as Representative Concentration Pathways (RCPs), will be used as the basis for climate model simulations in support of the IPCC Fifth Assessment Report, to be published in 2012 (Moss et al., 2010). The new RCP 8.5 scenario is very similar to the SRES A1fi higher scenario, while the new RCP 4.5 scenario is similar to the SRES B1 lower scenario (Figure 2.4b). Future climate impact assessments can therefore rely on scenarios ranging from RCP 8.5 down to RCP 4.5 or RCP 2.6 to cover the range of plausible emission futures.

Global Climate Model Simulations

Figure 2.5. Projected future global temperature change for the SRES emission scenarios (°C); increases range from about 1.8 to 3.6°C relative to about 1990. The range for each individual emission scenario indicates model uncertainty in simulating the response of the Earth system to human emissions of greenhouse gases. Source: IPCC, 2007

Future scenarios are used as input to global climate models (GCMs). GCMs are complex, three-dimensional coupled models that are continually evolving to incorporate the latest scientific understanding of the atmosphere, oceans, and Earth's surface. As output, GCMs produce geographic grid-based projections of temperature, precipitation, and other climate variables at daily and monthly scales. These physical models were originally known as atmosphere-ocean general circulation models (AO-GCMs). However, many of the newest generation of models are now more accurately described as earth system models (ESMs), as they incorporate additional chemistry and biology.

Because of their complexity, GCMs are constantly being enhanced as scientific understanding of climate improves and as computational power increases. Some models are more successful than others at reproducing observed climate and trends over the past century (Randall et al., 2007; see Section 5 for more on evaluating climate simulations). However, all future simulations agree that both global and regional temperatures will increase over the coming century in response to increasing emissions of greenhouse gases from human activities (IPCC, 2007; Fig. 2.5).

Model Name | Origin | Atmospheric Resolution (horizontal; vertical) | Climate Sensitivity (°C) | Reference
BCCR-BCM2.0 | Bjerknes Centre for Climate Research, Norway | 1.9 x 1.9; 31 levels | N/A | Furevik et al., 2003
CCSM3 | National Center for Atmospheric Research, USA | 1.4 x 1.4; 26 levels | 2.7 | Collins et al., 2006
CGCM3 (T47) | Canadian Centre for Climate Modelling and Analysis, Canada | 2.8 x 2.8; 31 levels | 3.4 | Flato, 2005
CGCM3 (T63) | Canadian Centre for Climate Modelling and Analysis, Canada | 1.9 x 1.9; 31 levels | 3.4 | Flato, 2005
CNRM-CM3 | Meteo-France/Centre National de Recherches Meteorologiques | 1.9 x 1.9; 45 levels | N/A | Salas-Melia, 2005
ECHAM5/MPI-OM | Max Planck Institute for Meteorology, Germany | 1.9 x 1.9; 31 levels | 3.4 | Roeckner et al., 2003
GFDL-CM2.0 | National Oceanic and Atmospheric Administration (NOAA)/Geophysical Fluid Dynamics Laboratory (GFDL), USA | 2.0 x 2.5; 24 levels | 2.9 | Delworth et al., 2006
GFDL-CM2.1 | NOAA/GFDL, USA | 2.0 x 2.5; 24 levels | 3.4 | Delworth et al., 2006
PCM | National Center for Atmospheric Research, USA | 2.8 x 2.8; 26 levels | 2.1 | Washington et al., 2000
UKMO-HadCM3 | Hadley Centre/Met Office, UK | 2.5 x 3.75; 19 levels | 3.3 | Gordon et al., 2000

Table 2.2. Description of the ten global climate models used in this analysis, including their origin, horizontal resolution, number of vertical levels, climate sensitivity (defined as the equilibrium temperature change resulting from a doubling of carbon dioxide relative to pre-industrial times), and reference.

Historical GCM simulations are initialized in the late 1800s, externally "forced" by the human emissions, volcanic eruptions, and solar variations represented by the 20c3m scenario, and allowed to develop their own pattern of natural chaotic variability over time. This means that, although the climatological means of historical simulations should correspond to observations at the continental to global scale, no temporal correspondence between model simulations and observations should be expected on a day-to-day or even year-to-year basis. For example, although a strong El Niño event occurred from 1997 to 1998 in the real world, one may not occur in a model simulation in those years; over several decades, however, the average number of simulated El Niño events should be similar to the number observed. Similarly, although the central U.S. suffered the effects of an unusually intense heat wave during the summer of 1995, model simulations for 1995 might show that year as average or even cooler than average. A similarly intense heat wave should, however, be simulated some time during the climatological period centered on 1995.

In this study, ten different global climate models were used. Their origins, horizontal and vertical resolutions, and references are provided in Table 2.2 below. These models were chosen based on several criteria. First, only well-established models were considered: those already extensively described and evaluated in the peer-reviewed scientific literature, and shown to adequately reproduce key features of the atmosphere and ocean system. Second, the models had to cover the greater part of the IPCC range of uncertainty in climate sensitivity (2 to 4.5°C; IPCC, 2007). Climate sensitivity is defined as the temperature change resulting from a doubling of atmospheric carbon dioxide concentrations relative to pre-industrial times, after the atmosphere has had decades to adjust to the change. In other words, climate sensitivity determines the extent to which temperatures will rise under a given increase in atmospheric concentrations of greenhouse gases (Knutti & Hegerl, 2008). The extent to which model uncertainty, including climate sensitivity, affects future projections is discussed further in Section 5. The GCMs used here range from relatively low sensitivity (PCM, 2.1°C) to moderate (GFDL CM2.1, 3.4°C; see Table 2.2).

Third and last, the models chosen had to have continuous daily time series of temperature and precipitation archived for at least two of the three emission scenarios used here (SRES A1fi, A2, and B1). Few models have climate sensitivity exceeding 4°C and, of those, none had continuous time series available for at least two of the three scenarios. The ten GCMs selected for this analysis are the only models for which continuous daily output from at least two of the A1fi, A2, and/or B1 simulations was available.

Figure 2.6. Locations of the 5 weather stations surrounding Mobile Bay, Alabama, used in this analysis.

As GCMs are global, these simulations could be used to evaluate climate impacts anywhere in the world. For some regions of the world (including the Arctic, but not the continental U.S.) there is some evidence that models better able to reproduce regional climate features may produce different future projections (e.g. Overland et al., 2011). Hence, depending on the geographic region it may or may not be desirable to cull models that have been demonstrated in the literature to fail to reproduce important regional climate characteristics (Knutti, 2010). Such characteristics include large-scale circulation features or feedback processes that can be resolved at the scale of a global model. However, it is not valid to evaluate a global model on its ability to reproduce local features, such as the bias in temperature over a given city or region. Such limitations are to be expected in any GCM, as they are primarily the result of a lack of spatial resolution rather than any inherent shortcoming in the physics of the model.

Historical Observations

Station-level observations of daily maximum and minimum temperature and precipitation were obtained from the Global Historical Climatology Network (GHCN), produced jointly by the U.S. Department of Energy's Carbon Dioxide Information Analysis Center (CDIAC) and the National Oceanic and Atmospheric Administration's National Climatic Data Center (NCDC; Vose et al., 1992). Five stations surrounding the Mobile Bay region were selected for analysis: (A) Bay-Minette, (B) Coden, (C) Fairhope, (D) Mobile Airport, and (E) Robertsdale, AL (Table 2.3, Fig. 2.6).

Although GHCN station data have already undergone standardized quality control (Durre et al., 2008), the station records were additionally filtered using a quality control algorithm to identify and remove any remaining erroneous values in the GHCN database. This additional step applied three tests for errors; any occurrences were removed and replaced with "NA" values. The first test removed the data for any day on which the reported minimum temperature exceeded the reported maximum. The second test removed temperature values outside the range of recorded extremes for North America (-50 to 70°C), and precipitation values below zero or above the highest value recorded in the continental U.S. (915 mm in 24h). The third test removed runs of more than five consecutive days with identical temperature values, or identical non-zero precipitation values, to the first decimal place.
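
The sketch below shows roughly how these three tests could be implemented for a daily station record. It is an illustration of the tests as described above, not the algorithm actually used; the column names are hypothetical, with temperatures in °C and precipitation in mm.

    import numpy as np
    import pandas as pd

    def quality_control(df: pd.DataFrame) -> pd.DataFrame:
        df = df.copy()

        # Test 1: reported minimum temperature exceeds the reported maximum.
        bad = df["tmin"] > df["tmax"]
        df.loc[bad, ["tmin", "tmax"]] = np.nan

        # Test 2: values outside recorded extremes (-50 to 70°C for North
        # America; 0 to 915 mm in 24h for the continental U.S.).
        for col in ("tmin", "tmax"):
            df.loc[(df[col] < -50) | (df[col] > 70), col] = np.nan
        df.loc[(df["precip"] < 0) | (df["precip"] > 915), "precip"] = np.nan

        # Test 3: runs of more than five consecutive identical values, to one
        # decimal place (non-zero values only, in the case of precipitation).
        for col in ("tmin", "tmax", "precip"):
            rounded = df[col].round(1)
            run_id = (rounded != rounded.shift()).cumsum()
            run_len = rounded.groupby(run_id).transform("size")
            repeated = run_len > 5
            if col == "precip":
                repeated &= rounded != 0.0
            df.loc[repeated, col] = np.nan

        return df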

Station Name | Latitude | Longitude | Record Begins: Temp | Record Begins: Precip | GHCN ID | NOAA CO-OP ID
(A) Bay-Minette | 30.8839 | -87.7853 | Mar 1915 | Nov 1913 | USC00010583 | 10583
(B) Coden | 30.3878 | -88.2281 | Oct 1956 | Oct 1956 | USC00011803 | 11803
(C) Fairhope | 30.5467 | -87.8808 | Aug 1917 | Aug 1917 | USC00012813 | 12813
(D) Mobile (airport) | 30.6883 | -88.2456 | Jan 1948 | Jan 1948 | USC00015478 | 15478
(E) Robertsdale | 30.565 | -87.7017 | Feb 1924 | May 1912 | USC00016988 | 16988

Table 2.3. Latitude, longitude, beginning of record, and identification numbers for the 5 weather stations used in this analysis.

The quality control algorithm also flagged (but did not replace) several other types of possible errors: years with very low or very high annual temperature ranges; decades with mean values greater than 1.5 times the standard deviation of the previous decade; and years with a number of wet (precipitation > 0.1") or drizzle (precipitation > 0") days exceeding the maximum recorded North American value (350 days).

Downscaling Methods

Global models cannot accurately capture the fine-scale changes experienced at the regional to local scale. GCM simulations require months of computing time, effectively limiting the typical grid cell sizes of the models to 1 or more degrees per side (Table 2.2). Although the models are precise to this scale, they are actually skillful, or accurate, only to an even coarser scale (Grotch & MacCracken, 1991).

Dynamical and statistical downscaling represent two complementary ways to incorporate higher-resolution information into GCM simulations. Dynamical downscaling, often referred to as regional climate modeling, uses a limited-area, high-resolution model to simulate physical climate processes at the regional scale, with grid cells typically ranging from 10 to 50 km per side. Statistical downscaling models capture historical relationships between large-scale weather features and local climate, and use these to translate future projections down to the scale of the available observations; here, individual weather stations.

Regional climate models are just as computationally intensive as global climate models. They also require GCM outputs at high temporal frequencies that are not generally available. Using currently-available regional climate model simulations, such as those available from the North American Regional Climate Change Assessment Program,2 limits the range of future projections to a few global models and one future scenario that does not capture as broad a range of the uncertainty as that represented here. However, regional models provide a plethora of outputs in addition to temperature and precipitation (including atmospheric circulation, winds, humidity, etc.). Hence, regional model simulations or dynamical downscaling provides essential inputs to sensitivity analyses that require a broad suite of climate variables in order to assess a given system's potential vulnerabilities to changing climate.

Statistical models assume that the relationship between large-scale weather systems and local climate will remain constant over time. This assumption may be valid for smaller amounts of change, but could introduce biases under larger amounts of climate change (Vrac et al., 2007). Statistical models are generally flexible and less computationally demanding, able to use a broad range of GCM inputs to simulate future changes in temperature and precipitation for a continuous period from 1960 to 2100. Hence, statistical downscaling models are best suited for analyses that require a range of future projections reflecting the uncertainty in emission scenarios and climate sensitivity, at the scale of observations that may already be used for planning purposes.

Ideally, climate impact studies should use multiple downscaling methods, as regional climate models can directly simulate the response of regional climate processes to global change, while statistical models can better remove any biases in simulations relative to observations. However, rarely (if ever) are the resources available to take this approach.

Instead, most assessments tend to rely on one or the other type of downscaling, with the choice based on the needs of the assessment (e.g., Hayhoe et al., 2004, 2008; USGCRP 2000, 2009). If the study is more of a sensitivity analysis, where using one or two future simulations is not a limitation, or if it requires many climate variables as input and has a generous budget, then regional climate modeling may be more appropriate. If the study needs to resolve the full range of projected changes under multiple GCMs and scenarios, or is more constrained by practical resources, then statistical downscaling may be more appropriate.

Even within statistical downscaling, selecting an appropriate method for any given study depends on the questions being asked. The variety of techniques ranges from a simple delta approach (which consists of subtracting historical simulated values from future values, and adding the resulting "delta" to historical observations, as used in USGCRP, 2000) to complex clustering and neural network techniques that rival dynamical downscaling in their demand for computational resources and high-frequency GCM output (e.g., Vrac et al., 2007; Kostopoulou et al., 2007).
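
In code, the delta approach amounts to only a few lines. The sketch below uses synthetic monthly series as stand-ins for real observations and GCM output; it is illustrative, not the procedure used in any of the studies cited here.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    hist_idx = pd.date_range("1971-01-01", "2000-12-01", freq="MS")
    fut_idx = pd.date_range("2041-01-01", "2070-12-01", freq="MS")

    obs = pd.Series(rng.normal(18.0, 8.0, hist_idx.size), index=hist_idx)
    gcm_hist = pd.Series(rng.normal(17.0, 8.0, hist_idx.size), index=hist_idx)
    gcm_fut = pd.Series(rng.normal(20.0, 8.0, fut_idx.size), index=fut_idx)

    # "Delta": future minus historical monthly climatology from the GCM.
    delta = (gcm_fut.groupby(gcm_fut.index.month).mean()
             - gcm_hist.groupby(gcm_hist.index.month).mean())

    # Add each calendar month's delta to the observed record.
    projected = obs + delta.reindex(obs.index.month).to_numpy()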

If the timescales of interest are seasonal or annual averages, as often required for ecological analyses, a delta approach can be appropriate. If timescales of weeks to months are required, as for hydrological analyses, then a monthly quantile mapping approach such as the Bias Correction and Spatial Downscaling model (BCSD; Maurer & Hidalgo, 2008, as used in Hayhoe et al., 2004, 2008 and USGCRP, 2009) is adequate. If daily values are needed, then an approach is required that uses daily information from the GCMs (as used in Hayhoe et al., 2004).

For this analysis, an approach that uses daily GCM output was used: the Asynchronous Regional Regression Model (ARRM; Stoner et al., submitted). Asynchronous quantile regression assumes that if two independent time series describe the same variable at approximately the same location, then they must have similar probability distributions. This is generally a valid assumption for variables such as temperature and precipitation that are directly simulated by global models. Even so, the assumption was tested by validating the statistical model on an independent set of observational data that was not used to train it, quantifying biases in simulated historical values across the distribution from the 0.1st to the 99.9th quantile. The results of this validation exercise are discussed in detail in Section 5.

A statistical downscaling model, rather than regional climate model output, was selected for use in this study for three reasons. First, this assessment only required projected changes in air temperature and precipitation, both of which can be generated using a statistical model. No additional variables were required. Second, the study required future projections that cover the full range of plausible emission scenarios and GCM simulations. Regional climate model outputs do not yet cover this broad a range of scenario and model uncertainty (see discussion of uncertainty in Section 5). Third, projections were requested for three future time periods: near-term, mid-century, and end-of-century. Regional climate model simulations do not provide continuous time series but are typically limited to only one or two future time slices.

The high-resolution projections used in USGCRP (2009), the upcoming USGS GeoData Portal (2011), and this report are all based on a similar set of global climate model simulations and future scenarios. However, different combinations of statistical downscaling approaches and observational datasets have been used to generate each dataset. Specifically, the BCSD model used in USGCRP (2009) uses a quantile mapping approach that combines monthly GCM outputs with sampling from the historical daily record to produce daily values (Fig. 2.7b). In contrast, the ARRM model used here and in the upcoming USGS GeoData Portal uses a quantile regression technique that directly downscales daily output from global climate models (Fig. 2.7a). Table 2.4 summarizes the similarities and differences between the datasets.

Dataset | Year | GCM simulations | Observed data | Downscaling method
MOBILE | 2011 | CMIP3 | GHCN stations | ARRM
USGS GeoData Portal | 2011 | CMIP3, CMIP5 (2012) | 1/8th degree grid and GHCN stations | ARRM
USGCRP | 2009 | CMIP3 | 1/8th degree grid | BCSD

Table 2.4 Downscaling methods, observational data, and climate model simulations used to generate three different datasets of high-resolution climate projections.

Asynchronous Regional Regression Model (ARRM)

The ARRM model used in this analysis is based on the highly generalizable quantile regression technique first introduced by Koenker & Bassett (1978) to estimate conditional quantile functions: a model is trained on observational data to describe quantiles of the modeled predictor variable as functions of the observed predictand (Koenker & Hallock, 2001). In other words, for each weather station a quantile regression model is derived that transforms dataset A (e.g., historical model simulations) into a probability distribution that closely resembles dataset B (e.g., historical observations). This model can then be used to transform additional datasets (e.g., future model simulations) into probability distributions that continue to reflect the characteristics of dataset B (observations). The general process is illustrated in Figure 2.7(a).
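
The core idea can be illustrated with plain empirical quantile mapping: ranked values of two unmatched samples of the same variable are paired, and the resulting mapping is reused on new data. This sketch (with synthetic data) is a simplified stand-in for the piecewise quantile regression that ARRM actually fits.

    import numpy as np

    def fit_quantile_map(model_hist, obs, n_q=999):
        # Pair the empirical quantiles of the two unmatched samples.
        q = np.linspace(0.001, 0.999, n_q)
        return np.quantile(model_hist, q), np.quantile(obs, q)

    def apply_quantile_map(values, model_q, obs_q):
        # Map each value through the matched quantile pairs by interpolation.
        return np.interp(values, model_q, obs_q)

    rng = np.random.default_rng(0)
    obs = rng.normal(30.0, 4.0, 5000)         # "dataset B": observations
    model_hist = rng.normal(28.0, 5.0, 5000)  # "dataset A": historical simulation
    model_fut = model_hist + 2.5              # future simulation, shifted warmer

    mq, oq = fit_quantile_map(model_hist, obs)
    downscaled_fut = apply_quantile_map(model_fut, mq, oq)

Because the mapping is built entirely from ranked values, no day-to-day correspondence between model and observations is ever required.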

Quantile regression was applied by O'Brien et al. (2001) to calibrate satellite observations from asynchronous, or non-matched, datasets. Dettinger et al. (2004) were the first to apply the technique to climate projections, examining simulated hydrologic responses to climate variations and change; it was subsequently used to assess heat-related impacts on health (Hayhoe et al., 2004). ARRM expands on these original applications with modifications aimed specifically at improving the model's ability to simulate the shape of the distribution, including its tails: pre-filtering of GCM input using principal components analysis, use of a piecewise rather than simple linear regression to capture the often non-linear relationship between modeled and observed quantiles, and bias correction at the tails of the distribution.

Quantile regression has two key advantages relative to other statistical approaches: first, it does not require temporal correspondence between model simulations and observations; and second, it can incorporate model-simulated changes in the shape of the daily distribution (including shifts in the mean, variance, and skewness) into future projections. In comparison to regional modeling, it is highly efficient: it does not involve retention of the large-scale dynamical flow patterns, and thus does not require significant computing resources.

Transforming GCM output into high-resolution projections using ARRM begins with daily observations of temperature and precipitation, filtered using the quality control process described previously to remove questionable or erroneous values. Next, climate model output fields are re-gridded to the scale of the observations using bilinear interpolation. For training, the method requires a minimum of 20 years of observations and model simulations, with less than 5% missing data over that period, in order to produce robust results. For the five stations in this analysis, 51 years, the entire observational record from 1960 to 2010, were used for training.
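
The re-gridding step can be sketched as follows, interpolating a synthetic coarse field to one of the station locations in Table 2.3. scipy's RegularGridInterpolator with method="linear" performs bilinear interpolation on a two-dimensional grid; the grid spacing and field values here are invented for illustration.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # One day of a synthetic coarse GCM field on a 2-degree grid.
    lats = np.array([28.0, 30.0, 32.0])
    lons = np.array([-90.0, -88.0, -86.0])
    field = np.random.default_rng(0).normal(30.0, 2.0, (3, 3))

    interp = RegularGridInterpolator((lats, lons), field, method="linear")
    fairhope_value = interp([[30.5467, -87.8808]])  # station (C), Table 2.3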

Figure 2.7. Primary steps in (a) the ARRM daily quantile regression model used to generate the high-resolution projections for the greater Mobile Bay region presented in this report, and (b) the BCSD monthly quantile mapping model used to generate the projections in USGCRP (2009). In (a), daily observed data and historical GCM output for temperature and precipitation are first passed through a quality control process; the GCM output is then regressed on the observed daily values for the same location to train the quantile regression model, which is validated against observations and finally applied to future GCM projections to produce downscaled temperature and precipitation. The process in (b) is similar, but trains on monthly data and develops monthly projections before generating daily values. Green boxes indicate where global climate model output enters the downscaling model, blue boxes where observations enter, and brown-shaded boxes the computational and analysis steps.

Model predictor values and observed predictand values are ranked, and a function (here, a piecewise linear regression) is fitted to the datasets by month, including two weeks of overlapping data on either side of each month. This refinement accounts for seasons that shift in the future, which may produce conditions outside the range of a typical historical month, and allows the method to use each data point twice rather than once during training.

The optimal number (up to six) and placement of break points in the piecewise linear regression are identified automatically as the locations of higher curvature on a plot of ranked modeled versus observed values. The slopes of the regression segments are checked to ensure that none is negative; if a negative slope occurs, a break point is removed to force a positive slope.
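
A piecewise linear fit of this kind can be expressed with a "hinge" basis and ordinary least squares, as in the sketch below. ARRM locates the break points automatically from curvature; here they are fixed, and the data are synthetic, purely for illustration.

    import numpy as np

    def hinge_design(x, breaks):
        # Columns: intercept, x, and one hinge term max(0, x - b) per break.
        return np.column_stack([np.ones_like(x), x] +
                               [np.maximum(0.0, x - b) for b in breaks])

    rng = np.random.default_rng(1)
    model_q = np.sort(rng.normal(28.0, 5.0, 2000))  # ranked model values
    obs_q = np.sort(rng.normal(30.0, 4.0, 2000))    # ranked observed values

    breaks = np.quantile(model_q, [0.1, 0.5, 0.9])  # fixed, not automatic
    coef, *_ = np.linalg.lstsq(hinge_design(model_q, breaks), obs_q, rcond=None)
    predicted = hinge_design(model_q, breaks) @ coef

    # Each segment's slope is a cumulative sum of the slope coefficients; a
    # negative entry would flag a break point for removal, as described above.
    segment_slopes = np.cumsum(np.concatenate(([coef[1]], coef[2:])))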

Improved performance in downscaling temperature is obtained by filtering the model fields using an empirical orthogonal function (EOF) analysis, also referred to as principal component analysis, that retains only the components explaining 97% of the original variance. Because the linear regressions at the tails are based on far fewer data points than those in the center of the distribution, the low and high tails undergo further scrutiny: a bias correction is applied at the tails to ensure that values are within 30% of the observations.
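
The EOF (principal component) pre-filtering step can be sketched with scikit-learn's PCA, which, given a fractional n_components, keeps the leading components that together explain that share of the variance. The input below is a synthetic stand-in, not ARRM's operational code.

    import numpy as np
    from sklearn.decomposition import PCA

    # Synthetic stand-in: 10 years of daily values at 9 neighboring grid cells.
    X = np.random.default_rng(2).normal(size=(3650, 9))

    pca = PCA(n_components=0.97)  # components explaining 97% of the variance
    X_filtered = pca.inverse_transform(pca.fit_transform(X))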

For precipitation, the model selects, from three possible predictors, the one best suited to each month: convective, large-scale, or total precipitation. EOF filtering of the model output is not performed for precipitation, since it degrades the results and introduces negative values. The logarithm of the precipitation amount is used instead of the raw amount, as this was found to decrease the residuals of the regression.

The downscaling is performed as follows. For each individual station, GCM output for the "training" period (1960-2010) is regressed on observed daily temperature and precipitation for the same period, quantifying the statistical relationship between each quantile of the variable's daily distribution in the model and in the observations (Fig. 2.8a). The relationship derived from the observations and historical GCM simulations is then applied to future GCM output in order to downscale future temperature and precipitation conditions to the same locations used to derive the original regression relationships (Fig. 2.8b). Finally, the validity of the statistical relationship can be evaluated through comparison with an independent set of observations that were not used to train the statistical model. Validation of the downscaling for the five Mobile Bay weather stations is summarized in Section 5.

Figure 2.8. (a) Observed (black) and historical simulated distributions of daily maximum summer temperature from four GCMs at a weather station in Mobile for the evaluation period 1980-1999. (b) Historical simulated (black) and future projected daily maximum summer temperature under the SRES A1fi higher (red) and B1 lower (orange) emission scenarios. The relationships between modeled and observed values shown in (a) are applied to future GCM simulations to project changes at the local level.

STEP THREE: Translating into impact-relevant information

The final step in the analysis was to translate projected changes in the primary climate variables (maximum and minimum temperature and 24h cumulative precipitation) into the secondary climate indicators listed in Table 2.1. The results of this translation are described in Sections 3 and 4.


1 Available at: http://www.fhwa.dot.gov/hep/climate/gcs_overview.htm.

2 http://www.narccap.ucar.edu/
