|Accelerated Climate Prediction Initiative (ACPI)
at the Scripps Institution of Oceanography
The National Research Council has recently acknowledged what has been known for the last five years: "the United States lags behind other countries in its ability to model long-term climate change" (NRC, 1999). It further finds "it is inappropriate for the United States to rely heavily upon foreign centers to provide high-end modeling capabilities".
The practical ramifications of these findings for the future energy policy of the United States are incalculable. They mean that when negotiating possible future energy policy on the international stage, such as in Kyoto, Buenos Aires, etc., the United States is at a distinct disadvantage. It simply does not have its own scientific base upon which to decide major issues. Rather, it must rely on simulations of future world conditions produced by other nations. Given the current state of the art, there is considerable room for interpretation in such simulations. In short, given the immense economic implications, the United States has no choice but to develop its own technical basis for making future energy policy decisions.
A bold solution to the unacceptable situation described above has been put forth in DOE's Accelerated Climate Prediction Initiative (ACPI). This large, long-term program would acquire and put in place the computational resources required for the future world simulations, while simultaneously developing the scientific and other infrastructure needed to carry through a full assessment of potential anthropogenic threats. This information would then allow the United States to develop a rational, quantitative basis for energy policy decisions.
The basic idea is to begin immediately a pilot ACPI effort, using existing tools and previously demonstrated capabilities. This pilot will jump-start the full program, which will take several years to spin up. The pilot is intended to allow a partial scoping of the full ACPI to determine just what needs to be done and where the main difficulties are apt to occur. In order to do this, the pilot will take an 'end-to-end' approach, starting from the current state of the global climate system and ending up with quantitative statements about predicted anthropogenic impacts at the local and regional level. We expect these impact statements, while limited in number as befits a startup program, to be practically useful. Perhaps most important, the successful completion of the pilot will be a demonstration, or 'existence proof', that the ACPI is both feasible and likely endowed with major payoffs. The pilot will last 2-3 years and leave behind what will be the core of the full-scale ACPI.
In brief, the pilot ACPI is composed of three main elements, shown schematically in Figure 1. Element 1 uses existing ocean observations and inverse techniques to quantitatively establish the physical state of the global ocean in recent years. This information will serve as initial conditions for a coupled global climate model which, when forced with anthropogenic pollution scenarios for the next century, will provide predictions of expected climate change globally. Recent work has shown that ensembles of such runs are required to adequately identify the anthropogenic signals. The third program element 'downscales' these large-scale predictions of the global model to ensembles of regional-scale predictions, which are then used to estimate impacts on hydrology, agriculture, energy utilization, etc. We have already demonstrated the ability to do nearly all of the steps described above. In short, there is essentially no doubt that we have the tools and technology to carry the pilot ACPI through successfully, if the necessary resources (especially computer capacity) are made available.
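The role of ensembles can be illustrated with a toy calculation (the trend and noise values below are purely illustrative, not PCM output): internal climate variability in an N-member ensemble mean shrinks roughly as 1/sqrt(N), so the estimate of a forced trend tightens steadily as members are added.

```python
import random

random.seed(1)

YEARS = 100
TREND = 0.02   # assumed forced warming, deg C per year (illustrative)
NOISE = 0.5    # std dev of internal variability, deg C (illustrative)

def one_run():
    """One synthetic climate realization: forced trend plus internal noise."""
    return [TREND * t + random.gauss(0.0, NOISE) for t in range(YEARS)]

def ensemble_mean(n_runs):
    """Average n_runs independent realizations, year by year."""
    runs = [one_run() for _ in range(n_runs)]
    return [sum(vals) / n_runs for vals in zip(*runs)]

def fitted_trend(series):
    """Least-squares slope of a time series, deg C per year."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    x_mean = sum(series) / n
    num = sum((t - t_mean) * (x - x_mean) for t, x in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

for n in (1, 5, 20):
    print(n, round(fitted_trend(ensemble_mean(n)), 4))
```

With 20 members the recovered slope sits very close to the imposed 0.02 deg C per year; single runs scatter more widely about it.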
Main Program Elements and their Status
The current status of the three program elements is described below. All are currently operational.
Element 1: Ocean Assimilation
It has recently become possible to assimilate observed ocean data on a global scale, thereby producing gridded fields of such quantities as temperature, salinity and velocity for the world oceans. An example of the ocean current velocity at depths of 87 m and 2540 m produced by this new technology is shown in Figure 2 (courtesy D. Stammer). The capability to produce such a product has been made possible by improvements in ocean models, adjoint and filtering techniques, computer speed and major observational campaigns (e.g. WOCE, TOPEX/Poseidon, etc.). The current status of the assimilation effort may be found at http://puddle.mit.edu/~detlef/OSE/global.html.
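The statistical idea underlying the filtering techniques mentioned above can be sketched in a single scalar update (this is only the textbook principle, not the MIT adjoint/filtering system itself): a model first guess and an observation are blended with weights set by their respective error variances.

```python
def assimilate(background, obs, var_b, var_o):
    """Scalar optimal-interpolation update: weight the model first guess
    and the observation by their error variances. Illustrative only; the
    real global assimilation works on full model state vectors."""
    gain = var_b / (var_b + var_o)      # Kalman-type gain in [0, 1]
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b        # analysis error variance < var_b
    return analysis, var_a

# Example: model first guess 10.0 C, observation 11.0 C,
# background error variance 1.0, observation error variance 0.25.
analysis, var_a = assimilate(10.0, 11.0, 1.0, 0.25)
print(analysis, var_a)   # gain = 0.8 -> analysis 10.8, variance 0.2
```

The accurate observation pulls the analysis strongly toward itself, and the analysis error is smaller than either input error, which is why assimilated fields beat either the model or the raw data alone.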
Support has been requested by Stammer et al. from the National Ocean Partnership Program (NOPP) to use historical ocean observations to produce a four-dimensional history of the global ocean's physical properties for the period 1992-97. It is this information that we plan to use to initialize a global climate model for anthropogenic signal prediction. We note that this type of initialization of an anthropogenically forced climate model has never before been done. Prior anthropogenic simulations have used a variety of ways to spin up the ocean component of the coupled model. But the memory time of the deep ocean is of order 100-500 years or more. Since no one knows what the fluxes of heat, momentum and fresh water to the oceans were for more than a few decades into the past, all such spin-up attempts will clearly be in error. This has given rise to the so-called 'cold start' problem, a major uncertainty in any anthropogenic prediction. Initializing with observed data should greatly lessen this problem.
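Why the long memory time matters can be seen with a one-box relaxation model (a deliberately crude sketch, with the relaxation time standing in for the deep ocean's memory): an error introduced during spin-up decays only with e-folding time tau, so for tau of order 100-500 years most of the error is still present at the end of a century-scale prediction.

```python
import math

def remaining_error(initial_error, tau_years, t_years):
    """Fraction of an initialization error left after t_years in a
    one-box ocean that relaxes with e-folding time tau_years."""
    return initial_error * math.exp(-t_years / tau_years)

# After a 100-year prediction, a spin-up error in a deep ocean with
# memory of 100 or 500 years has barely decayed:
for tau in (100.0, 500.0):
    print(tau, round(remaining_error(1.0, tau, 100.0), 3))
```

At tau = 500 years more than 80 percent of the initial error survives the whole run, which is the essence of the cold start problem and the motivation for initializing from observed ocean data instead.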
Element 2: Modeling Anthropogenic Climate Change
The ocean data described above will be used to initialize the DOE supported Parallel Climate Model (PCM), a coupled ocean/atmosphere/ice model being developed under the leadership of Warren Washington with mostly DOE support and collaborative NSF support. As the name indicates, the model is designed to work on highly parallel computers.
Details of the model may be found at http://www.cgd.ucar.edu/pcm. Suffice it to say, the PCM has been developed with substantial distributed involvement of both government laboratories and universities. The NCAR/UCAR, NSF-supported Climate System Model (CSM) group is working closely with the PCM community to develop new-generation climate models. Recently, it was agreed that the two efforts would share several of the same model components, although the models will likely differ in resolution and in their implementation on a variety of supercomputers. Cooperation with the CSM will provide added academic involvement in this project.
The atmospheric component of the PCM is the CCM3 atmospheric general circulation model (AGCM) developed at NCAR, used here at T42 resolution (about 280 by 280 km grid). The CCM3 includes a land surface model that accounts for soil moisture, vegetation types, etc., as well as a river transport model. The University of Texas (Austin) parallel river transport model is a new component of the PCM.
The ocean component of the PCM is the Parallel Ocean Program (POP) model developed jointly by LANL, the Naval Postgraduate School (NPS) and NCAR. We note that although it is not presently included in POP, a biogeochemical component could be added to make this system serve the needs of the national carbon cycle plan. As part of the effort to develop a newer generation of the PCM, the Gent-McWilliams parameterization of subgrid-scale eddy contributions to diffusion along isopycnal layers has been added to the ocean component, along with the K-profile parameterization of mixing in the upper ocean.
The final major model component of the PCM is a sea ice model developed at NPS. The sea ice model in the first version of the PCM is that used by Y. Zhang at NPS. In the second version of the PCM, the sea ice model uses the University of Washington multi-thickness thermodynamic model of C. Bitz and the elastic-viscous-plastic ice dynamics model of E. Hunke of LANL. These new features will make the sea ice treatment more realistic. The sea ice treatment is especially important for reproducing a realistic feedback mechanism between sea ice processes and climate change forcings.
The full PCM has been configured to run with a serial flux coupler designed to perform the calculation of the components of the climate system as efficiently as possible on a variety of parallel high-capacity supercomputers. Specifically, the PCM can run on IBM SP, SGI Origin, Cray T3E, Compaq and Linux Beowulf systems. The present version makes use of the Message Passing Interface (MPI) for passing information between processors and nodes. PCM version 2 will be capable of a hybrid approach in which MPI is used between nodes and OpenMP threading is used within a node. This will allow efficient use on virtually all the cluster parallel computer systems that are being developed as supercomputers.
The PCM is currently fully operational. Analyses of ongoing simulations show realistic-amplitude El Niño, La Niña, North Atlantic Oscillation, and Antarctic Circumpolar Wave properties. An example of the PCM's output, showing model-estimated changes in surface temperature that would arise from a doubling of atmospheric carbon dioxide, can be seen in Figure 3.
The PCM resolution is somewhat marginal at present for regional climate change studies, and thus, to better facilitate use of the PCM model data for such studies, a doubling of the number of gridpoints in each horizontal direction will be made in both the ocean and atmosphere models as the project progresses. These higher-resolution component models already exist and are being tested. The present atmospheric model is the T42 CCM, with approximately 2.8 degree horizontal resolution. A T85 version of the CCM is being tested, and some simulations at twice the present resolution will be carried out over the two-year period. When the higher-resolution atmospheric model is coupled into the PCM, it will provide better output for the downscaling efforts described below. However, for the first stages of the project, we will use the existing T42-resolution output from the ensemble of simulations. A list of CCM variables that can be used for analysis and boundary conditions for a regional climate model can be found at http://www.cgd.ucar.edu/pcm/PCMDI.
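The connection between a spectral truncation such as T42 or T85 and grid spacing follows from the usual quadratic Gaussian-grid convention (at least 3T+1 longitudes; the rounding to a power of two below is our simplifying assumption, which happens to reproduce the standard CCM grids):

```python
def gaussian_grid(truncation):
    """Approximate quadratic Gaussian grid for triangular truncation T:
    at least 3T+1 longitudes, rounded up here to a power of two
    (reproduces 128 x 64 for T42 and 256 x 128 for T85)."""
    nlon = 1
    while nlon < 3 * truncation + 1:
        nlon *= 2
    nlat = nlon // 2
    return nlon, nlat, 360.0 / nlon   # degrees of longitude per cell

for trunc in (42, 85):
    nlon, nlat, spacing = gaussian_grid(trunc)
    print(f"T{trunc}: {nlon} x {nlat} grid, {spacing:.2f} deg")
```

Doubling the truncation from T42 to T85 thus halves the grid spacing from about 2.8 to about 1.4 degrees, i.e. twice the resolution in each horizontal direction.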
Element 3: Regional Downscaling/Impacts
There is a surprisingly broad capability to take the large-scale predictions made by global climate models and use them to force regional-scale atmospheric models. The resulting high-resolution estimates of physical climate variables (temperature, precipitation, snow, etc.) are subsequently used in hydrological, agricultural and other models to provide estimates of the economic impacts of climate change.
The pilot effort will concentrate on anthropogenic impacts on the hydrological system of the western United States. At least three different downscaling techniques, and subsequent applications to water issues, will be used. Two of the methods involve dynamical models driven by the atmospheric component of the PCM, the CCM3. Both of these dynamical models are currently operational, using input from the CCM3 to make regional climate forecasts at seasonal-to-interannual time scales and of anthropogenic impacts: just what we want to do in the pilot. These models need to be run in ensemble mode for our purposes. An example of the latter type of regional forecast from a single run is shown in Figure 4, where the change in snowfall at the time of CO2 doubling is shown for the Northwest (courtesy R. Leung). More information on the downscaling aspects of the pilot may be found at http://www.pnl.gov/atmos_sciences/as_clim4.html and http://meteora.ucsd.edu/~meyer/caphome.html.
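The flavor of the simplest class of downscaling can be conveyed in a few lines (a generic statistical sketch, not the dynamical regional models used in the pilot; all numbers are hypothetical): interpolate the coarse GCM temperature field to a fine-grid point, then correct for the elevation the GCM cannot resolve using a standard atmospheric lapse rate.

```python
def bilinear(f00, f10, f01, f11, x, y):
    """Bilinear interpolation within one coarse GCM cell; x, y in [0, 1]."""
    return (f00 * (1 - x) * (1 - y) + f10 * x * (1 - y)
            + f01 * (1 - x) * y + f11 * x * y)

def downscale_temp(corner_temps, corner_elevs, x, y, fine_elev,
                   lapse=-6.5e-3):
    """Interpolate coarse temperatures (deg C) to a fine-grid point and
    apply a lapse-rate correction (deg C per m) for the difference
    between the true fine-scale elevation and the GCM-resolved one."""
    t_interp = bilinear(*corner_temps, x, y)
    z_interp = bilinear(*corner_elevs, x, y)
    return t_interp + lapse * (fine_elev - z_interp)

# Corner temperatures (C) and elevations (m) of one ~300 km GCM cell,
# downscaled to a mountain point at 2000 m in the cell center.
t = downscale_temp((10.0, 12.0, 8.0, 9.0), (500.0, 400.0, 800.0, 700.0),
                   0.5, 0.5, 2000.0)
print(round(t, 2))   # -> 0.65: well below freezing margin of raw GCM cell
```

The smooth GCM field alone would put this point near 10 C; accounting for the unresolved 1400 m of extra elevation brings it close to freezing, which is exactly the kind of fine-scale distinction (rain versus snow) that hydrological impact studies depend on.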
Virtually all of the tools required to complete the ACPI pilot are in use today. The challenge of the pilot will be to combine these various existing skills into a seamless end-to-end effort. The success of this effort will be largely dependent upon the computer resources dedicated to it.
Figure 1: General
Figure 2: Velocity vectors
Figure 3: PCM Surface Temperature
Figure 4: Pacific Northwest Impact
Last modified: 16 April 1999