
Climate model uncertainty

Multiple Authors

Introduction

These notes have been collated from presentations given by Bruce Hewitson, Mark Tadross and David Stainforth, some of which can be found at the end of this page, and notes taken at the April 2008 ACCCA workshop on climate uncertainty.

Uncertainty

If we knew with certainty what the impacts of climate change would be at a local level, then adaptation would be easier; we could say this will be the impact in this location in this year, and then look at what would need to happen to avoid that impact. Unfortunately, we don’t, and it is likely that we will never be able to make predictions that are detailed enough and certain enough to make a ‘predict and adapt’ approach to adaptation a viable option. This page explains why we can’t take this approach, what can be done with the current climate projections, and underlines why the weADAPT approach is to look at the range of possibilities and provide support for decision-making in adaptation.

The majority of projections of future climate come from General Circulation Models (GCMs), which vary in the way they model the climate system and so produce different projections of what will occur in the future. These differences can be highly significant: for example, some models may show a region getting wetter while others show it getting drier. The skill of GCMs lies in the large-scale processes of the climate system; they cannot make projections below the size of one grid cell (typically around 300 km across) and perform best at much larger scales. Regional Climate Models (RCMs) and empirically downscaled data from GCMs allow projections to be made at a finer scale, but these are still uncertain: RCM projections vary between models in the same way as GCMs and, because RCMs are driven by GCM output, they inherit some of the larger-scale biases, while empirical downscaling does not attempt to correct any biases in the GCM data.

Much of the difference in output between GCMs is due to the way they parameterize different processes. For certain variables, such as precipitation, the models cannot use their internal physics alone to produce rainfall, so they define a relationship between, for example, humidity in the atmosphere and rainfall. Different models use different relationships, so their output varies. Different internal structures in the models also mean that they position the boundaries between wetting and drying regions in different places, which can create variation between the models. Uncertainty also arises because small differences in the starting conditions from which the models begin their runs change the output and projections they produce. Work done by the Climate Systems Analysis Group at the University of Cape Town shows that the skill of a GCM in simulating current climate cannot be taken as an indicator of how good its projections are, as the change seen in the models under anthropogenic forcing does not depend on their skill in the present. Models with different simulations of current climate, but which capture the broad processes, show similar changes into the future, and it is this change that is taken and applied to current observed conditions. Therefore, as long as a model captures the large-scale features of the present climate system, its projections for the future must be taken as credible possibilities.
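As an illustration of how different parameterizations lead to different outputs, the toy Python sketch below defines two invented humidity-to-rainfall relationships. The functional forms, thresholds and coefficients are purely hypothetical and far simpler than anything in a real GCM; they only show how two plausible-looking rules can disagree.

```python
def rainfall_scheme_a(relative_humidity):
    """Toy parameterization: rain only above an 80% humidity threshold."""
    return max(0.0, 10.0 * (relative_humidity - 0.8))  # invented units (mm/day)

def rainfall_scheme_b(relative_humidity):
    """Toy parameterization: rain ramps up from a lower, 70% threshold."""
    return max(0.0, 5.0 * (relative_humidity - 0.7))   # invented units (mm/day)

# At 75% humidity the two "models" disagree on whether it rains at all,
# even though both rules look reasonable in isolation.
rain_a = rainfall_scheme_a(0.75)  # 0.0 -> no rain
rain_b = rainfall_scheme_b(0.75)  # positive -> some rain
```

Real parameterization schemes involve many interacting variables, which is why their differences are so hard to reduce.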

To try to understand the full range of possible futures, experiments such as climateprediction.net use many thousands of model runs, changing the values of parameters and initial conditions slightly between runs to produce the range of plausible changes. Experiments changing the starting conditions of models are called initial-condition ensembles, and those changing model parameters are perturbed-physics experiments. The more simulations there are, the more confidence there should be that we are looking at the full range of uncertainty in the system, although it may be that more models are needed to capture the full envelope of uncertainty.
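The effect of tiny differences in starting conditions can be demonstrated with any chaotic system. The sketch below uses the logistic map, a standard textbook example of sensitivity to initial conditions; it is not a climate model, and the parameter values are illustrative only.

```python
def final_state(x0, r=3.9, steps=100):
    """Iterate the chaotic logistic map from a starting value x0."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# A toy "initial-condition ensemble": ten runs whose starting values
# differ by only one part in a billion.
ensemble = [final_state(0.5 + i * 1e-9) for i in range(10)]
spread = max(ensemble) - min(ensemble)
# After 100 steps the members have drifted apart, even though they
# started from almost identical states.
```

This is why a single model run is only one possibility drawn from a wider range, and why ensembles are needed to map that range.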

Given the differences between models, it is important to look at the range of projections from different models rather than relying on one outcome chosen from many possibilities. Reliance on a projection from one model may hide the fact that other models are projecting different changes, and an adaptation option based on that single projection may prove unsuitable if the projection turns out to be incorrect. Some areas of uncertainty are likely to decrease, but some may not: for example, the range of projections of temperature change for 2050 has changed very little since initial calculations were made over 20 years ago, so it is important to recognise that we need to work with this uncertainty. Some aspects of the climate system may be too chaotic to say with certainty where within a range of possibilities the system will end up. The Climate Information Portal enables users to explore this uncertainty by looking at projections from different models.

The important point for adaptation is how to deal with this uncertainty and make decisions that are robust against a range of future possibilities. One approach is to look at the range of projections from the different models to see which results are consistent and can therefore be treated with confidence; for example, if all models say it will get wetter in June, then we can be confident in that projection. If the relevant results are uncertain (for example, it may get wetter or drier), then it is important to choose adaptation options that will be effective regardless of which change occurs, i.e. that are robust against the range of future changes. This might involve building the resilience and adaptive capacity of the system rather than choosing options that rely on a particular direction of change.
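A simple way to check model agreement is to count how many models project change in the same direction. The projections below are invented numbers for one hypothetical location, purely to illustrate the calculation; the 80% agreement threshold is also an arbitrary choice for the sketch.

```python
# Invented June rainfall changes (mm) projected by six hypothetical models.
projections = [12.0, 8.5, 3.1, -4.2, 6.7, 9.9]

wetter = sum(1 for change in projections if change > 0)
agreement = wetter / len(projections)

if agreement >= 0.8:
    print(f"{wetter}/{len(projections)} models agree it gets wetter: "
          "reasonable confidence in the direction of change")
else:
    print("Models disagree on the direction of change: "
          "prefer options robust to both wetter and drier futures")
```

In practice agreement should be assessed per variable, season and location, but the logic is the same: act on consistent signals, and choose robust options where the signal is mixed.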

This approach to adaptation is fundamental to the way the weADAPT Group sees adaptation and is behind the development of guidance on decision-making under uncertainty and tools such as the Climate Information Portal to explore uncertainty in climate projections.

Notes direct from the presentations

  • there are about 10 global climate models in the world
  • even the best models only give one possibility
  • models may have done well in the past but what about the future?
  • there is a need to understand the different forms of prediction and projection, especially for communication; some things are intrinsically uncertain, such as throwing dice
  • climate sensitivity can be defined as equilibrium global mean surface temperature change for a doubling of CO2 levels
  • climateprediction.net: >300,000 participants, 10,000 years of computing time, 110,000 completed simulations
  • proxy indicators might be more dangerous ground than including uncertainty
  • predictability is not sufficiently good in many places to base decisions on – knowledge of the regional climate system is simply not good enough
  • in terms of exposure there are only a fairly small number of variables that we are actually interested in – but they are often compound parameters
  • we want to know what the confidence is – how well do the models deal with the dynamics involved in the variable being requested
  • it is important to push climate scientists on these confidence issues and certain variable requests can inform the development pathways of the models
  • models are still missing critical modules of local physical responses to change – this is highly variable between locations (depending on capacity to improve model ability) – however sensitivity analyses, even in places where modelling capacity is high, still show large uncertainty
  • despite this we cannot wait for better science before we act – we need to work out how best to use what we have now while informing the direction of model development

Dave Stainforth’s presentation

Key points from Dave Stainforth’s presentation:

  • Climate models are the principal tools for climate prediction.
  • Most impacts studies are based on the predictions of AOGCMs (Atmosphere-Ocean General Circulation Models).
  • Climate models as a forecasting tool

Method: create the model based on physical understanding; validate the model by “hindcasting” past climate, e.g. the 20th century or various paleo periods such as the Last Glacial Maximum (LGM); then run it for various scenarios of the future to make forecasts.

  • For some global variables complex 3D models can simulate observed climate change reasonably well.
  • Different models give different results, particularly at the regional/seasonal level.
  • Different models might be equally good at representing the past but respond quite differently to the levels of atmospheric greenhouse gases predicted for the future.
  • Climate forecasts are intrinsically uncertain but by using probabilistic techniques we can begin to extract confident statements about some aspects of future climate.
  • These could involve large levels of uncertainty but may still be useful in designing robust strategies for business and society.

Methods for making probabilistic forecasts:

  • Ensembles of opportunity (IPCC TAR, 2001)
  • Scaling single-model simulations (Stott and Kettleborough)
  • Perturbed physics: take a model and vary the way it represents the processes involved, e.g. change the values of uncertain parameters
      ◦ Intermediate complexity models (Forest et al., 2002; Knutti et al., 2002, etc.)
      ◦ Simple models (Frame et al., 2005)
      ◦ Complex models (GCMs) (Stainforth et al., 2005)

  • Climate sensitivity is defined as the equilibrium global mean surface temperature change for a doubling of CO2 levels.
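The definition above can be turned into a back-of-envelope calculation. The sketch below uses the simplified CO2 forcing formula F = 5.35 ln(C/C0) W/m² (Myhre et al., 1998); the climate sensitivity parameter of 0.8 K per W/m² is an illustrative value chosen for the example, not a model result.

```python
import math

def co2_forcing(concentration, reference):
    """Simplified radiative forcing of CO2 in W/m^2 (Myhre et al., 1998)."""
    return 5.35 * math.log(concentration / reference)

forcing_2x = co2_forcing(560.0, 280.0)  # doubling of CO2: about 3.7 W/m^2

# Equilibrium warming dT = lambda * F. The value of lambda (K per W/m^2)
# is itself uncertain; 0.8 is used here purely for illustration.
lambda_illustrative = 0.8
climate_sensitivity = lambda_illustrative * forcing_2x  # roughly 3 K
```

Much of the spread in climate sensitivity between models comes from uncertainty in the feedback parameter, not in the forcing itself.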

Sources of uncertainty and how to include them in a climate forecast:

  • Forcing uncertainty: changes due to factors external to the climate system, e.g. greenhouse gas emissions (natural and anthropogenic), solar radiation etc. How much will mankind emit? Response: scenarios for possible futures.
  • Initial condition uncertainty: how is the prediction affected by our imprecise knowledge of the current state of the system, even at the smallest scales? Response: initial-condition ensembles.
  • Model uncertainty: different models may be equally good at simulating the past but give different forecasts for the future. Response: perturbed-physics ensembles.
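The three responses above multiply together: each emissions scenario must in principle be run for each initial-condition perturbation and each parameter setting. The short Python sketch below (all names and values hypothetical) shows why the number of required runs grows so quickly:

```python
import itertools

# Hypothetical sampling plan across the three sources of uncertainty.
scenarios = ["low_emissions", "mid_emissions", "high_emissions"]  # forcing
initial_perturbations = [0.0, 1e-9, 2e-9, 3e-9]   # initial conditions
parameter_scalings = [0.9, 1.0, 1.1]              # perturbed physics

runs = list(itertools.product(scenarios, initial_perturbations, parameter_scalings))
print(len(runs))  # 3 * 4 * 3 = 36 runs for even this tiny design
```

Realistic designs sample many more scenarios, perturbations and parameters, which is why experiments like climateprediction.net need hundreds of thousands of simulations.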

Exploring uncertainty: the Climateprediction.net experiment

  • To quantify uncertainty we need 100s of thousands of simulations.
  • Impossible with supercomputers alone, but possible with distributed computing.
  • At www.climateprediction.net people can download the model to their PC.
  • Using the latest, complex model means we can get regional detail as well as global averages.
  • Latest statistics: >300,000 participants; >24 million years simulated; >110,000 completed simulations (each 45 years of model time); 10,000 years of computing time.
