This uncertainty is substantial. If warming occurs at the upper end of the range projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report [1], then unmitigated climate change will probably prove disastrous worldwide, and rapid global decarbonization is paramount. If warming occurs at the lower end of this range, then decarbonization could proceed more slowly and some societies' resources may be better focused on local adaptation measures.
Reducing these uncertainties substantially will take a new generation of global climate simulators capable of resolving finer details, including cloud systems and ocean eddies. The technical challenges will be great, requiring dedicated supercomputers faster than the best available today. Greater international collaboration will be needed to pool skills and funds.
Against the cost of mitigating climate change — conceivably trillions of dollars — investing, say, one quarter of the annual running cost of the Large Hadron Collider (whose yearly budget is just under US$1 billion) to reduce uncertainty in climate-change projections is surely warranted. Such an investment will also improve regional estimates of climate change — needed for adaptation strategies — and our ability to forecast extreme weather.

Grand challenges

The greatest uncertainty in climate projections is the role of the water cycle — cloud formation in particular — in amplifying or damping the warming effect of CO2 in the atmosphere [2]. Clouds are influenced strongly by two types of circulation in the atmosphere: mid-latitude, low-pressure weather systems that transport heat from the tropics to the poles; and convection, which conveys heat and moisture vertically.
Global climate simulators calculate the evolution of variables such as temperature, humidity, wind and ocean currents over a grid of cells. The horizontal size of cells in current global climate models is roughly 100 kilometres. This resolution is fine enough to simulate mid-latitude weather systems, which stretch for thousands of kilometres. But it is insufficiently fine to describe convective cloud systems that rarely extend beyond a few tens of kilometres.
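A rough back-of-the-envelope sketch (in Python) gives a feel for the numbers involved; the 100-layer atmosphere and the other constants are illustrative assumptions, not the specification of any particular model:

    # Rough count of grid cells at two horizontal resolutions.
    # All constants are illustrative assumptions.
    EARTH_SURFACE_KM2 = 510e6   # approximate surface area of Earth
    LEVELS = 100                # assumed number of vertical layers

    for dx_km in (100, 1):
        columns = EARTH_SURFACE_KM2 / dx_km**2
        cells = columns * LEVELS
        print(f"{dx_km:>4}-km cells: ~{columns:.0e} columns, ~{cells:.0e} cells")

Shrinking cells from 100 kilometres to 1 kilometre multiplies the number of grid columns by a factor of 10,000, and the shorter time step needed for numerical stability adds roughly another factor of 100 to the cost.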
Simplified formulae known as 'parameterizations' are used to approximate the average effects of convective clouds or other small-scale processes within a cell. These approximations are the main source of errors and uncertainties in climate simulations [3], not least because many of the parameters in these formulae cannot be determined precisely from observations of the real world. This matters, because simulations of climate change are very sensitive to some of the parameters associated with these approximate representations of convective cloud systems [4].
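The structure of such a scheme can be conveyed in a few lines. The toy 'convective adjustment' below is a deliberately crude sketch, not the scheme of any real model, but it shows the essential point: the behaviour hinges on a tunable parameter (here, a relaxation timescale) that cannot be read off from observations:

    import numpy as np

    # Toy convective-adjustment parameterization for one model column.
    # Deliberately simplistic; real schemes are far more elaborate but
    # share this structure: a tunable parameter (tau_hours) that cannot
    # be measured directly and must be calibrated.
    def convective_adjustment(temp, temp_ref, dt_hours, tau_hours=3.0):
        """Relax convectively unstable levels toward a reference profile."""
        unstable = temp > temp_ref                  # crude instability test
        tendency = np.where(unstable, (temp_ref - temp) / tau_hours, 0.0)
        return temp + tendency * dt_hours

    # Example: a 3-level column with one unstable level, half-hour step.
    t = np.array([300.0, 290.0, 280.0])
    t_ref = np.array([298.0, 291.0, 281.0])
    print(convective_adjustment(t, t_ref, dt_hours=0.5))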
Decreasing the size of grid cells to 1 kilometre or less would allow major convective cloud systems to be resolved. It would also allow crucial components of the oceans to be modelled more directly. For example, ocean eddies, which are important for maintaining the strength of larger-scale currents such as the Gulf Stream and the Antarctic Circumpolar Current, would be resolved.

[Figure: Simulation of convective cloud systems in a limited-area high-resolution climate model.]

The goal of creating a global simulator with kilometre resolution was mooted at a climate-modelling summit in 2009 [5]. But no institute has had the resources to pursue it. And, in any case, current computers are not up to the task. Modelling efforts have instead focused on developing better representations of ice sheets and biological and chemical processes (needed, for example, to represent the carbon cycle) as well as quantifying climate uncertainties by running simulators multiple times with a range of parameter values.
Running a climate simulator with 1-kilometre cells over a timescale of a century will require 'exascale' computers capable of handling more than 10^18 calculations per second. Such computers should become available within the present decade, but may not become affordable for individual institutes for another decade or more.
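A crude estimate shows where the exascale requirement comes from. Every constant below is an assumption chosen for illustration; real costs depend heavily on the model and the machine:

    # Back-of-envelope cost of a century-long run at 1-km resolution.
    CELLS          = 5e10   # ~5e8 columns x 100 assumed vertical levels
    FLOPS_PER_CELL = 1e4    # assumed operations per cell per time step
    DT_SECONDS     = 10     # assumed time step for stability at 1 km
    YEARS          = 100

    steps = YEARS * 365 * 24 * 3600 / DT_SECONDS   # ~3e8 time steps
    total = CELLS * FLOPS_PER_CELL * steps         # ~2e23 operations
    print(f"~{total:.0e} operations; "
          f"~{total / 1e18 / 86400:.0f} days at a sustained exaflop")

Under these assumptions a single century-long run needs on the order of 10^23 operations: days of wall-clock time even on a machine sustaining a full exaflop, and ensembles of such runs multiply the cost further.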

Climate facilities

The number of low-resolution climate simulators has grown: 22 global models contributed to the IPCC Fourth Assessment Report in 2007; 59 to the Fifth Assessment Report in 2014. European climate institutes alone contributed 19 different climate model integrations to the Fifth Assessment database (go.nature.com/3gu8co). Meanwhile, systematic biases and errors in climate models have been only modestly reduced in the past ten years [6]...
Even with 1-kilometre cells, unresolved cloud processes such as turbulence and the effects of droplets and ice crystals will have to be parameterized (using stochastic modelling to represent uncertainty in these parameterizations [9]). How, then, can one be confident that global-warming uncertainty will be reduced? The answer lies in the use of 'data assimilation' software — computationally demanding optimization algorithms that use meteorological observations to create accurate initial conditions for weather forecasts. Such software will allow detailed comparisons between cloud-scale variables in the high-resolution climate models and corresponding observations of real clouds, thus reducing uncertainty and error in the climate models [10].
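The core idea of data assimilation fits in a few lines: blend a model forecast with an observation, weighting each by its error variance. A minimal scalar sketch follows; the numbers are invented purely for illustration, and operational systems solve the same problem in millions of dimensions:

    # Minimal scalar (Kalman-style) data-assimilation update.
    def assimilate(forecast, obs, var_forecast, var_obs):
        gain = var_forecast / (var_forecast + var_obs)   # Kalman gain
        analysis = forecast + gain * (obs - forecast)
        var_analysis = (1.0 - gain) * var_forecast
        return analysis, var_analysis

    # Example: a cloud-top temperature forecast vs a satellite retrieval (K).
    analysis, var = assimilate(forecast=231.0, obs=229.5,
                               var_forecast=4.0, var_obs=1.0)
    print(analysis, var)   # -> 229.8 K, with variance reduced from 4.0 to 0.8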
High-resolution climate simulations will have many benefits beyond guiding mitigation policy. They will help regional adaptation, improve forecasts of extreme weather, minimize the unforeseen consequences of climate geoengineering, and be key to attributing current weather events to climate change.
High-energy physicists and astronomers have long appreciated that international cooperation is crucial for realizing the infrastructure they need to do cutting-edge science. It is time to recognize that climate prediction is 'big science' in the same league.