
The campaign to establish 350 parts per million (p.p.m.) as a long-term target carbon dioxide concentration has acquired considerable momentum despite relatively little support for this specific number in the scientific literature. As one of the highest-profile scientific endorsements of 350 p.p.m., the essay by Rockström et al. (Nature 461, 472–475; 2009) will no doubt be heavily cited in the run-up to the UN climate negotiations in Copenhagen this December. While the underlying argument for limiting anthropogenic warming to below 2 °C is indisputable, attempts to define a 'climate boundary' in terms of long-term CO2 concentrations represent an unnecessary distraction. The problem is not that 350 p.p.m. is too high or too low a threshold, but that it misses the point. The actions required over the next couple of decades to avoid dangerous climate change are the same regardless of the long-term concentration we decide to aim for.

Rockström et al. define planetary boundaries as “scientifically informed values of the control variable established by societies at a 'safe' distance from dangerous thresholds”. The 350-p.p.m. boundary fails on at least two counts. First, the concentration of carbon dioxide at some unspecified date in the future is not a “control variable” in any recognizable sense. Keeping temperatures at no more than 2 °C above pre-industrial values, which Rockström et al. use as their starting point, will require substantial emissions reductions over the coming decades. Even then, it will probably be many centuries, and possibly millennia, before concentrations return naturally to 350 p.p.m. The time required will partly depend on the long-term behaviour of the carbon cycle, which is highly uncertain. It will depend even more on how our descendants manage the carbon budget over the ensuing centuries, which is more uncertain still. Although emissions over the next few decades could commit us to much higher atmospheric CO2 concentrations in the long term, whether they are 350 p.p.m. or 450 p.p.m. in the year 3000 is not something anyone living in the twenty-first century could meaningfully claim to control.

Second, the scientific justification that carbon dioxide levels must equilibrate at 350 p.p.m. or lower to avoid more than 2 °C of warming appears to depend on a rather questionable estimate of the 'climate sensitivity' — the very long-term warming response to a doubling of atmospheric carbon dioxide. Rockström et al. acknowledge that the strength of feedbacks in the present-day climate suggests a most likely value for climate sensitivity of 3 °C, with a 'likely' (one-standard-error) uncertainty range of 2–4.5 °C. Yet they cite evidence from paleoclimate research (Open Atmos. Sci. J. 2, 217–231; 2008) that, in the past, additional feedbacks due to polar ice-sheet melting and poleward shifts in vegetation resulted in a climate sensitivity of 6 °C, with a 'likely' range of 4–8 °C. They invoke this higher number, assuming these additional feedbacks, to justify their 350-p.p.m. target. But is it coherent to include these feedbacks? If stabilizing at 350 p.p.m. would prevent the collapse of the polar ice sheets, why use a value for climate sensitivity that assumes the ice sheets melt?
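
To see how strongly the choice of climate sensitivity drives the concentration target, consider a rough back-of-envelope sketch, not the authors' own calculation: under the standard logarithmic approximation for CO2 forcing, and with an assumed 280-p.p.m. pre-industrial baseline, the eventual warming at 350 p.p.m. and 450 p.p.m. can be compared for the two sensitivity values.

```python
# Rough illustration (not the authors' calculation): eventual equilibrium
# warming implied by a given CO2 concentration, using the standard
# logarithmic forcing approximation and an assumed 280 p.p.m. baseline.
import math

PREINDUSTRIAL_PPM = 280.0  # assumed pre-industrial CO2 concentration

def equilibrium_warming(ppm, sensitivity_per_doubling):
    """Warming (°C) once the climate has fully adjusted, assuming the
    response scales with the number of CO2 doublings above pre-industrial."""
    doublings = math.log2(ppm / PREINDUSTRIAL_PPM)
    return sensitivity_per_doubling * doublings

for sensitivity in (3.0, 6.0):            # fast-feedback vs long-term estimate
    for ppm in (350.0, 450.0):
        warming = equilibrium_warming(ppm, sensitivity)
        print(f"sensitivity {sensitivity:.0f} °C, {ppm:.0f} p.p.m.: {warming:.1f} °C")
# With a 6 °C sensitivity, even 350 p.p.m. implies roughly 1.9 °C of eventual
# warming; with the 3 °C fast-feedback value, about 450 p.p.m. gives ~2.1 °C.
```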

The same problem applies to the radiative forcing boundary of one watt per square metre (W m−2) suggested by Rockström et al. We cannot categorically rule out the possibility that our descendants may need to steer CO2 levels back below 350 p.p.m. or reduce radiative forcing to less than 1 W m−2 to avoid dangerous climate change, but it would be equally wrong to suggest that current evidence indicates this is the most likely course they will have to take.
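
For illustration only, the forcing boundary can be translated into an equivalent CO2 concentration using the widely used 5.35 ln(C/C0) approximation, treating the full 1 W m−2 as though it were due to CO2 alone and again assuming a 280-p.p.m. baseline; neither simplification comes from Rockström et al.

```python
# Illustration only: the CO2 concentration that would, on its own, produce a
# given radiative forcing, using the common F = 5.35 * ln(C / C0) approximation.
# The coefficient and the 280 p.p.m. baseline are assumptions of this sketch.
import math

def co2_for_forcing(forcing_wm2, baseline_ppm=280.0, coeff=5.35):
    """Invert F = coeff * ln(C / baseline) to recover the concentration C."""
    return baseline_ppm * math.exp(forcing_wm2 / coeff)

print(f"1 W m-2 attributed to CO2 alone ~ {co2_for_forcing(1.0):.0f} p.p.m.")
# About 338 p.p.m.: read this way, the 1 W m-2 boundary is even slightly
# tighter than 350 p.p.m., so the two boundaries stand or fall together.
```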

There is, however, one important respect in which aiming for 350 p.p.m., even without a date attached, may be a helpful target. For reasons that do not depend on carbon-cycle models, 15–20 per cent of CO2 emissions remain in the atmosphere more or less indefinitely, until removed by chemical weathering or active sequestration (Proc. Natl Acad. Sci. USA 106, 1704–1709; 2009). Because of this lingering CO2, emitting 1 trillion tonnes of carbon over the entire 'anthropocene' era — half of which has already been released — would increase the long-term equilibrium CO2 concentration to at least 350 p.p.m. Hence 'target 350' implies, at a minimum, that we limit net anthropogenic carbon emissions to less than one trillion tonnes. But there is no need to invoke a long-term climate sensitivity of 6 °C or to speculate about multi-century draw-down of CO2 to justify limiting cumulative carbon emissions to less than one trillion tonnes: this is simply what we need to do to keep the most likely peak CO2-induced warming below 2 °C (Nature 458, 1163–1166; 2009).
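
The arithmetic behind this claim can be checked with a simple sketch; the conversion factor of roughly 2.13 gigatonnes of carbon per p.p.m. of CO2 and the 280-p.p.m. pre-industrial baseline are standard approximations assumed here, not figures taken from the cited papers.

```python
# Back-of-envelope check of the trillion-tonne claim, using assumed standard
# conversion factors rather than the cited papers' own calculations.
GTC_PER_PPM = 2.13          # approx. gigatonnes of carbon per p.p.m. of CO2
PREINDUSTRIAL_PPM = 280.0   # assumed pre-industrial concentration
CUMULATIVE_GTC = 1000.0     # one trillion tonnes of carbon

for airborne_fraction in (0.15, 0.20):   # share remaining 'indefinitely'
    remaining_gtc = airborne_fraction * CUMULATIVE_GTC
    long_term_ppm = PREINDUSTRIAL_PPM + remaining_gtc / GTC_PER_PPM
    print(f"{airborne_fraction:.0%} airborne: ~{long_term_ppm:.0f} p.p.m. in the long term")
# 15% gives roughly 350 p.p.m. and 20% roughly 374 p.p.m., consistent with the
# claim that a trillion tonnes commits us to at least 350 p.p.m. long term.
```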

The importance of cumulative emissions implies that, as far as climate change is concerned, the atmosphere should be treated as an exhaustible resource, a notion that simply does not fit into the framework of 'planetary boundaries within which we can safely continue to operate indefinitely'. Indeed, attempting to define time-invariant boundaries on atmospheric composition and radiative forcing focuses attention on quantities, such as the long-term climate sensitivity, that are very difficult to constrain, giving the impression that the science is less certain than it actually is. There is no need to speculate about the behaviour of the climate system into the next millennium to make the case that emission reductions are urgently needed to avoid dangerous climate change.
