A 2007 special issue of the web journal Economists’ Voice collects an array of economists’ positions on climate policy.
While I see no special expertise on display in the article by Nobelist Thomas Schelling, I find his argument cogent. It’s a relief to see somebody besides myself making at least this point:
Now the critical question: what does uncertainty have to do with the question, proceed with costly efforts to reduce CO2 emissions in a hurry, or wait until we know more?
In some public discourse, and in sentiments emanating from the Bush Administration, it appears to be accepted that uncertainty regarding global warming is a legitimate basis for postponement of any action until more is known. The action to be postponed is usually identified as “costly.” (Little attention is paid to actions that have been identified as of little or no serious cost.) It is interesting that this idea that costly actions are unwarranted if the dangers are uncertain is almost unique to climate. In other areas of policy, such as terrorism, nuclear proliferation, inflation, or vaccination, some “insurance” principle seems to prevail: if there is a sufficient likelihood of sufficient damage we take some measured anticipatory action.
I go further than this, though. I argue that the less clear the science, the stronger the rational response to a credible threat should be. If we lack information about the safety of a particular action, we typically proceed with extra caution. It is as if the delayers want us to drive more recklessly because it’s dark and foggy and the highway lights are out. After all, it’s much harder to demonstrate the existence of a threat under such conditions, isn’t it?
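Schelling’s “insurance” principle is just expected-value arithmetic. Here is a minimal sketch in Python, with every number invented purely for illustration:

```python
# A minimal sketch of the "insurance" principle quoted above.
# All numbers here are invented for illustration only.
p_bad = 0.10           # assumed probability of serious damage
loss = 100.0           # assumed size of that damage (arbitrary units)
precaution_cost = 4.0  # assumed cost of the measured anticipatory action

# A measured precaution is rational when its cost is below the
# probability-weighted damage it averts.
expected_loss_averted = p_bad * loss
if precaution_cost < expected_loss_averted:
    print(f"act: pay {precaution_cost} to avert an expected {expected_loss_averted}")
else:
    print("wait: the precaution costs more than the expected loss")
```

Note that the rule never requires certainty; a sufficient likelihood of sufficient damage is enough, which is exactly the standard we apply to terrorism, proliferation, inflation, and vaccination.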
This is why defending the science is not the right way to defend vigorous policy. When someone points out weaknesses in climate science and you aren’t well versed in the details, you are probably best off reframing the debate: “Look, we know there’s SOME effect of all these human activities, right? So if the climate scientists are wrong, they could just as easily be UNDERESTIMATING the problems!”
I also like Schelling’s case against the “precautionary principle”, the irrational position that all activity should be governed by fear of the worst, summarized as “never do anything for the first time”. His concluding paragraphs:
How should we respond to that kind of uncertainty? Wait until the uncertainty has been resolved completely before we do anything, or act as if it’s certain until we have assurance that there’s no such danger?
Those two extremes are not the only alternatives!
A lot of the usual interesting questions about the uses of expertise in a democracy ensue from this conclusion, and Schelling does not take them on. In practice, though, we will continue to see a great deal of skepticism directed at climate science.
What would be the consequence if that skepticism were valid? In short, what if the global temperature sensitivity were uncertain not to a factor of two but to a factor of ten? Then rather than looking at 1.5C to 6C per doubling we’d be looking at 0.3C to 30C. Neither extreme strikes those of us familiar with the territory as plausible, but the point is that, starting from an assumption of intellectually weak climatology, they are roughly equally probable. That would bring us into a world where total or near-total extinction is a substantial risk, and that risk would dominate the calculation.
Rational behavior is risk-weighted. What is widely missed is that the less confidence you have in climate science, the less constrained the risks are, and thus the more heavily the severe outcomes should weigh in your risk-weighted decisions.
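To make that concrete, here is a minimal sketch, again with purely illustrative assumptions: sensitivity is taken as lognormal around 3C per doubling, so a factor-of-two spread reproduces the 1.5C–6C range above and a factor-of-ten spread the 0.3C–30C range, and damages are assumed to grow as the square of the warming. None of these choices comes from Schelling or the journal.

```python
# A minimal sketch, assuming a lognormal sensitivity centered on 3C per
# doubling and damages growing quadratically with warming; both are
# illustrative assumptions, not anything from the sources above.
import math
import random

random.seed(0)

def expected_damage(spread_factor, n=200_000):
    """Monte Carlo mean of damage ~ S**2 for lognormal sensitivity S.

    sigma = ln(spread_factor)/2 puts roughly 95% of the probability
    between 3/spread_factor and 3*spread_factor degrees C, matching
    the 1.5C-6C (factor of two) and 0.3C-30C (factor of ten) ranges.
    """
    sigma = math.log(spread_factor) / 2
    total = 0.0
    for _ in range(n):
        s = 3.0 * math.exp(random.gauss(0.0, sigma))  # one sensitivity draw
        total += s * s                                # convex damage
    return total / n

print("factor-of-2 uncertainty :", round(expected_damage(2)))   # about 11
print("factor-of-10 uncertainty:", round(expected_damage(10)))  # about 130
```

The central estimate is identical in both cases, yet the less constrained distribution carries roughly ten times the risk-weighted damage. With a convex loss, expected damage is driven by the width of the tails, not by the midpoint, which is why weaker science argues for more precaution rather than less.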