I popped into a thread on geoengineering at the always interesting Overcoming Bias site. (Eliezer Yudkowsky in particular has an amazing way of thinking, and when I can follow his arguments I find them extremely interesting and stimulating.)
Nevertheless, an overvaluing of the sagacity of the marketplace pervades the site. I might be tempted to call that a bias, but that’s just me.
What’s really interesting and terrifying (and why this is news, at least to me) is that there is a rationalist, or pseudo-rationalist, equivalent of last-days fundamentalism. It’s easy to see why people who literally expect the Rapture don’t care much about preserving the environment. You wonder how and why their libertarian brethren on the right manage to go along with this, though.
Well, it appears there’s a materialist rapture in the offing as well. We, or perhaps the robots we appoint as our successor species, will have intellectual superpowers, by means of which we can recover from any damage we might incur. So don’t worry! We’ll be so much smarter later that none of this will matter!
Lest you think I’m exaggerating, here are some quotes, starting with a response to the question of whether one could set a low dollar value on guaranteed human extinction centuries into the future.
Prof. David Archer of the University of Chicago Department of Geosciences is of the opinion that contemporary global warming, left unchecked, is in fact likely to set off a series of events leading to the relatively sudden release of seabed methane clathrates some thousands of years hence, possibly enough to trigger a much larger global warming event. He does raise the ethical implications of this scenario when he discusses it. So the 630-year question is not entirely hypothetical.
We’ll have AI and nanotechnology within 50 years. That will make climate change into an irrelevant storm in a teacup.
I wrote about this issue recently. It could even be the subject of a post here: what is the rational way to approach problems of unsustainability if you expect a Singularity? The answer I proposed is essentially to compartmentalize: treat sustainability as a matter of mundane quantifiable governance, like macroeconomics, and treat the Singularity as a highly important contingency that can’t be timed, like a natural disaster. I would still defend that as a first approximation, but clearly the interaction can be more complex: if you really do think that the Singularity will almost certainly happen within 50 years, then you won’t care about environmental changes “thousands of years hence”, or even those slated for the second half of this century. In general, expectation of a near-term Singularity should skew preferences towards adaptation rather than mitigation.
I am definitely rooting against the Singularity in question. We have plenty enough Singularities to deal with as is. I think turning the planet over to machines of our devising is every bit as stupid an idea as boiling the ocean, but I suppose that’s just me and my biases again.
Anyway, the end of time as we know it is nigh; I suppose on this model the messiah will return as a cute pet puppy robot from Sony soon enough. So if you feel like boiling the ocean and burning the forests meanwhile, well, that is the least of your sins, compared to supporting public transportation or universal medicine, I suppose.
Reality is going to be replaced by a throwaway science fiction pulp. Is Phil Dick really dead, or is he still alive and we’re just part of his dream? An excellent basis for rational planning, I must say.
I guess this wouldn’t be worth noting at all, except that the site itself shows such intense intelligence alongside this bafflingly lunatic wishful thinking.