Jun. 15th, 2014


Failure to solve the climate crisis probably means plunging civilization into a new dark age -- but humanity has survived dark ages before. According to Bill McKibben, the Holocene epoch of stable climate is already over -- but before the dawn of civilization, humanity lived through several drastic swings in climate. No one would want to return to either condition, but what if the alternative is even worse? It's very difficult to tell, because the alternative means plunging deep into the unknown.

What I'm talking about here is the accelerating rate of technological progress, which gives us our only real hope of averting global climate catastrophe. The problem, as I've mentioned, is that we're trying to slow and stabilize other accelerating processes, which is such a mammoth task that it essentially requires setting up new exponential-growth curves (such as the rate of renewable-energy installation) that might well carry their own ill-considered risks. To paraphrase the NRA, “the only thing that can stop a bad exponential curve is a good exponential curve” -- but is there really any such thing?
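
To make that arithmetic concrete, here is a minimal sketch (my own toy illustration -- the starting values and growth rates are invented, not real-world figures) of how a “good” exponential can overtake a “bad” one despite a hundredfold head start:

    # A toy model of "good exponential vs. bad exponential" (my own
    # illustration; the starting values and growth rates are invented).
    clean = 1.0      # clean-energy generation, arbitrary units
    demand = 100.0   # total energy demand, same units: a 100x head start

    years = 0
    while clean < demand:
        clean *= 1.20    # assume clean deployment grows 20% per year
        demand *= 1.02   # assume demand grows 2% per year
        years += 1

    print(f"The good curve overtakes the bad one after {years} years.")

Under these assumed rates -- clean deployment compounding at 20% a year against demand at 2% -- the crossover comes in under three decades; cut the good curve's growth rate to 10% and it takes more than twice as long. That's the whole case for caring about which curve compounds faster.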

Paul Krafel certainly believes there is. His movie The Upward Spiral is named for exactly that concept: a good exponential curve, one that creates ever-growing amounts of life and possibility. But Paul's upward spirals are distributed and grassroots, starting with small local solutions shared with as many people as possible in the hope that they will eventually add up. Apart from tree-planting movements, though, the bulk of the progress we've made toward climate solutions so far has come from megacorporations like GE and Vestas, which can deploy solutions at a global scale much faster, and which can be motivated by equally centralized policy shifts like the renewable energy production tax credit. In an era of increasing and fully justified alarm about the limited time remaining to avert a collapse, the centralized approach seems likely to continue to dominate our response. Even the accelerating trend toward solar rooftops, which challenges the business model of centralized electric utilities, is driven by the relatively few companies that actually manufacture the solar panels; if those companies, with the help of a few big government research institutes, hadn't made photovoltaics so cheap, rooftop solar would still be a tiny niche market.

And it's not only the unknown consequences of these panicked high-speed deployments of green technology that worry me. Even on an alternate Earth where the Industrial Revolution had been based on non-polluting technology from the start, we would still face another terrifying unknown: what happens when technological progress accelerates to the point where mere human brains can no longer keep up?

It used to be common to call this problem “future shock,” after the famous book by Alvin Toffler. These days the worry has gotten attached to the Technological Singularity concept, and hence to the various sci-fi scenarios where superhuman AIs take over the world. But I'd like to point out that we needn't postulate the development of strong AI to make accelerating progress scary. Consider this quote from the webcomic The Spiders by Patrick Farley:

“Unfortunately the biotechnology which created this virus is only getting more user-friendly. In 10 years it'll be possible for a small community of assholes with fast modems and a shared grudge to wipe out the entire human race.

“And this won't be a problem for the next 10 years, but the next ten thousand. Grok this fact, and then we can discuss ethics, Lieutenant.”

Considering the growing power of various potentially destructive technologies, and the depths of fanatical extremism that humans are capable of, and the difficulty of policing a world of billions to ensure that world-destroying plots are never brought to fruition, you have to wonder whether it would actually be less harmful in the long run to let civilization crash.

Then again, you also have to wonder whether it’s reasonable to base present-day policy decisions on a theoretical future in which some technology that can wipe out the human race could be secretly developed and deployed by a tiny terrorist group. “Comic-book politics” is the term that comes to mind here. That’s why I ultimately decided not to classify this entry as part of my “personal psychology of despair” series. Am I anxious about the dangers of overly rapid change? Yes. Does that alone constitute a reason for despair? No. If it did, I don’t think I could get up in the morning and go to work in the software industry, which changes faster than anything in human history.
