15 November 2005

You ain't seen nothing yet

Sometimes it seems like things can’t get any worse. State education officials are redefining science to include supernatural explanations, congressional leaders confuse fact and faith, the president of the most powerful nation on the planet believes religion is an appropriate criterion for selection of a Supreme Court justice, and leading evangelists say hurricanes are punishment from God for a lack of piety.

But I’ve just finished reading a book that implies there is a much bigger culture war on the horizon, one that will make the current struggle between science and dogma look like an English tea party.

If inventor and futurist Ray Kurzweil is only half right, if his predictions exaggerate the pace of technological change by 100 per cent, then we are still in for a cataclysmic fight over the next 40 years. In The Singularity is Near, Kurzweil says we are on the verge of revolutions that will not only remake the world around us but will also change the very essence of what it is to be human. Though the book is perhaps the most optimistic 500 pages I have ever read, it is also one of the most frightening. Scary or not, we have to start thinking about what lies ahead.

The basic thrust of Singularity revolves around recent baby steps, and future giant steps, in three technologies: genetic engineering, nano-scale machines and artificial intelligence based on reverse-engineering the human brain. Collectively, argues Kurzweil, these revolutions will soon allow humans to live forever, merge minds with computers and turn the entire universe into a single, god-like being. This will happen because the rate at which we are improving those technologies is exponential. By 2045 society will be unrecognizable.

For example, the kind of technological progress it took humankind 100 years to bring about in the 20th century would take only 20 today. What took 20 years (from 1980 to 2000) will take us only another 14, and so on. A research program that would take us 20,000 years, our children will complete in a single year. Non-biological intelligence will supplement our own, our bodies will no longer require food, we will use all that new cleverness to undo the environmental damage of the industrial era, and no one will ever be poor again.
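To see where numbers like these come from, here is a back-of-the-envelope sketch in Python. It is my own illustration, not code or figures from the book, and it simply assumes, as Kurzweil does, that the rate of progress doubles every decade:

```python
# Illustrative only: the decade-by-decade arithmetic behind claims like
# "20,000 years of progress in a single century." The doubling-every-decade
# assumption is Kurzweil's; the tally below is my own rough calculation.

rate = 1.0          # progress per year, measured in year-2000-equivalent years
total = 0.0
for decade in range(10):    # the ten decades of the 21st century
    rate *= 2               # the rate of progress doubles each decade
    total += rate * 10      # progress accumulated over that decade

print(total)  # prints 20460.0, roughly the "20,000 years of progress" figure
```

Shift the assumed doubling period, or start the doubling a decade later, and the total moves around, but the shape of the argument stays the same: an exponential curve, not a straight line, is what drives every claim in the book.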

If it all sounds a bit on the wild side, it is. (It is perhaps fair to note that Kurzweil says he takes 250 supplement pills a day in an effort to extend his life long enough to be around when immortality technology arrives.) But that doesn’t mean he’s wrong. For one thing, Kurzweil is a bright guy, one of the most successful inventors of all time. For another, his vision of the future is a well-footnoted one, with ample support for most (though not all) of his assessments of the state of the art in each of the three fields. There just aren’t a lot of logical or factual holes in his argument.

But I see a problem, one that stems from the same source as the objections that motivate so many anti-science fundamentalists.

Kurzweil goes to great lengths to remind the reader that all this incredible software and hardware will be human products. We’ll still be human, he says, just a better kind of human, one that’s not hobbled by biology. He repeats this point many times, perhaps because, as author of the controversial The Age of Spiritual Machines, he knows a lot of people aren’t going to like the idea of billions of microscopic robots crawling around their bloodstream, re-sequencing their DNA, cloning their organs, and establishing telepathic-like links to conscious computers and other humans.

If that vision doesn’t make you at least a little uncomfortable, then you probably haven’t given it enough thought. He’s talking about the human race turning itself into something entirely new, something that approaches what most people think of as gods.

And this is what really bugs those who can’t let go of creationism. It isn’t the science of evolution (the knowledge that species change over time through natural selection of those varieties most fit for a particular set of environmental conditions) that they find objectionable. It’s what evolution implies about humanity. If humans aren’t designed directly by a god, then we’re not special.

For the same reason, the Roman Catholic Church resisted the Copernican model of the solar system. The bishops didn’t particularly care whether the sun orbited the Earth or vice versa. It was the consequent loss of a special place in the universe for the human race that they couldn’t abide. Similarly, if we build machines that are more intelligent than ourselves, machines that will then re-engineer us, right down to our DNA, what does that say about old-fashioned humanity (what Kurzweil calls Human 1.0)?

I suspect a lot of people are going to resist Kurzweil’s future. They will fight tooth and nail against many of the new technologies now beginning to come on stream.

And make no mistake: the times are a-changing. A few hours after finishing The Singularity is Near, I was at a dinner for the Pediatric Brain Tumor Foundation, for which my wife works. There I heard Duke University brain tumor researcher Hai Yan talk about his effort to genetically reconfigure cancer. The next day I read the cover story in last week’s New Scientist about reverse engineering the human brain. Nanotechnology (machines built at scales smaller than 100 billionths of a meter) is so popular that just about every university is setting up a research institute devoted to the field.

Fear of these technologies could drive people further into the arms of fundamentalist religions, whether it’s evangelical Christianity or Islam. I’m not sure I’m ready for Kurzweil’s future. But I’m not looking forward to a resurgence of Luddism, either. The one thing we can’t afford to do is dismiss Kurzweil’s vision. We have to start thinking about it. Now.

1 Comment:

Blogger Ron said...

As always, the people who obviously and directly benefit from these technologies will gladly embrace them, especially if they benefit financially. It's those who don't benefit, or are unaware of the benefits, who may feel abandoned or worse.

1:56 PM  
