As promised, here are comments on another chapter from The Lord of the Rings and Philosophy, this time from chapter 2, "The Cracks of Doom: The Threat of Emerging Technologies and Tolkien's Rings of Power," by Theodore Schick, Jr. The article has the Ring symbolizing a new discovery or technology that could Destroy the World As We Know It, then asks what we could or should do about it in the here and now. The Council of Elrond's decision to attempt the risky destruction of the Ring was a possibility at all only because the Ring could be destroyed and never refashioned. The destructive technologies we now face are more like genies that can't be put back in the bottle, and they pose sobering challenges (both practical and moral) as we stumble into the 21st century.
Schick identifies three technologies that could at some point unleash a Ring-like threat to modern civilization: nanotechnology, genetics, and robotics. I know, you've seen the generic plotlines at least a dozen times on Star Trek, and the Borg probably represent all three at once. But we're talking about the real world here, and dress rehearsals for The Big Mistake have happened before. The Black Death of the 14th century in Europe; the decimation of the native populations of the Americas due to (unintentional) biological warfare; the Thirty Years War of the 20th century, 1914-45. Yes, if we're not careful, we can actually screw up the world. Who do you think will save us ... the government? The UN? Elrond and Gandalf?
The potential danger isn't just hype. Go read Bill Joy's essay Why the Future Doesn't Need Us for a credible opinion that there is something to be concerned about. (Joy was the chief scientist at Sun Microsystems for many years; Schick discusses Joy's essay in some detail). First sentence of Joy's essay: "From the moment I became involved in the creation of new technologies, their ethical dimensions have concerned me, but it was only in the autumn of 1998 that I became anxiously aware of how great are the dangers facing us in the 21st century." A key aspect of the threat seems to be that destructive technologies will be (are already?) accessible to small groups if they have money and motivation. But the dynamic of new technologies is such that no malicious intent is really needed to create a continental equivalent of Chernobyl. Or worse. Self-replicating nanotechnology sounds like science fiction until you think about computer viruses, worms, etc. When teenagers can cook up bio-viruses with recipes downloaded from unleashdestruction.com, The End is near. [Good news: I checked the address and we're still safe.]
So what is to be done? Can we find the resolve and the moral justification for preventing research into potentially threatening technologies? Could the research be allowed but carefully contained? Would a strengthened regulatory regime (strengthened both in the resources allocated to it and in powers of investigation and enforcement presently lacking in non-totalitarian countries) be effective in preventing or containing such research? Could the government that was unable to prepare for and respond to something as obvious as a hurricane striking New Orleans possibly identify and respond to future threats from new technologies? Or would a stronger government role actually make development more likely? Does anyone but an economist think the free market will produce an effective solution to these technological threats? If neither the free market nor the government offers much hope ... well, it's a short list. Schick finds that none of these possibilities offers much hope. The best spin he can put on things is to note that we've faced the thermonuclear threat for sixty years but haven't blown ourselves up yet.
Here's a new one: The Mormon Solution. Locate the main body of Mormons out in the middle of nowhere (i.e., Utah). Teach and encourage everyone to put a year's worth of food in the garage or the basement. Collect all the genealogy records and store them on the Server Under the Mountain (so the rest of the world will be gone but not forgotten). Carry on until Babylon self-destructs. Then pick up the pieces and rebuild. Sound familiar? The Mormon Solution offers two significant advantages over any other plan I've read about: (1) It is feasible. (2) It has already been implemented, at least through the "carry on" stage. Let's hope we carry on for some time.
How would one practically go about "...preventing research into potentially threatening technologies?" The problem with research is that so much of it is accidental: in an attempt to seek a solution to one problem, another problem is accidentally solved. Monsanto's Roundup and Nylon fiber discoveries were both accidental. Viagra was developed from research that began seeking a treatment for strokes. The list could go on and on. While these happen to be products that, at the very least, seem to be morally neutral, I'm sure there exist examples of potentially "evil" products that developed out of "harmless" research. So I think trying to restrict what is researched would be ineffective at best and oppressive at worst.
Posted by: Paul Mortensen | Mar 22, 2006 at 11:43 AM
Dave, if you want to go by sheer numbers, the killer technology is already here.
More people die every year in car accidents than died in most of our nation's wars.
The sheer suffering and human loss is overwhelming.
Had our society been more forward-thinking and less concerned with greedy self-opportunism, it might have been averted in large measure.
The thing about the REAL threats is that you never notice that they are there. In fact, we glorify them, praise them, and devote large portions of our lives in their service.
Posted by: Seth R. | Mar 22, 2006 at 01:55 PM