IN THE MATRIX

Considerations of the multiverse change the way we think about ourselves and our place in the world. Traditional religion is far too blinkered to encompass the complexities of mind and cosmos. All we can expect is to have a very incomplete and metaphorical view of this deep reality. The gulf between mind and matter is something we don't understand at all, and if some minds can evolve to the stage where they can create other minds, there's a real blurring between the natural and the supernatural.

My attitude towards religion is really twofold. First, as far as the practice of religion is concerned, I appreciate it and participate in it, but I'm skeptical about the value of interactive dialogue. There's no conflict between religion and science (except, of course, with naive creationism and suchlike), but I doubt—unlike some members of the Templeton Foundation—that theological insights can help me with my physics. I'm fascinated to talk to philosophers (and to some theologians) about their work, but I don't believe they can help me very much. So I favor peaceful coexistence rather than constructive dialogue between science and theology.

~~~

I am concerned about the threats and opportunities posed by 21st-century science, and how to react to them. There are some intractable risks stemming from science, which we have to accept as the downside of our intellectual exhilaration and—even more—of its immense and ever more pervasive societal benefits. I believe there is a 50% chance of a really severe setback to civilization by the end of the century. Some people have said that's unduly pessimistic. But I don't think it is: even if you just consider the nuclear threat, I think that's a reasonable expectation.

If we look back over the cold war era, we know we escaped devastation, but at the time of the Cuban missile crisis, as recent reminiscences on its 40th anniversary have revealed, we were really on a hair trigger, and it was only through the responsibility and good sense of Kennedy and Khrushchev and their advisers that we avoided catastrophe. The same was true on one or two other occasions during the cold war. And it could indeed have been a catastrophe: the nuclear arsenals of the superpowers held the explosive equivalent of one of the US Air Force's 'daisy cutter' bombs for each inhabitant of the United States and Europe. Utter devastation would have resulted had this all gone off.

The threat obviously abated at the end of the cold war, but looking a century ahead, we can't expect the present political alignments to stay constant. In the last century the Soviet Union appeared and disappeared, and there were two world wars. Since nuclear weapons can't be disinvented, there's quite a high expectation that within the next hundred years there will be another standoff as fearsome as the cold war era, perhaps involving more participants than just two and therefore more unstable. Even if you consider the nuclear threat alone, there is a severe chance, perhaps a 50% chance, of some catastrophic setback to civilization.

There are other novel threats as well. Not only will technical change be faster in this century than before, but it will take place in more dimensions. Up to now, one of the fixed features over all recorded history has been human nature and human physique; human beings themselves haven't changed, even though our environment and technology have. In this century, human beings are going to change because of genetic engineering, because of targeted drugs, perhaps even because of implants in the brain to increase mental capacity. Much that now seems science fiction might, a century ahead, become science fact. Fundamental changes like that—plus the runaway development of biotech, possibly nanotechnology, possibly computers reaching superhuman intelligence—open up exciting prospects, but also all kinds of potential scenarios for societal disruption—even for devastation.

We have to be very circumspect if we are to absorb these rapid developments without severe disruption. In the near term the most severe threat stems from developments in biotechnology and genetic engineering. An authoritative report just last year by the National Academy of Sciences emphasized that large numbers of people would acquire the capability to engineer modified viruses against which existing vaccines might be ineffective, and thereby trigger some sort of epidemic.

The scary realization is that doing this would not require a large terrorist group, still less a rogue state, but just an individual with the same mindset as an arsonist. Such people might start hacking real viruses. If an epidemic ensued, it probably could be contained in this country, but as the SARS episode shows, infections spread rapidly around the world, and if any of these epidemics reached the mega-cities of the third world, they could really take off. I took a public bet—one of the 'long bets' set up by Wired Magazine—that within 20 years one instance of bio error or bio terror would lead to a million fatalities. That's not unduly pessimistic: it would require just one weirdo to release a virus that spread into the third world. That's a scarifying possibility, rendered possible by technology that we know is going to become available. And it's difficult to control, as the kind of equipment that's needed is small-scale. Moreover, it's the kind of technology that is used for all kinds of benign purposes anyway and is bound to exist unless we entirely stop drug development and other biotechnology.

I can foresee what might happen if even one such event occurred in the United States. Suppose there were a mysterious epidemic which maybe didn't kill a huge number of people, but which could be traced to some engineered virus, maliciously or erroneously released. Everyone would realize that if this happened once it could happen again, any time, anywhere. And they'd realize also that the only way to stop a repeat would be intrusive surveillance to keep tabs on everyone who had that expertise. A single event like this would create real pressure to infringe on basic freedoms, and a strong anti-science backlash. That's the near-term danger I worry about most.
