DARK MATERIAL

Martin Rees [6.12.06]

Nuclear scientist Joseph Rotblat campaigned against the atom bomb he had helped unleash. In the Rotblat Memorial Lecture, delivered recently at the Hay Literary Festival, Lord (Martin) Rees wonders whether it's time for today's cyber scientists to heed Rotblat's legacy

(MARTIN REES:) Scientists have had a bad literary press: Dr Frankenstein, Dr Moreau, and especially Dr Strangelove. This lecture commemorates a man who was the utter antithesis of Strangelove.

Jo Rotblat was a nuclear scientist. He helped to make the first atomic bomb. But for decades thereafter, he campaigned to control the powers he'd helped unleash. Until the last few months of his long life, he pursued this aim with the dynamism of a man half his age, inspiring others to join the cause. Today, I want to talk about the threats and challenges of science in the 21st century and what younger scientists can learn from Jo's example.

A year ago, Robert McNamara, age 88, spoke here in this tent — his confessional movie 'The Fog of War' had just appeared. Jo Rotblat, age 96, was due to be on the platform with him. This might have seemed an incongruous pairing. Back in the 1960s, McNamara was American Secretary of Defense — in charge of the nuclear arsenal. And Rotblat was an antinuclear campaigner. But in old age they converged — McNamara himself came to espouse the aim of eliminating nuclear weapons completely.

Sadly, Jo Rotblat wasn't well enough to come here last summer. He died later that year — after a long life scarred by the turmoils of the last century. Jo was born in Poland in 1908. His family suffered great hardship in World War I. He was exceptionally intelligent and determined, and managed to become a nuclear physicist. After the invasion of Poland, he came as a refugee to England to work with James Chadwick at Liverpool University — his wife became a victim of the Nazis.

He then went to Los Alamos as part of the British contingent involved in the Manhattan project to make the first atom bomb.

In his mind there was only one justification for the bomb project: to ensure that Hitler didn't get one first and hold us to ransom. As soon as this ceased to be a credible risk, Jo left Los Alamos — the only scientist to do so. Indeed, he recalls having been disillusioned by hearing General Groves, head of the project, saying as early as March 1944 that the main purpose of the bomb was "to subdue the Russians".

He returned to England; became a professor of medical physics, an expert on the effects of radiation; and a compelling and outspoken campaigner. In 1955, Jo met Bertrand Russell, and encouraged him to prepare a manifesto stressing the extreme gravity of the nuclear peril. Jo got Einstein to sign too — it was Einstein's last public act; he died a week later. This 'Russell-Einstein manifesto' was then signed by ten other eminent scientists — all Nobel Prize winners. (Jo was diffident about signing, but Russell urged that he should, as he might one day earn one himself.) The authors claimed to be "speaking on this occasion not as members of this or that nation, continent or creed, but as human beings, members of the species Man, whose continued existence is in doubt". This manifesto led to the initiation of the Pugwash Conferences — so called after the village in Nova Scotia where the inaugural conference was held; in the decades since, there have been 300 meetings; Jo attended almost all of them.

When the achievements of these Conferences were recognised by the 1995 Nobel Peace Prize, half the award went to the Pugwash organisation, and half to Rotblat personally — as their 'prime mover' and untiring inspiration. Particularly during the 1960s, the Pugwash Conferences offered crucial 'back door' contact between scientists from the US and the Soviet Union when there were few formal channels — these contacts eased the path for the partial test ban treaty of 1963, and the later ABM treaty.

In the two World Wars and their aftermath, 187 million perished by war, massacre, persecution or policy-induced famine. But during the Cold War we were at still greater hazard: a nuclear war between the superpowers could have killed a billion people, and devastated the fabric of civilisation. The superpowers could have stumbled towards armageddon through muddle and miscalculation.

We're now very risk-averse. We fret about statistically tiny risks — carcinogens in food, one in a million chance of being killed in train crashes, and so forth. It's hard to contemplate just how great the risks of nuclear catastrophe once were. The Cuban Missile stand-off in 1962 was the most dangerous moment in history, and McNamara was then the US Secretary of Defense. He later wrote that "we came within a hairbreadth of nuclear war without realising it. It's no credit to us that we escaped — Khrushchev and Kennedy were lucky as well as wise." The prevailing nuclear doctrine was deterrence via the threat of 'mutual assured destruction' (with the apt acronym MAD). Each side put the 'worst case' construction on whatever the other did, overestimated the threat, and over-reacted. The net result was an arms race that made both sides less secure.

It wasn't until he'd long retired that McNamara spoke frankly about the events in which he'd been so deeply implicated. He noted that "virtually every technical innovation in the arms race came from the US. But it was always quickly matched by the other side". The decisions that ratcheted up the arms race were political, but scientists who develop new weapons must themselves share the blame.

Another who spoke out after retirement was Solly Zuckerman, the UK government's longtime chief scientific advisor. He said "ideas for new weapon systems derived in the first place, not from the military, but from scientists and technologists merely doing what they saw to be their job.... the momentum of the arms race is fueled by technicians in governmental laboratories and in the armaments industries".

Anyone in weapons labs whose skills rose above routine competence, or who displayed any originality, added their iota to this menacing trend. In Zuckerman's view the weapons scientists were "the alchemists of our times, working in secret ... , casting spells which embrace us all".

The great physicist Hans Bethe also came round to this view. He was the chief theorist at Los Alamos, and worked on the H-bomb, but by 1995 his aversion to military research had hardened, and he urged scientists to "desist from work creating, developing, improving and manufacturing nuclear weapons and other weapons of potential mass destruction". Some of Bethe's concerned colleagues started a journal called the Bulletin of the Atomic Scientists. The 'logo' on its cover is a clock, the closeness of whose hands to midnight indicates the Editor's judgment on how precarious the world situation is. Every few years the minute hand is shifted, either forwards or backwards.

When the Cold War ended, the nuclear threat plainly eased; the Bulletin's clock was put back to 17 minutes to midnight. There was thereafter far less chance of ten thousand bombs devastating our civilisation. But this catastrophic threat could be merely in temporary abeyance. In the last century the Soviet Union rose and fell, and there were two world wars. In the next hundred years, geopolitical realignments could be just as drastic, leading to a nuclear standoff between new superpowers, which might be handled less well than the Cuba crisis was. I think you'd have to be optimistic to rate the probability as much below 50 percent. But there's now more chance than ever of a few nuclear weapons going off in a localised conflict. We are confronted by the proliferation of nuclear weapons (in North Korea and Iran, for instance). Al Qaeda-style terrorists might some day acquire a nuclear weapon. If they did, they would willingly detonate it in a city centre, killing tens of thousands along with themselves; and millions around the world would acclaim them as heroes.

I've focused so far on the nuclear threat. It's still with us — it always will be. But it's based on basic science that dates from the 1930s, when Jo Rotblat was a young researcher.

But let's now look forward. What are the promises and threats from 21st century science? My main message is that science offers immense hope, and exciting prospects. But it may have a downside too. The threats may not take the form of a sudden world-wide catastrophe — the doomsday clock is not such a good metaphor — but they are, in aggregate, as worrying and challenging. Yet there's a real upside: indeed there are grounds for being a techno-optimist.

The technologies that fuel economic growth today — IT, miniaturisation and biotech — are environmentally and socially benign. They're sparing of energy, and of raw materials. They boost quality of life in the developing as well as the developed world, and have much further to go. That's good news. Not only is science advancing faster than ever, it's causing a new dimension of change. Whatever else may have changed over preceding centuries, humans haven't — not for thousands of years. But in this century targeted drugs to enhance memory or change mood, genetic modification, and perhaps silicon implants into the brain, may alter human beings themselves — their minds and attitudes, even their physique. That's something qualitatively new in our history. It means that our species could be transformed, not over millions of years of Darwinian selection, but within a few centuries. And it raises all kinds of ethical conundrums. And the work of Ray Kurzweil and others like him reminds us that we should keep our minds open, or at least ajar, to things that today seem beyond the fringe of science fiction.

But we can plausibly predict some disquieting trends. Some are environmental: rising populations, especially in the megacities of the developing world, increasing energy consumption, and so forth. Indeed, collective human actions are transforming, even ravaging, the entire biosphere — perhaps irreversibly — through global warming and loss of biodiversity. We've entered a new geological era, the Anthropocene. We don't fully understand the consequences of our many-faceted assault on the interwoven fabric of atmosphere, water, land and life. We are collectively endangering our planet.

But there's a growing danger from individuals too. Technology empowers each of us ever more and interconnects us more closely. So even a single person will have the capability to cause massive disruption through error or terror.

An organised network would not be required: just a fanatic, or a weirdo with the mindset of those who now design computer viruses — the mindset of an arsonist. There are such people, and some will be scientifically proficient. We're kidding ourselves if we think that technical education leads necessarily to balanced rationality. It can be combined with fanaticism — not just traditional fundamentalism (Christian in the US, Muslim in the East) but new-age irrationalities. The Raelians and the Heaven's Gate cult are disquieting portents: their adherents claim to be 'scientific' but have a precarious foothold in reality. The techniques and expertise for bio or cyber attacks will be accessible to millions — they don't require large special-purpose facilities as nuclear weapons do. It would be hard to eliminate the risk, even with very intrusive surveillance.

The impact of even a local incident — 'bio' or 'cyber' — would be hyped and globalised by the media, causing wide disruption — psychic and economic. Everyone would be mindful that the same thing could happen again, anywhere, anytime.

There will always be disaffected loners in every country, and the 'leverage' each can exert is ever-growing. The global village will have its global village idiots.

[I recall a talk here by Francis Fukuyama, about his book Our Posthuman Future. He argued that habitual use of mood-altering medications would narrow the range of humanity. He cited the use of Prozac to counter depression, and of Ritalin to damp down hyperactivity in high-spirited but otherwise healthy children. He feared that such drugs would become universally used to tone down extremes of behaviour and mood, and that our species would degenerate into pallid, acquiescent zombies.

But my worry is the opposite of Fukuyama's. 'Human nature' encompasses a rich variety of personality types, but these include those who are drawn towards the disaffected fringe. The destabilizing and destructive influence of just a few such people will be ever more devastating as their technical powers and expertise grow, and as the world we share becomes more interconnected.

Can civilisation be safeguarded, without humanity having to sacrifice its diversity and individualism? This is a stark question, but I think it's a serious one.]

Some commentators on biotech, robotics and nanotech worry that once the genie is out of the bottle, the outcome may be impossible to control. They urge caution in 'pushing the envelope' in some areas of science, arguing that we should guard against such nightmares by putting the brakes on the science they're based on.

But that's naive. We can't reap the benefits of science without accepting some risks — the best we can do is minimise the risks. The typical scientific discovery has many applications — some benign, others less so. Even nuclear physics has its upside — its medical uses have saved more people than nuclear weapons actually killed.

The uses of academic research generally can't be foreseen: Rutherford famously said, in the mid-thirties, that nuclear energy was 'moonshine'; the inventors of lasers didn't foresee that an early application of their work would be to eye surgery; the discoverer of x-rays was not searching for ways to see through flesh.

21st century science will present new threats more diverse and more intractable than nuclear weapons did. They'll pose ethical dilemmas. There surely will be more and more 'doors that we could open but which are best left closed' — for ethical or prudential reasons.

A blanket prohibition on all risky experiments and innovations would paralyse science and deny us all its benefits. In the early days of steam, hundreds of people died horribly when poorly designed boilers exploded. Most surgical procedures, even if now routine, were risky and often fatal when they were being pioneered.

But we do need to be more cautious today. The worst conceivable consequences of a boiler explosion are limited and localised. In contrast, some 21st century innovations or experiments, if they went wrong, could have global effects — we confront what some people call 'existential risks'.

Scientists sometimes abide by self-imposed moratoria on specific lines of research. A precedent for this was the so-called 'Asilomar declaration' of 1975, whereby prominent molecular biologists refrained from some experiments involving the then-new technique of gene-splicing. There are now even more reasons for exercising restraint — ethics, risk of epidemics, and the 'yuk' factor. Just this week there have been moves, again in California, to control the still more powerful techniques of 'synthetic biology'.

But a voluntary moratorium will be harder to achieve today: the academic community is far larger, and competition (enhanced by commercial pressures) is more intense. To be effective, the consensus must be worldwide. If one country alone imposed regulations, the most dynamic researchers and enterprising companies would migrate to another that was more sympathetic or permissive. This is happening already in stem cell research.

How can we prioritise and regulate, to maximise the chance that applications are benign, and restrain their 'dark side'? How can the best science be fed into the political process?

We can't do everything in science. There's an ever-widening gap between what can be done and what can be afforded.

At the moment, scientific effort is deployed sub-optimally. This seems so whether we judge in purely intellectual terms, or take account of likely benefit to human welfare. Some subjects have had the 'inside track' and gained disproportionate resources. Others, such as environmental research, renewable energy sources, biodiversity studies and so forth, deserve more effort. Within medical research the focus is disproportionately on cancer and cardiovascular studies, the ailments that loom largest in prosperous countries, rather than on the infections endemic in the tropics.

Choices on how science is applied shouldn't be made just by scientists. That's why everyone needs a 'feel' for science and a realistic attitude to risk — otherwise public debate won't get beyond sloganising. Jo Rotblat favoured a 'Hippocratic oath' whereby scientists would pledge themselves to use their talents to human benefit. Whether or not such an oath would have substance, scientists surely have a special responsibility. It's their ideas that form the basis of new technology.

We feel there is something lacking in parents who don't care what happens to their children in adulthood, even though it's generally beyond their control. Likewise, scientists shouldn't be indifferent to the fruits of their ideas — their intellectual creations. They should plainly forgo experiments that are themselves risky or unethical. More than that, they should try to foster benign spin-offs, but resist, so far as they can, dangerous or threatening applications. They should raise public consciousness of hazards to environment or to health.

The decisions that we make, individually and collectively, will determine whether the outcomes of 21st century science are benign or devastating. Some will throw up their hands and say that anything that is scientifically and technically possible will be done — somewhere, sometime — despite ethical and prudential objections, and whatever the laws say; that science is advancing so fast, and is so much influenced by commercial and political pressures, that nothing we can do makes any difference. Whether this idea is true or false, it's an exceedingly dangerous one, because it engenders despairing pessimism, and demotivates efforts to secure a safer and fairer world.

The future will best be safeguarded — and science has the best chance of being applied optimally — through the efforts of people who are less fatalistic. And here I am optimistic. The burgeoning technologies of IT, miniaturisation and biotech are environmentally and socially benign. The challenge of global warming should stimulate a whole raft of manifestly benign innovations — for conserving energy, and generating it by novel 'clean' means (biofuels, innovative renewables, carbon sequestration, and nuclear fusion). Other global challenges include controlling infectious diseases, and preserving biodiversity.

These challenging scientific goals should appeal to the idealistic young. They deserve a priority and commitment from governments, akin to that accorded to the Manhattan project or the Apollo moon landing.

I've spoken as a scientist. But my special subject is cosmology — the study of our environment in the widest conceivable sense. I can assure you, from having observed my colleagues, that a preoccupation with near-infinite spaces doesn't make cosmologists specially 'philosophical' in coping with everyday life. They're not detached from the problems confronting us on the ground, today and tomorrow. For me, a 'cosmic perspective' actually strengthens my concerns about what happens here and now: I'll conclude by explaining why.

The stupendous timespans of the evolutionary past are now part of common culture. We and the biosphere are the outcome of more than four billion years of evolution, but most people still somehow think we humans are necessarily the culmination of the evolutionary tree. That's not so. Our Sun is less than half way through its life. We're maybe only at the half-way stage. Any creatures witnessing the Sun's demise 6 billion years hence won't be human — they'll be as different from us as we are from bacteria.

But, even in this 'hyper-extended' timeline — extending billions of years into the future, as well as into the past — this century may be a defining moment. The 21st century is the first in our planet's history in which one species has Earth's future in its hands, and could jeopardise life's immense potential.

I'll leave you with a cosmic vignette. We're all familiar with pictures of the Earth seen from space — its fragile biosphere contrasting with the sterile moonscape where the astronauts left their footprints. Suppose some aliens had been watching our planet for its entire history: what would they have seen? Over nearly all that immense time, 4.5 billion years, Earth's appearance would have altered very gradually. The continents drifted; the ice cover waxed and waned; successive species emerged, evolved and became extinct.

But in just a tiny sliver of the Earth's history — the last one millionth part, a few thousand years — the patterns of vegetation altered much faster than before. This signaled the start of agriculture. The pace of change accelerated as human populations rose.

But then there were other changes, even more abrupt. Within fifty years — little more than one hundredth of a millionth of the Earth's age — the carbon dioxide in the atmosphere began to rise anomalously fast. The planet became an intense emitter of radio waves (the total output from all TV, cellphone, and radar transmissions).
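
These fractions are easy to check with a rough back-of-envelope calculation, taking the figure of 4.5 billion years for the Earth's age quoted above:

\[
\frac{4.5\times 10^{9}\ \text{years}}{10^{6}} \approx 4{,}500\ \text{years} \qquad \text{(the 'few thousand years' since agriculture began)}
\]
\[
\frac{50\ \text{years}}{4.5\times 10^{9}\ \text{years}} \approx 1.1\times 10^{-8} \qquad \text{(just over one hundredth of a millionth)}
\]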

And something else unprecedented happened: small projectiles lifted from the planet's surface and escaped the biosphere completely. Some were propelled into orbits around the Earth; some journeyed to the Moon and planets.

If they understood astrophysics, the aliens could confidently predict that the biosphere would face doom in a few billion years when the Sun flares up and dies. But could they have predicted this unprecedented spike less than half way through the Earth's life — these human-induced alterations occupying, overall, less than a millionth of the elapsed lifetime and seemingly occurring with runaway speed?

If they continued to keep watch, what might these hypothetical aliens witness in the next hundred years? Will a final spasm be followed by silence? Or will the planet itself stabilise? And will some of the objects launched from the Earth spawn new oases of life elsewhere?

The answer depends on us. The challenges of the 21st century are more complex and intractable than those of the nuclear age. Wise choices will require idealistic and effective campaigners — not just physicists, but biologists, computer experts, and environmentalists as well: latter-day counterparts of Jo Rotblat, inspired by his vision and building on his legacy.