
Evolutionary Biology, Reading University, England; External Professor, Santa Fe Institute, NM


We all develop from a single cell known as a zygote. This zygote divides and becomes two cells, then four, eight and so on. At first, most of the cells are alike, but as this division goes on something wondrous occurs: the cells begin to commit themselves to different fates as eyes or ears, livers or kidneys, brains or blood cells. Eventually they produce a body of immense complexity, making things like supercomputers and space shuttles look like Lego toys. No one knows how they do it. No one is there to tell the cells how to behave: there is no homunculus directing cellular traffic, and no template to work from. It just happens.

If scientists could figure out how cells enact this miracle of development they could produce phenotypes—the outward form of our bodies—at will and from scratch, or at least from a zygote. This, or something close to it, will happen in our lifetimes. When we perfect it—and we are well on the way—we will be able to recreate ourselves, even redefine the nature of our lives.

The problem is that development isn't just a matter of finding a cell and getting it to grow and divide. As our cells differentiate into our various body parts they lose what is known as their 'potency': they forget how to go back to their earlier states where, like the zygote, all fates are possible. When we cut ourselves the skin nearby knows how to grow back, erasing all or most of the damage. But we can only do this on a very local scale. If you cut off your arm it does not grow back. What scientists are learning, bit by bit, is how to reverse cells to their earlier potent states, how to re-program them so they could replace a limb.

Every year brings new discoveries and new successes. Cloning is one of the more visible. At the moment most cloning is a bit of a cheat, achieved by taking special cells from an adult animal's body that still retain some of their potency. But this will change as cell re-programming becomes possible, and the consequences could be alarming. Someone might be able to clone you by collecting a bit of your hair or other cells left behind when you touch something or sit somewhere. Why someone would want to do this—and wait for you to grow up—might limit this in practice but it could happen. You could become your own "father" or at least a very grown up twin.

More in the realm of the everyday, and of real consequence, is that once we can re-program cells, whole areas of science and medicine, including those dealing with aging, injury and disease, will vanish or become unimportant. All of the contentious work on 'embryonic stem cells' that regularly features in debates about whether it is moral to use embryos in research exists solely because scientists want a source of 'totipotent' cells, cells that haven't committed themselves to a fate. Embryos are full of them. Scientists aren't interested in embryonic stem cells per se; they simply want totipotent cells. Once scientists acquire the ability to return cells to their totipotent state, or even to what is known as a 'multi-potent' state—a cell that is not quite yet fully committed—all this stem cell research will become unnecessary. This could happen within a decade.

Schoolchildren learn that some lizards and crabs can re-grow limbs. What they are not taught is that this is because their cells retain multi- or even toti-potency. Because ours don't, car crashes, ski accidents, gunshot wounds and growing old are more than a nuisance. But once we unlock the door of development, we will be able to re-grow our limbs, heal our wounds and much more. Scientists will for once make the science-fiction writers look dull. The limbs (and organs, nerves, and other body parts) that we re-grow will be real, making the bionic limbs Anakin Skywalker gets fitted with after a light-sabre accident seem primitive. This will make transplants obsolete or merely temporary, and heart disease will be treatable by growing new hearts. Nerve damage and paralysis will be reversible, and some brain diseases will become treatable. Some of these things are already happening as scientists inch by inch figure out how to re-program cells.

If these developments are not life-changing enough, they will, in the longer term, usher in a new era in which our minds, the thing that we think of as "us", can become separated from our bodies, or nearly separated anyway. I don't suggest we will be able to transplant our mind to another body, but we will be able to introduce new body parts into existing bodies with a resident mind. With enough such replacements, we will become potentially immortal: like ancient buildings that exist only because over the centuries each of their many stones has been replaced. An intriguing aspect of re-programming cells is that they can be induced to 'forget' how old they are. Aging will become a thing of the past, if you can afford enough new pieces. We will then discover the extent to which our minds arise from perceptions of our bodies and the passage of time. If you give an old person the body of a teenager, do they start to behave and think like one? Who knows, but it will be game-changing to find out.

Biologist, Schumacher College, Devon, UK; Author, How The Leopard Changed Its Spots


I anticipate that biology will go through a transforming revelation/revolution that is like the revolution that happened in physics with the development of quantum mechanics nearly 100 years ago. In biology this will involve the realisation that, to make sense of the complexity of gene activity in development, the prevailing model of local mechanical causality will have to be abandoned. In its place we will have a model of interactive relationships within gene transcription networks that is like the pattern of interactions between words in a language, where ambiguity is essential to the creation of emergent meaning that is sensitive to cultural history and to context. The organism itself is the emergent meaning of the developmental process as embodied form, sensitive both to historical constraint within the genome and to environmental context, as we see in the adaptive creativity of evolution. What contemporary studies have revealed is that genes are not independent units of information that can be transferred between organisms to alter phenotypes, but elements of complex networks that act together in a morphogenetic process that produces coherent form and function as embodied meaning.

A major consequence that I see of this revelation in biology is the realisation that the separation we have made between human creativity as expressed in culture, and natural creativity as expressed in evolution, is mistaken. The two are much more deeply related than we have previously recognised. That humans are embedded in and dependent on nature is something that no-one can deny. This has become dramatically evident recently as our economic system has collapsed, along with the collapse of many crucial ecosystems, due to our failure to integrate human economic activity as a sustainable part of Gaian regulatory networks. We now face dramatic changes in the climate that require equally dramatic changes in our technologies connected with energy generation, farming, travel, and human life-style in general.

On the other hand, the recognition that culture is embedded in nature is not so evident but will, I believe, emerge as part of the biological revelation/revolution. Biologists will realise that all life, from bacteria to humans, involves a creative process that is grounded in natural languages as the foundation of their capacity for self-generation and continuous adaptive transformation. The complexity of the molecular networks regulating gene activity in organisms reveals a structure and a dynamic that has the self-similar characteristics and long-range order of languages. The coherent form of an organism emerges during its development as the embodied meaning of the historical genetic text, created through the process of resolving ambiguity and multiple possibilities of form into appropriate functional order that reflects sensitivity to context. Such use of language in all its manifestations in the arts and the sciences is the essence of cultural creativity.

In conclusion, I see the deep conceptual changes currently happening in biology as a prelude and accompaniment to the changes that are occurring in culture, facilitating these and ushering in a new age of sustainable living on the planet.

Physicist, Université de la Méditerranée (Marseille, France); Author, Quantum Gravity


I grew up expecting that, when adult, I'd travel to Mars. I expected cancer and the flu—and all illnesses—to be cured, robots taking care of labor, the biochemistry of life fully unraveled, the possibility of recreating damaged organs in every hospital, the nations of the Earth living prosperously in peace thanks to new technology, and physics having understood the center of a black hole. I expected great changes that did not come. Let's be open-minded: it is still possible for them to come. It is possible for unexpected advances to change everything—it has happened in the past. But—let's indeed be open-minded—it is also possible that big changes will not come.

Maybe I am biased by my own research field, theoretical physics. I grew up in awe of the physics of the second half of the nineteenth century and the first third of the twentieth. What a marvel! The discovery of the electromagnetic field and waves, understanding thermodynamics with probability, special relativity, quantum mechanics, general relativity... Curved spacetimes, probability waves and black holes. What a feast! The world transforming every ten years under our eyes; reality becoming more subtle, more beautiful. Seeing new worlds. I got into theoretical physics. What big thing has happened in the last 30 years? We are not sure. Perhaps not much. Big dreams, like string theory and multiple universes, but are they credible? We do not know. Perhaps the same passion that charmed me towards the future has driven large chunks of today's research into useless dead-end dreams. Maybe not. Maybe we are really understanding what happened before the Big Bang (a "Big Bounce"?) and what takes place deep down at the Planck scale ("loops"? space and time losing their meaning?). Let's be open to the possibility that we are getting there—let's work hard to get there. But let's also be ready to recognize that perhaps we are not. Perhaps our dreams are just that: dreams. Too often I have heard that somebody is "on the brink of" the great leap ahead. I now tend to fall asleep when I hear "on the brink of". In physics, for 15 years I have heard that we are "on the brink of observing supersymmetry". Please wake me up when we are actually there.

I do not want to sound pessimistic; I just want to put in a word of caution. Maybe what really changes everything is not something that sounds so glamorous. What really changed everything in the past? Here are two examples. Until no more than a couple of centuries ago, 95% of humanity worked the countryside as peasants. That is, humanity needed the labour of 95 out of 100 of its members just to feed the group. This left a happy few free to do everything else. Today only a few percent of humans work the fields; a few are enough to feed everybody else. This means that the large majority of us, including me and most probably you, my reader, are free to do something else, participating in constructing the world we inhabit, a better one, perhaps. What made such a huge change in our lives possible? Mostly, just one technological tool: the tractor. The humble rural machine has changed our life perhaps more than the wheel or electricity. Another example? Hygiene. Our life expectancy has nearly doubled from little more than washing hands and taking showers. Change often comes from where it is not expected. A famous note from IBM top management at the beginning of computer history estimated that "there is no market for more than a few dozen computers in the world".

So, what is my moral? Making predictions is difficult, of course, especially about the future. It is good to dream about big changes, to actively seek them and to be open-minded about them. Otherwise we are stuck here. But let us not get blinded by hopes. The dreams and hopes of humanity sometimes succeed, and sometimes fail big. The century just ended has shown us momentous examples of both. The Edge question asks what will change everything that I will see in my lifetime: what if the answer is "nothing"? Are we able to discern hype from substance? Dolly may be scientifically important, but I tend to see her just as an oddly born twin sister: she hasn't changed much in my life, yet. Will she really?

Psychologist, University of Virginia; Author, The Happiness Hypothesis


The most offensive idea in all of science for the last 40 years is the possibility that behavioral differences between racial and ethnic groups have some genetic basis. Knowing nothing but the long-term offensiveness of this idea, a betting person would have to predict that as we decode the genomes of people around the world, we're going to find deeper differences than most scientists now expect. Expectations, after all, are not based purely on current evidence; they are biased, even if only slightly, by the gut feelings of the researchers, and those gut feelings include disgust toward racism.

A wall has long protected respectable evolutionary inquiry from accusations of aiding and abetting racism. That wall is the belief that genetic change happens at such a glacial pace that there simply was not time, in the 50,000 years since humans spread out from Africa, for selection pressures to have altered the genome in anything but the most trivial way (e.g., changes in skin color and nose shape were adaptive responses to cold climates). Evolutionary psychology has therefore focused on the Pleistocene era — the period from about 1.8 million years ago to the dawn of agriculture — during which our common humanity was forged for the hunter-gatherer lifestyle.

But the writing is on the wall. Russian scientists showed in the 1990s that a strong selection pressure (picking out and breeding only the tamest fox pups in each generation) created what was — in behavior as well as body — essentially a new species in just 30 generations. That would correspond to about 750 years for humans. Humans may never have experienced such a strong selection pressure for such a long period, but they surely experienced many weaker selection pressures that lasted far longer, and for which some heritable personality traits were more adaptive than others. It stands to reason that local populations (not continent-wide "races") adapted to local circumstances by a process known as "co-evolution" in which genes and cultural elements change over time and mutually influence each other. The best documented example of this process is the co-evolution of genetic mutations that maintain the ability to fully digest lactose in adulthood with the cultural innovation of keeping cattle and drinking their milk. This process has happened several times in the last 10,000 years, not to whole "races" but to tribes or larger groups that domesticated cattle.
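The generation arithmetic behind the essay's "750 years" figure can be sanity-checked in a couple of lines. The 25-year human generation time is an assumption of mine (it is what the 750-year figure implies), not a number given in the text:

```python
# Sanity check of the fox-to-human timescale conversion.
# The 25-year human generation time is an assumed round figure;
# the essay itself gives only the 30-generation and 750-year numbers.

FOX_GENERATIONS = 30          # generations of selective breeding in the fox experiment
HUMAN_GENERATION_YEARS = 25   # assumed average human generation time

equivalent_years = FOX_GENERATIONS * HUMAN_GENERATION_YEARS
print(equivalent_years)  # 750
```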

Recent "sweeps" of the genome across human populations show that hundreds of genes have been changing during the last 5-10 millennia in response to local selection pressures. (See papers by Benjamin Voight, Scott Williamson, and Bruce Lahn). No new mental modules can be created from scratch in a few millennia, but slight tweaks to existing mechanisms can happen quickly, and small genetic changes can have big behavioral effects, as with those Russian foxes. We must therefore begin looking beyond the Pleistocene and turn our attention to the Holocene era as well — the last 10,000 years. This was the period after the spread of agriculture during which the pace of genetic change sped up in response to the enormous increase in the variety of ways that humans earned their living, formed larger coalitions, fought wars, and competed for resources and mates.

The protective "wall" is about to come crashing down, and all sorts of uncomfortable claims are going to pour in. Skin color has no moral significance, but traits that led to Darwinian success in one of the many new niches and occupations of Holocene life — traits such as collectivism, clannishness, aggressiveness, docility, or the ability to delay gratification — are often seen as virtues or vices. Virtues are acquired slowly, by practice within a cultural context, but the discovery that there might be ethnically-linked genetic variations in the ease with which people can acquire specific virtues is — and this is my prediction — going to be a "game changing" scientific event. (By "ethnic" I mean any group of people who believe they share common descent, actually do share common descent, and that descent involved at least 500 years of a sustained selection pressure, such as sheep herding, rice farming, exposure to malaria, or a caste-based social order, which favored some heritable behavioral predispositions and not others.)

I believe that the "Bell Curve" wars of the 1990s, over race differences in intelligence, will seem genteel and short-lived compared to the coming arguments over ethnic differences in moralized traits. I predict that this "war" will break out between 2012 and 2017.

There are reasons to hope that we'll ultimately reach a consensus that does not aid and abet racism. I expect that dozens or hundreds of ethnic differences will be found, so that any group — like any person — can be said to have many strengths and a few weaknesses, all of which are context-dependent. Furthermore, these cross-group differences are likely to be small when compared to the enormous variation within ethnic groups and the enormous and obvious effects of cultural learning. But whatever consensus we ultimately reach, the ways in which we now think about genes, groups, evolution and ethnicity will be radically changed by the unstoppable progress of the human genome project.

Philosopher and Cognitive Scientist, University of Edinburgh; Author, Supersizing the Mind


What will change everything is the onset of celebratory species self re-engineering.

The technologies are pouring in, from wearable, implantable, and pervasive computing, to the radical feature blends achieved using gene transfer techniques, to thought-controlled cursors freeing victims of locked-in syndrome, to funkier prosthetic legs able to win track races, and on to the humble but transformative iPhone.

But what really matters is the way we are, as a result of this tidal wave of self-re-engineering opportunity, just starting to know ourselves: not as firmly bounded biological organisms but as delightfully reconfigurable nodes in a flux of information, communication, and action. As we learn to celebrate our own potential, we will embrace ever-more-dramatic variations in bodily form and in our effective cognitive profiles. The humans of the next century will be vastly more heterogeneous, more varied along physical and cognitive dimensions, than those of the past, as we deliberately engineer a new Cambrian explosion of body and mind.

Ophthalmologist and Neurobiologist, University of California, Davis


In the 1960s movie "The Graduate," a young Dustin Hoffman is advised to go into plastics, presumably because that will be the next big thing.

Today, one might well advise the young person planning to pursue a degree in medicine or the biological sciences to go into brain plasticity. This refers to the fact that neurons are malleable throughout life, capable of being shaped by external experiences and endogenous events.

Recent imaging studies of single neurons have revealed that specialized parts of nerve cells, termed dendritic spines, are constantly undergoing a process of rapid expansion and retraction. While brain cells are certainly capable of structural and functional changes throughout life, an extensive scientific literature has shown that plasticity in the nervous system is greatest early in development, during the so-called critical periods. This accounts for the marvelous ability of children to rapidly master various skills at different developmental stages. Toddlers have no difficulty in learning two, three or even more languages, and most adolescents can learn to ski black-diamond slopes well before their middle-aged parents. The critical periods underlying such learning reflect the high degree of plasticity exhibited by specific brain circuits during the first two decades of life.

In recent years, developmental neurobiologists have made considerable progress in unraveling the myriad factors underlying the plasticity of neurons in the developing brain. For instance, a number of studies have now demonstrated that it is the formation of inhibitory circuits in the cortex that causes decreased plasticity in the maturing visual system. While no single event can entirely explain brain plasticity, progress is being attained at a rapid pace, and I am convinced that in my lifetime we will be able to control the level of plasticity exhibited by mature neurons.

Several laboratories have already discovered ways to manipulate the brain so as to make mature neurons as plastic as they are during early development. Such studies have been done using genetically engineered mice with either a deletion or an over-expression of specific genes known to control plasticity during normal development. Moreover, drug treatments have now been found that mimic the changes observed in these mutant mice.

In essence this means that the high degree of brain plasticity normally evident only during early development can now be made to occur throughout the life span. This is undoubtedly a game changer in the brain sciences. Imagine being able to restore the plasticity of neurons in the language centers of your brain, enabling you to learn any and all languages effortlessly and at a rapid pace. The restoration of neuronal plasticity would also have important clinical implications since unlike in the mature brain, connections in the developing brain are capable of sprouting (i.e. new growth). For this reason, this technology could provide a powerful means to combat loss of neuronal connections, including those resulting from brain injury as well as various disease states.

I am optimistic that these treatments will be forthcoming in my lifetime. Indeed, a research group in Finland is about to begin the first clinical study to assess the ability of drug treatments to restore plasticity to the visual system of adult humans. If successful, this would provide a means for treating amblyopia in adults, a prevalent disorder of the visual system which today can be treated only in young children, whose visual cortex is still plastic.

Still, a number of factors will need to be worked out before the restoration of neuronal plasticity becomes a viable procedure. For one thing, it will be necessary to devise a means of targeting specific groups of neurons, those controlling the function for which enhanced plasticity is sought. Many people might wish to have a brain capable of effortlessly learning foreign languages, but few would be pleased if this were accompanied by a vocabulary limited to babbling sounds, not unlike those of my granddaughter, who is beginning to learn to speak English and Ukrainian.

Professor of Geography and Earth & Space Sciences, UCLA


In the classic English fable Jack and the Beanstalk, the intrepid protagonist risks being devoured on sight in order to repeatedly raid the home of a flesh-eating giant for gold. All goes well until the snoring giant awakens and gives furious chase. But Jack beats him back down the magic beanstalk and chops it down with an axe, toppling the descending cannibal to its death. Jack thus wins back his life plus substantial economic profit from his spoils.

Industrialized society has also reaped enormous economic and social benefit from fossil fuels, so far without rousing any giants. But as geoscientists, my colleagues and I devote much of our time to worrying about whether they might be slumbering in the Earth's climate system.

We used to think climate worked like a dial — slow to heat up and slow to cool down — but we've since learned it can also act like a switch. Twenty years ago anyone who hypothesized an abrupt, show-stopping event — a centuries-long plunge in air temperature, say, or the sudden die-off of forests — would have been laughed off. But today, an immense body of empirical and theoretical research tells us that sudden awakenings are dismayingly common in climate behavior.

Ancient records preserved in tree rings, sediments, glacial ice layers, cave stalactites, and other natural archives tell us that for much of the past 10,000 years — the time when our modern agricultural society evolved — our climate was remarkably stable. Before then it was capable of wild fluctuations, even leaping eighteen degrees Fahrenheit in ten years. That's as if the average temperature in Minneapolis warmed to that of San Diego in a single decade.

Even during the relative calm of recent centuries, we find sudden lurches that exceed anything in modern memory. Tree rings tell us that in the past 1,000 years, the western United States has seen three droughts at least as bad as the Dust Bowl but lasting three to seven times longer. Two of them may have helped collapse past societies of the Anasazi and Fremont people.

The mechanisms behind such lurches are complex but decipherable. Many are related to shifting ocean currents that slosh around pools of warm or cool seawater in quasi-predictable ways. The El Niño/La Niña phenomenon, which redirects rainfall patterns around the globe, is one well-known example. Another major player is the Atlantic thermohaline circulation (THC), a massive density-driven "heat conveyor belt" that carries tropical warmth northwards via the Gulf Stream. The THC is what gifts Europe with relative balminess despite its being as far north as some of Canada's best polar bear habitat.

If the THC were to weaken or halt, the eastern U.S. and Europe would become something like Alaska. Though over-sensationalized by the film The Day After Tomorrow and by a scary 2003 Pentagon document imagining famines, refugees, and wars, a THC shutdown nonetheless remains an unlikely but plausible threat. It is the original sleeping giant of my field.

Unfortunately, we are discovering more giants that are probably lighter sleepers than the THC. Seven others — all of them potential game-changers — are now under scrutiny: (1) the disappearance of summer sea ice over the Arctic Ocean, (2) increased melting and glacier flow of the Greenland ice sheet, (3) "unsticking" of the frozen West Antarctic Ice Sheet from its bed, (4) rapid die-back of Amazon forests, (5) disruption of the Indian Monsoon, (6) release of methane, an even more potent greenhouse gas than carbon dioxide, from thawing frozen soils, and (7) a shift to a permanent El Niño-like state. As with the THC, should any of these occur there would be profound ramifications — for our food production, for the extinction and expansion of species, and for the inundation of coastal cities.

To illustrate, consider the Greenland and Antarctic ice sheets. The water stored in them is enormous, enough to drown the planet under more than 200 feet of water. That will not happen anytime soon, but even a tiny reduction in their extent — say, five percent — would significantly alter our coastline. Global sea level is already rising about one-third of a centimeter every year and will reach at least 18 to 60 centimeters higher just one long human lifetime from now, if the speeds at which glaciers are currently flowing from land to ocean remain constant. But at least two warming-induced triggers might speed them up: percolation of lubricating meltwater down to the glaciers' beds, and the disintegration of floating ice shelves that presently pin glaciers onto the continent. If these giants awaken, our best current guess is 80 to 200 centimeters of sea level rise. That's a lot of water. Most of Miami would either be surrounded by dikes or underwater.
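The sea-level numbers in the paragraph above can be checked with a rough linear extrapolation. The rate ("about one-third of a centimeter every year") is from the text; the 90-year "long human lifetime" is my assumption:

```python
# Rough check of the quoted sea-level figures. The rate comes from
# the text; the 90-year lifetime is an assumed round figure.

rate_cm_per_year = 1.0 / 3.0   # current global sea-level rise
lifetime_years = 90            # one long human lifetime (assumed)

linear_rise_cm = rate_cm_per_year * lifetime_years
print(round(linear_rise_cm))   # 30, inside the quoted 18-60 cm range
```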

Unfortunately, the presence of sleeping giants makes the steady, predictable growth of anthropogenic greenhouse warming more dangerous, not less. Alarm clocks may be set to go off, but we don't know what their temperature settings are. The science is too new, and besides, we'll never know for sure until it happens. While some economists predicted that rising credit-default swaps and other highly leveraged financial products might eventually bring about an economic collapse, who could have foreseen the exact timing and magnitude of late 2008? As with most threshold phenomena, it is extremely difficult to know just how much poking is needed to disturb sleeping giants. Forced to guess, I'd mutter something about decades, or centuries, or never. On the other hand, one might be stirring already: in September 2007, and then again in 2008, for the first time in memory nearly 40% of the late-summer sea ice in the Arctic Ocean abruptly disappeared.

Unlike Jack's, the eyes of scientists are slow to adjust to the gloom. But we are beginning to see some outlines and, unfortunately, to discern not one but many sleeping forms. What is certain is that our inexorable loading of the atmosphere with heat-trapping greenhouse gases increases the likelihood that one or more of them will wake up.

Psychologist, UC, Berkeley; Author, The Scientist in the Crib


The world is transforming from an agricultural and manufacturing economy to an information economy. This means that people will have to learn more and more. The best way to make it happen is to extend the period when we learn the most — childhood. Our new scientific understanding of neural plasticity and gene regulation, along with the global spread of schooling, will make that increasingly possible. We may remain children forever — or at least for much longer.

Humans already have a longer period of protected immaturity — a longer childhood — than any other species. Across species, a long childhood is correlated with an evolutionary strategy that depends on flexibility, intelligence and learning. There is a developmental division of labor. Children get to learn freely about their particular environment without worrying about their own survival — caregivers look after that. Adults use what they learn as children to mate, predate, and generally succeed as grown-ups in that environment. Children are the R & D department of the human species. We grown-ups are production and marketing. We start out as brilliantly flexible but helpless and dependent babies, great at learning everything but terrible at doing just about anything. We end up as much less flexible but much more efficient and effective adults, not so good at learning but terrific at planning and acting.

These changes reflect brain changes. Young brains are more connected, more flexible and more plastic, but less efficient. As we get older, and experience more, our brains prune out the less-used connections and strengthen the connections that work. Recent developments in neuroscience show that this early plasticity can be maintained and even reopened in adulthood. And we've already invented the most unheralded but most powerful brain-altering technology in history — school.

For most of human history babies and toddlers used their spectacular, freewheeling, unconstrained learning abilities to understand fundamental facts about the objects, people and language around them — the human core curriculum. At about age 6, children also began to be apprentices. Through a gradual process of imitation, guidance and practice they began to master the particular adult skills of their particular culture — from hunting to cooking to navigation to childrearing itself. Around adolescence, motivational changes associated with puberty drove children to leave the protected cocoon and act independently. And by that time their long apprenticeship had given them a new suite of executive abilities — abilities for efficient action, planning, control and inhibition, governed by the development of prefrontal areas of the brain. By adolescence children wanted to end their helpless status and act independently, and they had the tools to do so effectively.

School, a very recent human invention, completely alters this program. Schooling replaces apprenticeship. School lets us all continue to be brilliant but helpless babies. It lets us learn a wide variety of information flexibly, and for its own sake, without any immediate payoff. School assumes that learning is more important than doing, and that learning how to learn is most important of all. But school is also an extension of the period of infant dependence — since we don't actually do anything useful in school, other people need to take care of us — all the way up to a Ph.D. School doesn't include the gradual control and mastery of specific adult skills that we once experienced in apprenticeship. Universal and extended schooling means that the period of flexible learning and dependence can continue until we are in our thirties, while independent active mastery is increasingly delayed.

Schooling is spreading inexorably throughout the globe. A hundred years ago hardly anyone went to school; even now few people are schooled past adolescence. A hundred years from now we can expect that most people will still be learning into their thirties and beyond. Moreover, new neurological and genetic developments will give us new ways to keep the window of plasticity open. And the spread of the information economy will make genetic and neurological interventions, as well as educational and behavioral interventions, more and more attractive.

These accelerated changes have radical consequences. Schooling alone has already had a revolutionary effect on human learning. Absolute IQs have increased at an astonishing and accelerating rate, a phenomenon known as the "Flynn effect". Extending the period of immaturity indeed makes us much smarter and far more knowledgeable. Neurological and genetic techniques could accelerate this process even further. We all tend to assume that extending this period of flexibility and openness is a good thing — who would argue against making people smarter?

But there may be an intrinsic trade-off between flexibility and effectiveness, between the openness that we require for learning and the focus that we need to act. Child-like brains are great for learning, but not so good for effective decision-making or productive action. There is some evidence that adolescents even now have increasing difficulty making decisions and acting independently, and pathologies of adolescent action like impulsivity and anxiety are at historic highs. Fundamental grown-up human skills we once mastered through apprenticeship, like cooking and caregiving itself, just can't be acquired through schooling. (Think of all those neurotic new parents who have never taken care of a child and try to make up for it with parenting books.) When we are all babies forever, who will be the parents? When we're all children, who will be the grown-ups?

John D. Barrow
Physicist, Director, Millennium Mathematics Project, Cambridge; Author, 100 Essential Things You Didn't Know You Didn't Know


Lawrence M. Krauss
Physicist, Director, Origins Initiative, Arizona State University; Author, Hiding in the Mirror


"With Nuclear Weapons, everything has changed, save our way of thinking."  So said Albert Einstein, sixty three years ago, following the Hiroshima and Nagasaki bombings at the end of World War II.  Having been forced to choose a single game changer, I have turned away from the fascinating scientific developments I might like to see, and will instead focus on the one game changer that I will hopefully never directly witness, but nevertheless expect will occur during my lifetime:  the use of nuclear weapons against a civilian population.  Whether used by one government against the population of another, or by a terrorist group, the detonation of even a small nuclear explosive, similar in size, for example, to the one that destroyed hiroshima, would produce an impact on the economies, politics, and lifestyles of the first world in a way that would make the impact of 9/11 seem trivial.   I believe the danger of nuclear weapons use remains one of the biggest dangers of this century.  It is remarkable that we have gone over 60 years without their use, but the clock is ticking.  I fear that Einstein's admonition remains just as true today as it did then, and I that we are unlikely to go another half century with impunity, at least without confronting the need for a global program of disarmament that goes far beyond the present current Nuclear Non-Proliferation, and strategic arms treaties.

Following forty years of Mutually Assured Destruction, with the two superpowers like two scorpions in a bottle, each held at bay by the certainty of the destruction that would follow the first whiff of nuclear aggression by the other, we have become complacent. Two generations have come to maturity in a world where nuclear weapons have not been used. The Nuclear Non-Proliferation Treaty has been largely ignored, and not just by nascent nuclear states like North Korea, India and Pakistan, or pre-nuclear wannabes like Iran. Together the United States and Russia possess 26,000 of the world's 27,000 known nuclear warheads. This in spite of the NPT's strict requirement that these countries significantly reduce their arsenals. Each country keeps perhaps 1,000 warheads on hair-trigger alert. This in spite of the fact that there is no strategic utility, at the current time, in possessing so many nuclear weapons on alert.

Ultimately, what so concerned Einstein, and is of equal concern today, is the fact that first use of nuclear weapons cannot be justified on moral or strategic grounds. Nevertheless, it may surprise some people to learn that the United States has no strict anti-first-use policy. In fact, in its 2002 Nuclear Posture Review, the U.S. declared that nuclear weapons "provide credible military options to deter a wide range of threats" including "surprising military developments."  

And while we spend $10 billion a year on flawed ballistic missile defense systems against currently non-existent threats, the slow pace of disarmament means that thousands of nuclear weapons remain in unstable regions, where they could, in principle, be accessed by well-organized and well-financed terrorist groups. Meanwhile, we have spent only a small fraction of that sum outfitting ports and airports to detect nuclear devices smuggled into this country in containers.

Will it take a nuclear detonation used against a civilian population to stir a change in thinking?  The havoc wreaked on what we now call the civilized world, no matter where a nuclear confrontation takes place, would be orders of magnitude greater than that which we have experienced since the Second World War.   Moreover, as recent calculations have demonstrated, even a limited nuclear exchange between, say India and Pakistan, could have a significant global impact for almost a decade on world climates and growing seasons.  

I sincerely hope that the global realization that large nuclear stockpiles are a threat to everyone on the planet — and the change it would force in the blind business-as-usual mentality permeating global strategic planning — does not come as the result of a nuclear tragedy. But physics has taught me that the world is the way it is whether we like it or not. And my gut tells me that to continue to ignore the likelihood that a game changer exceeding our worst nightmares will occur in this century is merely one way to encourage that possibility.
