
NICHOLAS A. CHRISTAKIS
Physician and social scientist, Harvard

THE ANTHROPOSPHERE

We will create life from inanimate compounds, and we will find life on Mars or in space. But the life that more immediately interests me lies between these extremes, in the middle range we all inhabit between our genes and our stars. It is the thin bleeding line within the thin blue line, the anthroposphere within the biosphere, the part of the material world in which we live out our lives. It is us.

And we are rapidly and inexorably changing. I do not mean that our numbers are exploding — a topic that has been attracting attention since Malthus. I mean a very modern and massive set of changes in the composition of the human population.

The global population stood at one million in 10,000 BC, 50 million in 1000 BC, and 310 million in 1000 AD. It stood at about one billion in 1800, 1.65 billion in 1900, and 6.0 billion in 2000. Analysis of these macro-historical trends in human population usually focuses on this population growth and on the "demographic transition" underlying it.

During the first stage of the demographic transition, life — as Hobbes rightly suggested — was nasty, brutish, and short. There was a balance between birth rates and death rates, and both were very high (30-50 per thousand people per year). The human population grew less than 0.05% annually, with a doubling time of over 1,000 years. This state of affairs was true of all human populations everywhere until the late 18th century.
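
For readers who want to check the arithmetic, the doubling time quoted above follows from the standard exponential-growth relation T = ln(2)/r. A minimal sketch, purely illustrative:

```python
import math

# Doubling time for exponential growth: T = ln(2) / r.
# At the essay's upper bound of 0.05% annual growth (r = 0.0005),
# a population takes well over a millennium to double.
r = 0.0005
doubling_time_years = math.log(2) / r
print(f"Doubling time at 0.05%/yr: {doubling_time_years:.0f} years")  # ~1386 years
```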

Then, during the second stage, the death rate began to decline — first in northwestern Europe, but then spreading over the next 100 years to the south and east. The decline in the death rate was due initially to improvements in food supply and in public health, both of which reduced mortality, particularly in childhood. As a consequence, there was a population explosion.

During the third stage, birth rates dropped for the first time in human history. The prior decline in childhood mortality probably prompted parents to realize they did not need as many children; and increasing urbanization, increasing female literacy, and (eventually) contraceptive technology also played a part.

Finally, during the fourth stage — in which the developed world presently finds itself — there is renewed stability. Birth and death rates are again in balance, but now both are relatively low. Causes of mortality have shifted from the pre-Modern pattern dominated by infectious diseases, perinatal diseases, and nutritional diseases, to one dominated by chronic diseases, mental illnesses, and behavioral conditions.

This broad story, however, conceals as much as it reveals. There are other demographic developments worldwide beyond the increasing overall size of the population, developments that are still unfolding and that matter much more. Changes in four aspects of population structure are key: (1) sex ratio, (2) age structure, (3) kinship systems, and (4) income distribution.

Sex ratios are becoming increasingly unbalanced in many parts of the world, especially in China and India (which account for 37% of the global population). The normal sex ratio at birth is roughly 106 males for every 100 females, but it may presently be as high as 120 for young people in China, or as high as 111 in India. This shift, much discussed, may arise from preferential abortion or the neglect of baby girls relative to boys. Gender imbalance may also have other determinants, such as large-scale migration of one or the other sex in search of work. This shift has numerous implications. For example, given the historical role of females as caregivers to elderly parents, a shortage of women to fill this role will induce large-scale social adjustments. Moreover, an excess of low-status men unable to find wives provides an easy (and large) pool of recruits for extremism and violence.

This shift in gender ratios may have other, less heralded implications, however. Some of our own work has suggested that this shift may actually shorten men's lives, reversing some of the historic progress we have made. Across a range of species, skewed sex ratios result in intensified competition for sexual partners and this induces stress for the supernumerary sex. In humans, it seems, a 5% excess of males at the time of sexual maturity shortens the survival of men by about three months in late life, which is a very substantial loss.

On the other hand, the population worldwide is getting older, especially in the developed world. Globally, the UN estimates that the proportion of people aged 60 and over will double between 2000 and 2050, from 10% to 21%, and that the proportion of children will drop from 30% to 21%. This change also has numerous implications, including effects on the "dependency ratio": fewer young people are available to provide for the medical and economic needs of the elderly. Much less heralded, however, is the fact that war is a young person's activity, and it is entirely likely that, as populations age, they may become less aggressive.

The changing nature of kinship networks, such as the growth in blended families — whether due to changing divorce patterns in the developed world or AIDS killing off parents in Africa — has implications for the network of obligations and entitlements within families. Changing kinship systems in modern American society (with complex mixtures of remarried and cohabiting couples, half-siblings, step-siblings, and so on) are having profound implications for caregiving, retirement, and bequests. Who cares for Grandma? Who gets her money when she dies?

Finally, it is not just the balance between males and females, or young and old, that is changing, but also the balance between rich and poor. Income inequality is reaching historic heights throughout the world. The top 1% of the people in the world receives 57% of the income. Income inequality in the US is presently at its highest recorded levels, exceeding even the Roaring Twenties. And while economic development in China has proceeded with astonishing rapidity, income is not evenly distributed; the prospects for conflict in that country as a result seem very high in the coming decades.

Because we lack any real predators, a key feature of the human environment is other humans. In our rush to focus on threats such as global warming and environmental degradation, we should not overlook this fact. It is well to look around at who, and not just what, surrounds us. Population structure will change everything. Our health, wealth, and peace depend on it.


NEIL GERSHENFELD
Physicist, MIT; Author, FAB

THE RE-IMPLEMENTATION OF LIFE IN ENGINEERED MATERIALS

Life is defined by organic chemistry. There's software for artificial life and artificial intelligence, but these are, well, artificial — they exist in silico rather than in vivo. Conversely, synthetic biology is re-coding genes, but it isn't very synthetic; it uses the same sets of proteins as the rest of molecular biology. If, however, bits could carry mass as well as information, the distinction between artificial and synthetic life would disappear. Virtual and physical replication would be equivalent.

There are in fact promising laboratory systems that can compute with bits represented by mesoscopic materials rather than electrons or photons. Among the many reasons to do this, the most compelling is fabrication: instead of a code controlling a machine to make a thing, the code can itself become a thing (or many things).

That sounds a lot like life. Indeed, current work is developing micron-scale engineered analogs to amino acids, proteins, and genes, a "millibiology" to complement the existing microbiology. By working with components that have macroscopic physics but microscopic sizes, the primitive elements can be selected for their electronic, magnetic, optical or mechanical properties as well as active chemical groups.

Biotechnology is booming (if not bubbling). But it is very clearly segregated from other kinds of technology, which contribute to the study of, but not the identity of, biology. If, however, life is understood as an algorithm rather than a set of amino acids, then the creation of such really-artificial or really-synthetic life can enlarge the available materials, length, and energy scales. In such a world, biotechnology, nanotechnology, information technology, and manufacturing technology merge into a kind of universal technology of embodied information. Beyond the profound practical implications, forward- rather than reverse-engineering life may be the best way to understand it.


ANTON ZEILINGER
University of Vienna and Scientific Director, Institute of Quantum Optics and Quantum Information, Austrian Academy of Sciences

THE BREAKDOWN OF ALL COMPUTERS

Some day all semiconductors will break down, and with them all computers, since aside from historic instruments no computers exist today that are not based on semiconductor technology. The breakdown will be caused by a giant electromagnetic pulse (EMP) created by a nuclear explosion outside Earth's atmosphere. It will cover large areas on Earth, up to the size of a continent. Where it will happen is unpredictable. But it will happen, since it is extremely unlikely that we will be able to get rid of all nuclear weapons, and the probability of it happening at any given time will never be zero.

The implications of such an event will be enormous. If it happens to one of our technology-based societies, literally everything will break down. You will realize that none of your phones works. There is no way to find out via the internet what happened. Your car will not start anymore, as it is also controlled by computer chips, unless you are lucky enough to own an antique car. Your local supermarket is unable to get new supplies. There will be no trucks operating anymore, no trains, no electricity, no water supplies. Society will completely break down.

There will be small exceptions in those countries where military equipment has been hardened against EMPs, making the army available for emergency relief. In some countries even some emergency civilian infrastructure has been hardened against EMPs. But these are exceptions, as most governments simply ignore the danger.

 


YOCHAI BENKLER
Berkman Professor of Entrepreneurial Legal Studies, Harvard; Author, The Wealth of Networks: How Social Production Transforms Markets and Freedom

RECOMBINATIONS OF THE NEAR POSSIBLE

What will change everything within forty to fifty years (optimistic assumptions about my longevity, I know)? One way to start to think about this is to look at the last “change everything” innovation, and work back fifty years from it. I would focus on the Internet's generalization into everyday life as the relevant baseline innovation that changed everything. We can locate its emergence into widespread use in the mid-1990s. So what did we have in the mid-1940s that was a precursor? We had mature telephone networks, networked radio stations, and point-to-point radio communications. We had the earliest massive computers. So to me the challenge is to look at what we have now, some of which may be quite mature, other pieces of which may be only emerging, and to think of how they could combine to affect social and cultural processes in ways that will “change everything,” which I take to mean: will make a big difference to the day-to-day life of many people. Let me suggest four domains in which combinations and improvements of existing elements, some mature, some futuristic, will make a substantial difference, not all of it good.

Communications

We already have hands-free devices. We already have overhead transparent displays in fighter-pilot helmets. We already have presence-based and immediate communications. We already upload images and movies, on the fly, from our mobile devices, and share them with friends. We already have early holographic imaging for conference presentations, and high-quality 3D imaging for movies. We already have voice-activated computer control systems, and very early brainwave-activated human-computer interfaces. We already have the capacity to form groups online, and to segment and reform them according to need, be they in World of Warcraft or in Facebook groups. What is left is to combine all these pieces into an integrated, easily wearable system that will, for all practical purposes, allow us to interact as science fiction once imagined telepathy working. We will be able to call upon another person by thinking of them, or at least by whispering their name to ourselves. We will be able to communicate with them and see them; we will be able to see through their eyes if we wish to, in real time and at resolution high enough that it will seem as though we were in fact standing there, next to them or inside their shoes. However much we think now that collaboration at a distance is easy, what we do today will seem primitive. We won't have “beam me up, Scotty” physically, but we will have a close facsimile of the experience. Coupled with concerns over global warming, these capabilities will make business travel seem like wearing fur. However much we talk about telecommuting today, these new capabilities, together with new concerns over environmental impact, will make virtual workplaces in the information segments of the economy as different from today's telecommuting as today's ubiquitous computing and mobile platforms are from the mini-computer “revolution” of the 1970s.

Medicine

It is entirely plausible that 110 or 120 will be an average life expectancy, with senescence delayed until 80 or 90. This will change the whole dynamic of life: how many careers a lifetime can support; what the ratio of professional moneymaking to volunteering will be; how early in life one starts a job; the length of training. But this will likely affect, if at all within the relevant period, only the wealthiest societies. Simple innovations that are more likely will have a much wider effect on many more people. A cheap and effective malaria vaccine. Cheap and ubiquitous clean-water filters. Cheap and effective treatments and prevention techniques against parasites. All these will change life in the Global South on such a scale that, from the perspective of a broad concern with human values, they will swamp whatever effects lengthening life in the wealthier North will have.

Military Robotics

We already have unmanned planes that can shoot live targets. We are seeing land robots, for both military and space applications. We are seeing networked robots performing functions in collaboration. I fear that we will see a massive increase in the deployment and quality of military robotics, and that this will lead to a perception that war is cheaper, in human terms. This, in turn, will lead democracies in general, and the United States in particular, to imagine that there are cheap wars, and to overcome the reticence toward war that we learned so dearly in Iraq.

Free market ideology

This is not a technical innovation but a change in the realm of ideas. The resurgence of free market ideology, after its demise in the Great Depression, came to dominance between the 1970s and the late 1990s as a response to communism. As communism collapsed, free market ideology triumphantly declared its dominance. In the US and the UK it expressed itself, first, in the Reagan/Thatcher moment, and then was generalized in the Clinton/Blair turn to define their own moment in terms of integrating market-based solutions as the core institutional innovation of the “left.” It expressed itself in Europe through the competition-focused, free market policies of the technocratic EU Commission, and in global systems through the demands and persistent reform recommendations of the World Bank, the IMF, and the world trade system through the WTO. But within less than two decades, its force as an idea is declining. On the one hand, the Great Deflation of 2008 has shown the utter dependence of human society on the possibility of well-functioning government to assure some baseline stability in human welfare and capacity to plan for the future. On the other hand, a gradual rise in volunteerism and cooperation, online and offline, is leading to a reassessment of what motivates people, and of how governments, markets, and social dynamics interoperate. I expect the binary State/Market conception of the way we organize our large systems to give way to a more fluid set of systems, with greater integration of the social and the commercial, as well as of the state and the social. So much of life, in so many of our societies, was structured around either market mechanisms or state bureaucracies. The emergence of new systems of social interaction will affect what we do, and where we turn for the things we want to do, have, and experience.


PAUL DAVIES
Physicist, Arizona State University; Director, Beyond; Author, The Cosmic Jackpot

SHADOW BIOSPHERE

A hundred and fifty years ago, Charles Darwin gave us a convincing theory of how life has evolved, over billions of years, from primitive microbes, to the richness and diversity of the biosphere we see today. But he pointedly left out of account how life got started in the first place. "One might as well speculate about the origin of matter," he quipped in a letter to a friend. How, where and when life began remain some of the greatest unsolved problems of science. Even if we make life in the laboratory in the near future, it still won't tell us how Mother Nature did it without expensive equipment, trained biochemists and — the crucial point — a pre-conception of the goal to be achieved. However, we might be able to discover the answer to a more general question: did life originate once, or often?

The subject of astrobiology is predicated on the hope and expectation that life emerges readily in earthlike conditions, and is therefore likely to be widespread in the universe. The assumption that, given half a chance, life will out, is sometimes called biological determinism. Unfortunately, nothing in the known laws of physics and chemistry singles out the state of matter we call "living" as in any way favored. There is no known law that fast-tracks matter to life. If we do find life on another planet, and we can be sure it started there from scratch, completely independently of life on Earth, biological determinism will be vindicated. With NASA scaling back its activities, however, the search for extraterrestrial life has all but stalled.

Meanwhile, there is an easy way to test biological determinism right here and now. No planet is more earthlike than Earth itself, so biological determinism predicts that life should have started many times on our home planet. That raises the fascinating question of whether there might be more than one form of life inhabiting the terrestrial biosphere. Biologists are convinced that all known species belong to the same tree of life, and share a common origin. But almost all life on Earth is microbial, and only a tiny fraction of microbes have been characterized, let alone sequenced and positioned on the universal tree. You can't tell by looking what makes a microbe tick; you have to study its innards. Microbiologists do that using techniques carefully customized to life as we know it. Their methods wouldn't work for an alternative form of life. If you go looking for known life, you are unlikely to find unknown life.

I believe there is a strong likelihood that Earth possesses a shadow biosphere of alternative microbial life representing the evolutionary products of a second genesis. Maybe also a third, fourth... I also think we might very well discover this shadow biosphere soon. It could be ecologically separate, confined to niches beyond the reach of known life by virtue of extreme heat, cold, acidity or other variables. Or it could interpenetrate the known biosphere in both physical and parameter space. There could be, in effect, alien microbes right under our noses (or even in our noses). Chances are, we would not yet be aware of the fact, especially if the weird shadow life is present at relatively low abundance. But a targeted search for weird microbes, and the weird viruses that prey on them, could find shadow life any day soon.

Why would it change everything? Apart from the sweeping technological applications that having a second form of life would bring, the discovery of a shadow biosphere would prove biological determinism, and confirm that life is indeed widespread in the universe. To expect that life would start twice on Earth, but never on another planet like Earth, is too improbable. And to know that the universe is teeming with life would make it far more likely that there is also intelligent life elsewhere in the universe. We might then have greater confidence that the answer to the biggest of the big questions of existence — Are we alone in the universe? — is very probably no.


STEWART BRAND
Founder, Whole Earth Catalog; cofounder, The Well; cofounder, Global Business Network; Author, How Buildings Learn

CLIMATE

To take mastery of climate as we once took mastery of fire, then of genetics (agriculture), then of communication (music, writing, math, maps, images, printing, radio, computers) will require mathematics we don't have, physics and biology we don't have, and governance we don't have.

Our climate models, sophisticated and muscular as they are (employing more teraflops than any other calculation), still are just jumped-up weather prediction models. The real climate system has more levels and modes of hyper-connected nonlinearity than we can yet comprehend or ask computers to replicate, because so far we lack the math to represent climate dynamics with the requisite variety to control it. Acquiring that math will change everything.

Materials scientist and engineer Saul Griffith estimates that humanity must produce 13 terawatts of greenhouse-free energy in order to moderate global warming to a just-tolerable increase of 2° Celsius. (Civilization currently runs on about 16 terawatts of energy, most of it from burning fossil fuels.) Griffith calculates that deploying current clean technologies — nuclear, wind, geothermal, biofuels, and solar technology — to generate 13 terawatts would cover an area the size of Australia. It is imaginable but not feasible. Just improving the engineering of nuclear and solar won't get us what we need; new science is needed. The same goes for biofuels: the current state of genetic engineering is too crude to craft truly efficient organisms for sequestering carbon and generating usable energy. The science of molecular biology has to advance by leaps. Applied science that powerful will change everything.
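
Griffith's comparison can be sanity-checked with a back-of-envelope calculation. The areal power density assumed below for a mixed clean-energy portfolio is my own illustrative figure (chosen to be roughly consistent with the essay's numbers), not one taken from Griffith:

```python
# Back-of-envelope check of the "area the size of Australia" claim.
# Assumption (mine, not Griffith's): a mixed clean-energy portfolio
# delivers roughly 1.7 W per square metre of land, averaged over
# nuclear, wind, geothermal, biofuels, and solar.
target_power_w = 13e12            # 13 terawatts of greenhouse-free energy
areal_density_w_per_m2 = 1.7      # assumed portfolio-average power density
area_km2 = target_power_w / areal_density_w_per_m2 / 1e6
australia_km2 = 7.7e6             # Australia's land area, roughly
print(f"Required area: {area_km2 / 1e6:.1f} million km^2 "
      f"(Australia: about {australia_km2 / 1e6:.1f} million km^2)")
```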

Climate change is a global problem that cannot be fixed with global economics, which we have; it requires global governance, which we don't have. Whole new modes of international discourse, agreement, and enforcement must be devised. How are responsibilities to be shared for legions of climate refugees? Who decides which geoengineering projects can go forward? Who pays for them? Who adjudicates compensation for those harmed? How are free riders dealt with? Humans have managed commons before — fisheries, irrigation systems, fire regimes — but never on this scale. Global governance will change everything.

Of course, these radical adjustments may not happen, or not happen in time, and then climate will shift to either a chaotic mode or a different stable state with the carrying capacity for just a fraction of present humanity, and that will really change everything.


DAVID G. MYERS
Social psychologist, Hope College; Author, A Friendly Letter to Skeptics and Atheists

INEXPENSIVE, CUSTOMIZABLE, INTERACTIVE e-TEXTS FOR WORLDWIDE USE

Speaking recently to university colleagues in southern Africa, I heard their wish for teaching materials that, for them, would change everything: if only there were a way for their students, who cannot afford even greatly discounted Euro-American textbooks, to have access to low-cost, state-of-the-art textbooks with culturally relevant examples.

For students in Africa and around the world, this utopian world may in the next decade become the real world, thanks to:

• Interactive textbooks: Various publishers are developing web-based interactive e-books with links to tutorials, simulations, quizzes, animations, virtual labs, discussion boards, and video clips. (These are not yesterday’s e-textbooks.)

• Customizability: Instructors, and regional instructor networks, will be able to rearrange the content, delete unwanted material, and add (or link to) materials pertinent to their students' worlds and their own course goals.

• Affordability: In the North American context, students will pay for course access tied to their names. With no hard copy book production and shipping, and no used books, publishers will stay afloat with a much smaller fee paid by many more students, or via a site license. For courses in economically impoverished regions, benevolent publishers could make access available for very low cost per student.

• Student accountability: Instructors will track their students’ engagement in advance of class sessions, thus freeing more class time for discussion.

• Expanding broadband access: Thanks partly to a joint foundation initiative by Rockefeller, Carnegie, Ford, and others, “information technologies and connectivity to the Internet” are coming to African universities. As yet, access is limited and expensive. But with increased bandwidth and the prospect of inexpensive, wireless personal reading devices, everything may change.

This is not pie in the sky. African researchers are eager to explore the effectiveness of the new interactive content when it becomes accessible to their students. The hope is that these e-texts will combine the strengths of existing texts — which are comprehensive, expertly reviewed, painstakingly edited, attractively packaged, and supported with teaching aids — at reduced cost and with the possibility of locally adapted illustrations and content.

Textbooks are sometimes faulted for being biased, dated, or outrageously expensive. But say this much for them, whether in traditional book or new web-based formats: By making the same information available to rich and poor students at rich and poor schools in rich and poor countries, they are egalitarian. They flatten the world. And as James Madison noted in 1825, “the advancement and diffusion of knowledge is the only guardian of true liberty.”


MARTIN SELIGMAN
Psychologist; Director, the Positive Psychology Center, University of Pennsylvania; Author, Learned Optimism

MUCH SMARTER PEOPLE

If we could teach intuition, people would be smarter.

Most of real world “intelligent” performance is based on intuition, not on reasoning. The expert surgeon just “knows” where to cut. The experienced farmer just “knows” that it is going to rain. The expert firefighter just “knows” that the roof is about to collapse. The judge just “knows” the defendant is lying. These finely honed intuitions are fast, unconscious, multidimensional, inarticulate, and are made confidently. What separates intelligent from mechanically stupid action is wrapped up in this mysterious process. If we could only teach intuition, we could raise human intelligence substantially.

I believe that the teaching of intuition, via computationally driven simulation, is on the horizon.

Intuition is a species of recognition, formally akin to the way we recognize that a table is a table. We are now close to understanding how natural classes are recognized. Consider the universe of objects all people agree are tables. There are a great many features of tables that are potentially relevant (but neither necessary nor sufficient, singly or jointly) to being a table: e.g., flatness of the surface, number of legs, capacity for supporting other objects, function, compatibility with chairs, and so on. Each of these features can be assigned some value, which could be either binary (present vs. absent) or continuous. Different instances of tables will have different values along several of the dimensions; some, like dining room tables, are flat, whereas others, like pool tables, have pockets. This means that the process of categorization is stochastic in nature. Upon observing a new object, one can decide whether it is a table by comparing its features with the features of stored tables in memory. If the sum of its similarity to all of the tables in memory is higher than the sum of its similarity to other objects (e.g., chairs, animals, etc.), then one "knows" that it too is a table.
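
A minimal sketch of this exemplar-style categorization; the feature names and stored instances are purely illustrative, not taken from the essay:

```python
# Assign a new object to whichever category its summed feature
# similarity to stored exemplars is highest.
def similarity(a, b):
    """Count shared feature values between two objects."""
    return sum(1 for k in a if k in b and a[k] == b[k])

memory = {
    "table": [
        {"flat_top": 1, "legs": 4, "supports_objects": 1, "sit_on": 0},
        {"flat_top": 1, "legs": 3, "supports_objects": 1, "sit_on": 0},
    ],
    "chair": [
        {"flat_top": 0, "legs": 4, "supports_objects": 1, "sit_on": 1},
    ],
}

def categorize(obj):
    scores = {cat: sum(similarity(obj, ex) for ex in exemplars)
              for cat, exemplars in memory.items()}
    return max(scores, key=scores.get)

new_object = {"flat_top": 1, "legs": 4, "supports_objects": 1, "sit_on": 0}
print(categorize(new_object))  # -> "table"
```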

Now consider an “eagle” lieutenant recognizing a likely ambush. Here is what this eagle has stored in her brain. She has a list of the dimensions relevant to an ambush site versus a non-ambush site. She has values along each of these dimensions for each of the ambush and non-ambush sites that she has experienced or learned about. She has a mental model that assigns weights to each of these basic dimensions or features (and to higher-order features, such as the interaction between two dimensions). Based on past experience with similar sets of features and knowledge of the outcomes of those feature sets, she can predict the outcome of the present feature set and, based on her predictive model, choose how to respond to the possibility of this being an ambush.
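
Her mental model can be sketched as a weighted sum over the dimensions, squashed into a probability. The features, weights, and bias below are illustrative assumptions, and higher-order interaction terms could be added in the same way:

```python
import math

# Weighted-feature model of ambush recognition: each site dimension
# gets a weight learned from past outcomes, and the weighted sum is
# squashed into a probability with a logistic function.
weights = {"cover_density": 1.4, "choke_point": 2.0,
           "recent_activity": 1.1, "civilians_absent": 0.9}
bias = -3.0

def ambush_probability(site):
    score = bias + sum(weights[f] * site.get(f, 0.0) for f in weights)
    return 1.0 / (1.0 + math.exp(-score))

site = {"cover_density": 1.0, "choke_point": 1.0,
        "recent_activity": 0.0, "civilians_absent": 1.0}
print(f"P(ambush) = {ambush_probability(site):.2f}")
```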

This strongly implies that intuition is teachable, perhaps massively teachable. One way is brute force: simple repeated experience with forced choices seems to build intuition, and chicken-sexing is an example of such brute force. Professional Japanese chicken-sexers can tell male from female chicks at a glance, but they cannot articulate how they do it. With many forced-choice trials with feedback, naïve people can be trained to very high accuracy, and they too cannot report how they do it.

A better way is virtual simulation. A sufficient number of simulations, with the right variations to allow a buildup of the mental model, will produce a commander or surgeon who, when it happens in real life, has “seen it before,” will recognize it, and will take the life-saving action at zero cost in blood. It would be a waste of training to simulate obvious decisions in which most commanders or surgeons would get it right without training. In the future, computational modeling can derive a decision contour, along which “close calls” occur. These are the scalpel-edge cases that yield the slowest response times and are most prone to error. One could also systematically morph material along the decision contour and thereby over-represent cases near the boundary.
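
One way to read "over-representing cases near the boundary" in code: generate many candidate scenarios, score each with a stand-in predictive model, and keep only those whose score falls within a narrow band around the decision threshold. The scoring function, threshold, and band below are illustrative assumptions:

```python
import random

# Keep only simulated scenarios that fall close to the decision
# contour, so trainees practice the genuinely close calls.
def scenario_score(scenario):
    return sum(scenario)           # stand-in for a real predictive model

def close_calls(n_candidates=10_000, threshold=1.5, band=0.1):
    cases = []
    for _ in range(n_candidates):
        scenario = [random.random() for _ in range(3)]   # 3 feature dimensions
        if abs(scenario_score(scenario) - threshold) < band:
            cases.append(scenario)
    return cases

training_set = close_calls()
print(f"{len(training_set)} near-boundary scenarios kept for simulation")
```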

By so simulating close decisions in almost every human domain, vastly better intuition becomes teachable. Hence many more intelligent surgeons, judges, commanders, investors, and scientists.


MAX TEGMARK
Physicist, MIT; Researcher, Precision Cosmology; Member, Scientific Directorate, Foundational Questions Institute

ACCIDENTAL NUCLEAR WAR

A serial killer is on the loose! A suicide bomber! Beware the West Nile Virus! Although headline-grabbing scares are better at generating fear, boring old cancer is more likely to do you in. Although you have less than a 1% chance per year to get it, live long enough, and it has a good chance of getting you in the end. As does accidental nuclear war.
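
The compounding logic is simple: a small constant annual risk accumulates over a long life. A sketch, taking the essay's rough 1%-per-year bound and an assumed 80-year horizon:

```python
# Cumulative risk from a constant annual probability.
# The 1%/year figure is the essay's own rough bound; the 80-year
# horizon is an illustrative assumption.
annual_risk = 0.01
years = 80
lifetime_risk = 1 - (1 - annual_risk) ** years
print(f"Cumulative risk over {years} years: {lifetime_risk:.0%}")  # ~55%
```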

During the half-century that we humans have been tooled up for nuclear Armageddon, there has been a steady stream of false alarms that could have triggered all-out war, with causes ranging from computer malfunction, power failure and faulty intelligence to navigation error, bomber crash and satellite explosion. Gradual declassification of records has revealed that some of them carried greater risk than was appreciated at the time. For example, it became clear only in 2002 that during the Cuban Missile Crisis, the USS Beale had depth-charged an unidentified submarine that was in fact Soviet and armed with nuclear weapons, and whose commanders argued over whether to retaliate with a nuclear torpedo.

Despite the end of the Cold War, the risk has arguably grown in recent years. Inaccurate but powerful ICBMs undergirded the stability of "mutual assured destruction," because a first strike could not prevent massive retaliation. The shift toward more accurate missile navigation, shorter flight times and better enemy submarine tracking erodes this stability. A successful missile defense system would complete this erosion process. Both Russia and the US retain their "launch-on-warning" strategy, requiring launch decisions to be made on 5-15 minute timescales in which complete information may be unavailable. On January 25, 1995, Russian President Boris Yeltsin came within minutes of initiating a full nuclear strike on the United States because of an unidentified Norwegian scientific rocket. Concern has been raised over a recent US project to replace the nuclear warheads on 2 of the 24 D5 missiles carried by Trident submarines with conventional warheads, for possible use against Iran or North Korea: Russian early warning systems would be unable to distinguish them from nuclear missiles, expanding the possibilities for unfortunate misunderstandings. Other worrisome scenarios include deliberate malfeasance by military commanders triggered by mental instability and/or fringe political/religious agendas.

But why worry? Surely, if push came to shove, reasonable people would step in and do the right thing, just like they have in the past?

Nuclear nations do indeed have elaborate countermeasures in place, just like our body does against cancer. Our body can normally deal with isolated deleterious mutations, and it appears that fluke coincidences of as many as four mutations may be required to trigger certain cancers. Yet if we roll the dice enough times, shit happens — Stanley Kubrick's dark nuclear comedy "Dr. Strangelove" illustrates this with a triple coincidence.

Accidental nuclear war between two superpowers may or may not happen in my lifetime, but if it does, it will obviously change everything. The climate change we are currently discussing pales in comparison with nuclear winter, and the current economic turmoil is of course nothing compared to the resulting global crop failures, infrastructure collapse and mass starvation, with survivors succumbing to hungry armed gangs systematically pillaging from house to house. Do I expect to see this in my lifetime? I'd give it about 30%, putting it roughly on par with me getting cancer. Yet we devote way less attention and resources to reducing this risk than we do for cancer.


STEPHEN M. KOSSLYN
Dean of Social Science and John Lindsley Professor of Psychology, Harvard University; Author, Wet Mind

LEVERAGING DIFFERENCES

We humans are more alike than different, but our differences are nonetheless pervasive and substantial. Think of differences in height and weight, of shoe size and thumb diameter. In this context, it's not surprising that our brains also differ. And in fact, ample research has documented individual differences not only in the sizes of specific brain regions, but also in how strongly activated the same parts of the brain are when people perform a given task. And more than this, both sorts of neural differences – structural and functional – have been shown to predict specific types of behavior. For example, my collaborators and I have shown that the strength of activation in one part of visual cortex predicts the ease of visualizing shapes.

Nevertheless, our society and institutionalized procedures rarely acknowledge such individual differences, and instead operate on a philosophy (usually implicit) of "one size fits all." It is impossible to estimate how much leverage we could gain if we took advantage of individuals' strengths, and avoided falling prey to their weaknesses. The technology now exists to do this in multiple domains.

How can we gain leverage from exploiting individual differences? One approach is first to characterize each person with a mental profile. This profile would rely on something like a "periodic table of the mind," which would characterize three aspects of mental function, pertaining to: (1) information processing (i.e., the ease of representing and processing information in specific ways), (2) motivation (what one is interested in, as well as his or her goals and values), and (3) the contents of one's knowledge (in particular, what knowledge base one has in specific areas, which then can be built upon). A value would be assigned to each cell of the table for a given person, creating an individual profile.
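
One way to picture such a profile is as a simple data structure with one value per cell, grouped by the three facets named above. The facet names and scales here are illustrative assumptions, not a proposed standard:

```python
from dataclasses import dataclass, field

# A mental profile as one value per cell of a "periodic table of the
# mind", grouped into the three facets described in the essay.
@dataclass
class MentalProfile:
    information_processing: dict = field(default_factory=dict)  # e.g. {"spatial_imagery": 0.8}
    motivation: dict = field(default_factory=dict)               # e.g. {"novelty_seeking": 0.6}
    knowledge: dict = field(default_factory=dict)                # e.g. {"calculus": 0.2}

profile = MentalProfile(
    information_processing={"spatial_imagery": 0.8, "verbal_working_memory": 0.4},
    motivation={"novelty_seeking": 0.6, "values_precision": 0.9},
    knowledge={"calculus": 0.2, "statistics": 0.5},
)
```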

Being able to produce such profiles would open up new vistas for personalizing a wide range of activities. For example:

Learning. Researchers have argued that some people learn more effectively by verbal means, others by visual means if shapes are used, others by visual means if spatial relations are used, and so on. People no doubt vary in a wide range of ways in their preferred and most effective learning styles. Knowing the appropriate dimensions of the relevant individual differences will put us in a position to design teaching regimens that fit a given person's information processing proclivities, motivation, and current level of knowledge.

Communicating. A picture may be worth 1,000 words for many people, but probably not for everyone. The best way to reach people is to ensure that they are not overloaded with too much information or bored with too little, to appeal to what interests them, and to make contact with what they already know. Thus, both the form and content of a communication would profit from being tailored to the individual.

Psychotherapy. Knowing what motivates someone obviously is a key to effective psychotherapy, but so is knowing a client's or patient's strengths and weaknesses in information processing (especially if a cognitive therapy is used).

Jobs. Any sort of work task could be analyzed in terms of the same dimensions that are used to characterize individual differences (for example, the importance of having a large working memory capacity or of being interested in finding disparities in patterns). Following this, we could match a person's strengths with the requirements of the task. In fact, by appropriate matching, a person could be offered jobs that are challenging enough to remain interesting, but not so challenging as to be exhausting and not so easy as to be stultifying.
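
A toy sketch of such matching: score each job by how closely its required levels on each dimension sit to the person's profile, so that jobs far too demanding or far too easy both score poorly. Dimension names and numbers are illustrative assumptions:

```python
# Match a person's profile to each task's requirements; 1.0 = perfect fit.
def match_score(person, task_requirements):
    gaps = [abs(person.get(dim, 0.0) - need)
            for dim, need in task_requirements.items()]
    return 1.0 - sum(gaps) / len(gaps)

person = {"working_memory": 0.8, "pattern_detection": 0.6}
jobs = {
    "data_auditor": {"working_memory": 0.7, "pattern_detection": 0.7},
    "routine_entry": {"working_memory": 0.2, "pattern_detection": 0.1},
}
best = max(jobs, key=lambda j: match_score(person, jobs[j]))
print(best)   # -> "data_auditor"
```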

Teams. Richard Hackman, Anita Woolley, Chris Chabris, and our colleagues used knowledge of individual differences to compose teams. We showed that teams are more effective if the individuals are selected to have complementary cognitive strengths that are necessary to perform the task. Our initial demonstrations are just the beginning; a full characterization of individual differences will promote much more effective composition of teams.

However, these worthy goals are not quite as simple to attain as they might appear. A key problem is that the periodic table approach suggests that each facet of information processing, motivation, and content is independent. That is, the approach suggests that each of these facets can be combined as if they were "mental atoms" – and, like atoms, that each function retains its identity in all combinations.

But in fact the various mental functions are not entirely independent. This fact has been appreciated almost since the inception of scientific psychology, when researchers identified what they called "the fallacy of pure insertion": A given mental process does not operate the same way in different contexts. For instance, one could estimate the time people require to divide a number by 10; this value could then be subtracted from the time people require to find the mean of 10 numbers, with the idea being that the residual should indicate the time to add up the numbers. But it does not. 

In short, a simple "periodic table of the mind," where a given mental function is assumed to operate the same way when inserted in the context of other functions, does not work. Depending on what other functions are in play, we are more or less effective at a given one – and there will no doubt be individual differences in the degree to which context modulates processing.

To begin to use individual differences in the ways summarized above, we need to pursue two strategies in parallel, one for the short term and one for the long term. First, a short-term strategy is simply to work backwards from a specific application: Do we want to teach someone calculus? For that person, we would assess the relevant information processes within the context of their motivation and prior knowledge. Given current computer technology, this can easily be done. The teaching method (or psychotherapy technique, etc.) would then be tailored to the content for that person in that particular context.

Second, a long-term strategy is to identify higher-order regularities that not only characterize information processing, motivation, and content but also characterize the ways in which these factors interact. Such regularities may be almost entirely statistical, and may end up having the same status as some equations in physics (sophisticated algorithms already exist to perform such analyses); other regularities may be easier to interpret. For example, some people may discount future rewards more than others – but especially rewards that do not bear heavily on their key values, which in turn reduces the effort they will put into attaining such rewards. Once we characterize such regularities in how mental functions interact, we can then apply them to individuals and specify individual differences at this more abstract level.

Much will be gained by leveraging individual differences, instead of ignoring them as is commonly done today. We will not only make human endeavors more effective, but also make them more satisfying for the individual.

