
Psychologist, Cornell University; Researcher in Moral Judgment

Everyday Apophenia

The human brain is an amazing pattern-detecting machine. We possess a variety of mechanisms that allow us to uncover hidden relationships between objects, events, and people. Without these, the sea of data hitting our senses would surely appear random and chaotic. But when our pattern-detection systems misfire they tend to err in the direction of perceiving patterns where none actually exist.

The German neurologist Klaus Conrad coined the term "Apophenia" to describe this tendency in patients suffering from certain forms of mental illness. But it is increasingly clear from a variety of findings in the behavioral sciences that this tendency is not limited to ill or uneducated minds; healthy, intelligent people make similar errors on a regular basis: a superstitious athlete sees a connection between victory and a pair of socks, a parent refuses to vaccinate her child because of a perceived causal connection between inoculation and disease, a scientist sees hypothesis-confirming results in random noise, and thousands of people believe the random "shuffle" function on their music software is broken because they mistake spurious coincidence for meaningful connection.

In short, the pattern detection responsible for so much of our species' success can just as easily betray us. This tendency to perceive patterns where none exist is likely an inevitable by-product of our adaptive pattern-detecting mechanisms. But our ability to acknowledge, track, and guard against this potentially dangerous tendency would be aided if "everyday Apophenia" were an easily accessible concept.

Architect, Researcher, MIT; Founder, Materialecology

It Ain't Necessarily So

Preceding the scientific method is a way of being in the world that defies the concept of a solid, immutable reality. Challenging this apparent reality in a scientific manner can potentially unveil a revolutionary shift in its representation and thus recreate reality itself. Such suspension of belief implies the temporary forfeiting of some explanatory power of old concepts and the adoption of a new set of assumptions in their place.

Reality is the state of things as they actually exist, rather than the state in which they may appear or are thought to be — a rather ambiguous definition given our known limits to observation and comprehension of concepts and methods. This ambiguity, captured by the aphorism that things are not what they seem, and again, with swing, in Sportin' Life's song It Ain't Necessarily So, is a thread that appears consistently throughout the history of science and the evolution of the natural world. In fact, ideas that have challenged accepted doctrines and created new realities have prevailed in fields ranging from warfare to flight technology, from physics to medicinal discoveries.

Recall the battle between David and Goliath mentioned in Gershwin's song. The giant warrior, evidently unbeatable by every measure of reality, is at once defeated by a lyre-playing underdog who challenges this apparent reality by devising an unconventional, nearly scientific combat strategy.

The postulation that mighty opponents have feeble spots also holds true for the war against ostensibly incurable diseases. Edward Jenner's inoculation experiments with the cowpox virus to build immunity against the deadly scourge of smallpox gave rise to the vaccine, which in turn paved the way for vaccines against diseases such as polio. The very idea that an enemy — a disease — is to be overcome exclusively by brute force was defied by the counter-intuitive hypothesis that the disease itself — or a mild version of its toxins — might be internally memorized by the human immune system as a preventive measure.

Da Vinci's flying machine is another case in point. Challenging the myth of Icarus and its moral that humans should not attempt flight, Leonardo designed a hang glider inspired by his studies of the structure-function relationships of bird wings. It is the first flying machine known to man, and from it our entire aviation industry has evolved.

Challenging what was assumed to be the nature of reality, conveniently supported by religious authorities, Copernicus disputes the Ptolemaic model of the heavens, which postulated the Earth at the center of the universe, by providing the heliocentric model with the Sun at the center of our solar system. The Scientific Revolution of the 16th century then followed, laying the foundations for modern science.

But the Gospel takes many forms besides religion or received wisdom. Occasionally the Gospel emerges as science itself at a particular moment in history. Einstein challenged the Gospel of his day by introducing the concept of space-time and upending our perception of the universe.

It Ain't Necessarily So is a drug dealer's attempt to challenge the gospel of religion by expressing doubts about the Bible: the song is indeed immortal, but Sportin' Life himself never moves beyond doubt. In science, Sportin's attitude is an essential first step, but it ain't sufficiently so: it must be followed by scientific concepts and methods. Still, it is worth remembering to take your Gospel with a grain of salt because, sometimes, it ain't nessa, ain't nessa, it ain't necessarily so.

Mathematician and Economist; Principal, Natron Group


Kayfabe

The sophisticated "scientific concept" with the greatest potential to enhance human understanding may be argued to come not from the halls of academe, but rather from the unlikely research environment of professional wrestling.

Evolutionary biologists Richard Alexander and Robert Trivers have recently emphasized that it is deception rather than information that often plays the decisive role in systems of selective pressures. Yet most of our thinking continues to treat deception as something of a perturbation on the exchange of pure information, leaving us unprepared to contemplate a world in which fakery may reliably crowd out the genuine. In particular, humanity's future selective pressures appear likely to remain tied to economic theory which currently uses as its central construct a market model based on assumptions of perfect information.

If we are to take selection more seriously within humans, we may fairly ask what rigorous system would be capable of tying together an altered reality of layered falsehoods in which absolutely nothing can be assumed to be as it appears. Such a system, in continuous development for more than a century, is known to exist and now supports an intricate multi-billion dollar business empire of pure hokum. It is known to wrestling's insiders as "Kayfabe".

Because professional wrestling is a simulated sport, all competitors who face each other in the ring are actually close collaborators who must form a closed system (called "a promotion") sealed against outsiders. With external competitors generally excluded, antagonists are chosen from within the promotion and their ritualized battles are largely negotiated, choreographed, and rehearsed at a significantly decreased risk of injury or death. With outcomes predetermined under Kayfabe, betrayal in wrestling comes not from engaging in unsportsmanlike conduct, but by the surprise appearance of actual sporting behavior. Such unwelcome sportsmanship which "breaks Kayfabe" is called "shooting" to distinguish it from the expected scripted deception called "working".

Were Kayfabe to become part of our toolkit for the twenty-first century, we would undoubtedly have an easier time understanding a world in which investigative journalism seems to have vanished and bitter corporate rivals cooperate on everything from joint ventures to lobbying efforts. Perhaps the confusing battles between "freshwater" Chicago macroeconomists and Ivy League "saltwater" theorists could best be understood as happening within a single "orthodox promotion," given that both groups suffered no injury from failing (equally) to predict the recent financial crisis. The decades-old battle in theoretical physics over bragging rights between the "string" and "loop" camps would seem an even more significant example, within the hard sciences, of a collaborative intra-promotion rivalry, given the apparent failure of both groups to produce a quantum theory of gravity.

What makes Kayfabe remarkable is that it gives us potentially the most complete example of the general process by which a wide class of important endeavors transition from failed reality to successful fakery. While most modern sports enthusiasts are aware of wrestling's status as a pseudo sport, what few alive today remember is that it evolved out of a failed real sport (known as "catch" wrestling) which held its last honest title match early in the 20th century. Typical matches could last hours with no satisfying action, or end suddenly with crippling injuries to a promising athlete in whom much had been invested. This highlighted the close relationship between two paradoxical risks which define the category of activity which wrestling shares with other human spheres:

• A) Occasional but extreme peril for the participants.

• B) General monotony for both audience and participants.

Kayfabrication (the process of transition from reality toward Kayfabe) arises out of attempts to deliver a dependably engaging product for a mass audience while removing the unpredictable upheavals that imperil participants. As such, Kayfabrication is a dependable feature of many of our most important systems that share the above two characteristics, such as war, finance, love, politics, and science.

Importantly, Kayfabe also seems to have discovered the limits of how much disbelief the human mind is capable of successfully suspending before fantasy and reality become fully conflated. Wrestling's system of lies has become so intricate that wrestlers have occasionally found themselves engaging in real-life adultery shortly after the introduction of a fictitious adulterous plot twist in a Kayfabe back-story. Eventually, even Kayfabe itself became a victim of its own success, as it grew to a level of deceit that could not be maintained once the wrestling world collided with outside regulators exercising oversight over major sporting events.

At the point Kayfabe was forced to own up to the fact that professional wrestling contained no sport whatsoever, it did more than avoid being regulated and taxed into oblivion. Wrestling discovered the unthinkable: its audience did not seem to require even a thin veneer of realism. Professional wrestling had come full circle to its honest origins by at last moving the responsibility for deception off of the shoulders of the performers and into the willing minds of the audience.

Kayfabe, it appears, is a dish best served client-side.

Cognitive Scientist, NYU; Author, Kluge: The Haphazard Evolution of the Human Mind

Cognitive Humility

Hamlet may have said that human beings are noble in reason and infinite in faculty, but in reality — as four decades of experiments in cognitive psychology have shown — our minds are very finite, and far from noble. Knowing the limits of our minds can help make us better reasoners.

Almost all of those limits start with a peculiar fact about human memory: although we are pretty good at storing information in our brains, we are pretty poor at retrieving it. We can recognize photos from our high school yearbooks decades later — yet still find it impossible to remember what we had for breakfast yesterday. Faulty memories have been known to lead to erroneous eyewitness testimony (and false imprisonment), to marital friction (in the form of overlooked anniversaries), and even to death (skydivers, for example, have been known to forget to pull their ripcords — accounting, by one estimate, for approximately 6% of skydiving fatalities).

Computer memory is far better than human memory because early computer scientists discovered a trick that evolution never did: organizing information by means of a master map, in which each bit of information to be stored is assigned a specific, uniquely identifiable location in the computer's memory vaults. Human beings, in contrast, appear to lack such master memory maps and instead retrieve information in far more haphazard fashion, using clues (or cues) to what they're looking for rather than knowing in advance where in the brain a given memory lies.
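The contrast can be sketched in a few lines of code; the addresses, entries, and the `recall` helper below are purely illustrative, not a model of any real memory system.

```python
# Location-addressed memory (the computer's trick): every item lives at a
# unique address, so retrieval is direct and reliable.
memory = {
    0x1A: "breakfast yesterday: oatmeal",
    0x2B: "wedding anniversary: June 3",
}
print(memory[0x2B])  # jump straight to the right slot, every time

# Cue-based retrieval (closer to human memory): there is no master map,
# so we scan the whole store for anything that matches a contextual cue.
def recall(store, cue):
    return [item for item in store.values() if cue in item]

print(recall(memory, "June"))         # the cue happens to match: success
print(recall(memory, "special day"))  # cue never matches the encoding: []
```

The second lookup fails not because the memory is gone but because the cue never intersects how the item was stored — the same context-dependence that makes human retrieval haphazard.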

In consequence, our memories cannot be searched as systematically or as reliably as those of a computer (or an Internet database). Instead, human memories are deeply subject to context. Scuba divers, for example, are better at remembering words they studied underwater when they are tested underwater (relative to when they are tested on land), even if the words have nothing to do with the sea.

Sometimes this sensitivity to context is useful. We are better able to remember what we know about cooking when we are in the kitchen than when we are skiing, and vice versa.

But it also comes at a cost: when we need to remember something in a situation other than the one in which it was stored, it's often hard to retrieve it. One of the biggest challenges in education, for example, is to get children to take what they learn in school and apply it to real world situations, in part because context-driven memory means that what is learned in school tends to stay in school.

Perhaps the most dire consequence is that human beings tend almost invariably to be better at remembering evidence that is consistent with their beliefs than evidence that might disconfirm them. When two people disagree, it is often because their prior beliefs lead them to remember (or focus on) different bits of evidence. To consider something well, of course, is to evaluate both sides of an argument, but unless we also go the extra mile of deliberately forcing ourselves to consider alternatives—not something that comes naturally—we are more prone to recalling evidence consistent with a proposition than inconsistent with it.

Overcoming this mental weakness, known as confirmation bias, is a lifelong struggle; recognizing that we all suffer from it is an important first step. To the extent that we can be aware of this limitation in our brains, we can try to work around it, compensating for our inborn tendencies toward self-serving and biased recollections by disciplining ourselves to consider not just the data that might fit with our own beliefs, but also the data that might lead other people to have beliefs that differ from our own.

Professor of Astronomy, Director, Harvard Origins of Life Initiative

The Other

The concept of 'otherness' or 'the Other' concerns how a conscious human being perceives his or her own identity: "Who am I, and how do I relate to others?" Part of what defines the self and a constituent of self-consciousness, it is a philosophical concept widely used in psychology and social science. Recent advances in the life and physical sciences have opened the possibility for new and even unexpected expansions of this concept.

From the map of the human genome, to the diploid genomes of individuals, to the mapping of humans' geographic spread, and back in time to the Neanderthal genome, these are new tools to address the age-old problem of human unity and human diversity. Reading the 'life code' of DNA does not stop there – it places humans in the vast and colorful mosaic of Earth life. 'Otherness' is placed in a totally new light. Our microbiomes – the trillions of microbes on and in each of us, essential to a person's physiology – become part of our self.

Astronomy and space science are intensifying the search for life on other planets – from Mars and the outer reaches of the Solar system, to Earth-like planets and super-Earths orbiting other stars. The chances of success may hinge on our understanding of the possible diversity of the chemical basis of life itself. 'Otherness': not among DNA-encoded species, but among life forms using different molecules to encode traits. Our 4-billion-year-old heritage of molecular innovation and design, versus 'theirs'. This is a cosmic first encounter that we might experience in our labs first. Last year's glimpse of JCVI-syn1.0 – the first bacterium controlled completely by a synthetic genome – is a prelude to this brave new field.

It is probably timely to ponder 'otherness' and its wider meaning yet again, as we embark on a new age of exploration. And as T.S. Eliot once predicted, we might arrive where we started and know our self for the first time.

Professor of Physics, University of Illinois at Urbana-Champaign


When you are facing in the wrong direction, progress means walking backwards. History suggests that our world view undergoes disruptive change not so much when science adds new concepts to our cognitive toolkit as when it takes away old ones. The sets of intuitions that have been with us since birth define our scientific prejudices, and not only are they poorly suited to the realms of the very large and very small, they also fail to describe everyday phenomena. If we are to identify where the next transformation of our world view will come from, we need to take a fresh look at our deep intuitions. In the two minutes that it takes you to read this essay, I am going to try to rewire your basic thinking about causality.

Causality is usually understood as meaning that there is a single, preceding cause for an event. For example, in classical physics, a ball may be flying through the air because it was hit by a tennis racket. My 16-year-old car always revs much too fast because the temperature sensor wrongly indicates that the engine is cold, as if the car were in start-up mode. We are so familiar with causality as an underlying feature of reality that we hard-wire it into the laws of physics. It might seem that this would be unnecessary, but it turns out that the laws of physics do not distinguish between time going backwards and time going forwards. And so we make a choice about which sort of physical law we would like to have.
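The time-symmetry claim can be made concrete with the simplest law of motion; this is a standard textbook observation, not something specific to this essay:

\[
m\,\frac{d^{2}x}{dt^{2}} = F(x)
\]

Substituting \(t \to -t\) flips the sign of \(dt\) twice, so the second derivative, and hence the law, is unchanged: any trajectory run in reverse is also a lawful trajectory. The asymmetric reading, racket first and flight second, is a choice we lay on top of the equations, not something the equations themselves contain.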

However, complex systems, such as financial markets or the Earth's biosphere, do not seem to obey causality. For every event that occurs, there are a multitude of possible causes, and the extent to which each contributes to the event is not clear, not even after the fact! One might say that there is a web of causation. For example, on a typical day, the stock market might go up or down by some fraction of a percentage point. The Wall Street Journal might blithely report that the stock market move was due to "traders taking profits" or perhaps "bargain-hunting by investors". The following day, the move might be in the opposite direction, and a different, perhaps contradictory, cause will be invoked. However, for each transaction, there is both a buyer and a seller, and their world views must be opposite for the transaction to occur. Markets work only because there is a plurality of views. To assign a single or dominant cause to most market moves is to ignore the multitude of market outlooks and to fail to recognize the nature and dynamics of the temporary imbalances between the numbers of traders who hold these differing views.

Similar misconceptions abound elsewhere in public debate and the sciences. For example, are there single causes for diseases? In some cases, such as Huntington's disease, the cause can be traced to a unique factor, in this case extra repetitions of a particular nucleotide sequence at a particular location in an individual's DNA, coding for the amino acid glutamine. However, even in this case, the age of onset and the severity of the condition are also known to be controlled by environmental factors and interactions with other genes. The web of causation has been for many decades a well-worked metaphor in epidemiology, but there is still little quantitative understanding of how the web functions or forms. As Krieger poignantly asked in a celebrated 1994 essay, "Has anyone seen the spider?"

The search for causal structure is nowhere more futile than in the debate over the origin of organismal complexity: intelligent design vs. evolution. Fueling the debate is a fundamental notion of causality, that there is a beginning to life, and that such a beginning must have had a single cause. On the other hand, if there is instead a web of causation driving the origin and evolution of life, a skeptic might ask: has anyone seen the spider?

It turns out that there is no spider. Webs of causation can form spontaneously through the concatenation of associations between the agents or active elements in the system. For example, consider the Internet. Although a unified protocol for communication (TCP/IP etc) exists, the topology and structure of the Internet emerged during a frenzied build-out, as Internet service providers staked out territory in a gold-rush of unprecedented scale. Remarkably, once the dust began to settle, it became apparent that the statistical properties of the resulting Internet were quite special: the time delays for packet transmission, the network topology, and even the information transmitted exhibit fractal properties.

However you look at the Internet, locally or globally, on short time scales or long, it looks exactly the same. Although the discovery of this fractal structure around 1995 was an unwelcome surprise, because standard traffic control algorithms as used by routers were designed assuming that all properties of the network dynamics would be random, the fractality is also broadly characteristic of biological networks. Without a master blueprint, the evolution of an Internet is subject to the same underlying statistical laws that govern biological evolution, and structure emerges spontaneously without the need for a controlling entity. Moreover, the resultant network can come to life in strange and unpredictable ways, obeying new laws whose origin cannot be traced to any one part of the network. The network behaves as a collective, not just the sum of parts, and to talk about causality is meaningless because the behavior is distributed in space and in time.

Between 2.42pm and 2.50pm on May 6 2010, the Dow-Jones Industrial Average experienced a rapid decline and subsequent rebound of nearly 600 points, an event of unprecedented magnitude and brevity. This disruption occurred as part of a tumultuous event on that day now known as the Flash Crash, which affected numerous market indices and individual stocks, even causing some stocks to be priced at unbelievable levels (e.g. Accenture was at one point priced at 1 cent).

With tick-by-tick data available for every trade, we can watch the crash unfold in slow motion, a film of a financial calamity. But the cause of the crash itself remains a mystery. The US Securities and Exchange Commission report on the flash crash was able to identify the trigger event (a $4 billion sale by a mutual fund), but could provide no detailed understanding of why this event caused the crash. The conditions that precipitate the crash were already embedded in the market's web of causation, a self-organized rapidly evolving structure created by the interplay of high frequency trading algorithms. The Flash Crash was the birth cry of a network coming to life, eerily reminiscent of Arthur C. Clarke's science fiction story "Dial F for Frankenstein", which begins "At 0150 GMT on December 1, 1975, every telephone in the world started to ring." I'm excited by the scientific challenge of understanding all this in detail, because … well, never mind. I guess I don't really know.

Architect, teaching at Politecnico of Milan, visiting professor at Harvard GSD, editor in chief of the Abitare monthly/magazine

Proxemic of Urban Sexuality

In every room, in every house, in every street, in every city, movements, relations, and spaces are also defined with regard to logics of attraction and repulsion between the sexualities of individuals.

Even the most insurmountable ethnic or religious barriers can suddenly disappear in the furor of intercourse; even the warmest and most cohesive community can rapidly dissolve in the absence of erotic tension.

To understand how our cosmopolitan and multi-gendered cities work, today we need a Proxemic of Urban Sexuality.

Founder, Whole Earth Catalog; cofounder, The Well; cofounder, Global Business Network; Author, Whole Earth Discipline

Microbes Run the World

That opening sentence of The New Science of Metagenomics sounds reveille for a new way of understanding biology, and maybe of understanding society as well.

The breakthrough was shotgun sequencing of DNA, the same technology that gave us the human genome years ahead of schedule. Starting in 2003, Craig Venter and others began sequencing large populations of bacteria. The thousands of new genes they found (double the total previously discovered) showed what proteins the genes would generate and therefore what function they had, and that began to reveal what the teeming bacteria were really up to. This "meta"-genomics revolutionized microbiology, and that revolution will reverberate through the rest of biology for decades.

Microbes make up 80 percent of all biomass, says Carl Woese. In one fifth of a teaspoon of seawater there are a million bacteria (and 10 million viruses), Craig Venter says, adding, "If you don't like bacteria, you're on the wrong planet. This is the planet of the bacteria." That means most of the planet's living metabolism is microbial. When James Lovelock was trying to figure out where the gases come from that make the Earth's atmosphere such an artifact of life (the Gaia Hypothesis), it was microbiologist Lynn Margulis who had the answer for him. Microbes run our atmosphere. They also run much of our body. The human microbiome in our gut, mouth, skin, and elsewhere harbors 3,000 kinds of bacteria with 3 million distinct genes. (Our own cells struggle by on only 18,000 genes or so.) New research is showing that our microbes-on-board drive our immune systems and important portions of our digestion.

Microbial evolution, which has been going on for over 3.6 billion years, is profoundly different from what we think of as standard Darwinian evolution, where genes have to pass down generations to work slowly through the selection filter. Bacteria swap genes promiscuously within generations. They have three different mechanisms for this "horizontal gene transfer" among wildly different kinds of bacteria, and thus they evolve constantly and rapidly. Since they pass on the opportunistically acquired genes to their offspring, what they do on an hourly basis looks suspiciously Lamarckian — the inheritance of acquired characteristics.

Such routinely transgenic microbes show that there's nothing new, special, or dangerous about engineered GM crops. Field biologists are realizing that the biosphere is looking like what some are calling a pangenome: an interconnected network of continuously circulated genes that is a superset of all the genes in all the strains that form a species. Bioengineers in the new field of synthetic biology are working directly with the conveniently fungible genes of microbes.

This biotech century will be microbe enhanced and maybe microbe inspired. "Social Darwinism" turned out to be a bankrupt idea. The term "cultural evolution" never meant much, because the fluidity of memes and influences in society bears no relation to the turgid conservatism of standard Darwinian evolution. But "social microbialism" might mean something as we continue to explore the fluidity of traits and vast ingenuity of mechanisms among microbes — quorum sensing, biofilms, metabolic bucket brigades, "lifestyle genes," and the like.

Confronting a difficult problem we might fruitfully ask, "What would a microbe do?"

Richard Clarke Cabot Professor of Social Ethics, Department of Psychology, Harvard University

A Solution for Collapsed Thinking: Signal Detection Theory

We perceive the world through our senses. The brain-mediated data we receive in this way form the basis of our understanding of the world. From this become possible the ordinary and exceptional mental activities of attending, perceiving, remembering, feeling, and reasoning. Via these mental processes we understand and act on the material and social world.

In the town of Pondicherry in South India, where I sit as I write this, many do not share this assessment. There are those, including some close to me, who believe there are extrasensory paths to knowing the world that transcend the five senses, and that untested "natural" foods and methods of acquiring information are superior to those based in evidence. On this trip, for example, I learned that they believe a man has been able to stay alive without any caloric intake for months (his weight does fall, but only when he is under scientific observation).

Pondicherry is an Indian Union Territory that was controlled by the French for 300 years (staving off the British in many a battle right outside my window) and held on to until a few years after Indian independence. It has, in addition to numerous other points of attraction, become a center for those who yearn for spiritual experience, attracting many (both whites and natives) to give up their worldly lives to pursue the advancement of the spirit, to undertake bodily healing, and to invest in good works on behalf of a larger community.

Yesterday, I met a brilliant young man who had worked as a lawyer for eight years and now lives in the ashram and works in its book sales division. Sure, you retort, the profession of the law would turn any good person toward spirituality, but I assure you that the folks here have given up wealth and professional lives of a wide variety of sorts to pursue this manner of life. The point is that seemingly intelligent people seem to crave non-rational modes of thinking, and the Edge question this year forced me to think not only about the toolkit of the scientist but about that of every person.

I do not mean to pick on any one city, and certainly not this unusual one in which so much good effort is put towards the arts and culture and on social upliftment of the sort we would admire. But this is a town that also attracts a particular type of European, American, and Indian — those whose minds seem more naturally prepared to believe that unprocessed "natural" herbs do cure cancer and that standard medical care is to be avoided (until one desperately needs chemo), that Tuesdays are inauspicious for starting new projects, that particular points in the big toe control the digestive system, that the position of the stars at the time of their birth led them to Pondicherry through an inexplicable process emanating from a higher authority and through a vision from "the mother", a deceased French woman, who dominates the ashram and surrounding area in death more than many successful politicians ever do in their entire lives.

These types of beliefs may seem extreme, but they are not considered as such in most of the world. Change the content, and the underlying false manner of thinking is readily observed just about anywhere — the new 22 inches of snow that has fallen where I live in the United States while I'm away will no doubt bring forth beliefs of a god angered by crazy scientists touting global warming.

As I contemplate the single most powerful tool that could be put into the head of every growing child and every adult seeking a rational path, scientists included, it is the simple and powerful concept of "signal detection". In fact, the Edge question this year happens to be one I've contemplated for a while — should anybody ever ask such a question, the answer, I've long known, would be an easy one: I would use Green and Swets' Signal Detection Theory and Psychophysics as the prototype, although the idea has its origins in earlier work among scientists concerned with the fluctuations of photons and their influence on visual detection, and sound waves and their influence on audition.

The idea underlying the power of signal detection theory is simple: The world gives noisy data, never pure. Auditory data, for instance, are degraded for a variety of reasons having to do with the physical properties of the communication of sound. The observing organism has properties that further affect how those data will be experienced and interpreted, such as ability (e.g., a person's auditory acuity), the circumstances under which the information is being processed (e.g., during a thunderstorm), and motivation (e.g., disinterest). Signal detection theory allows us to put both aspects of the stimulus and the respondent together to understand the quality of the decision that will result given the uncertain conditions under which data are transmitted, both physically and psychologically.

To understand the crux of signal detection theory, each event of any data impinging on the receiver (human or other) is coded into four categories, providing a language to describe the decision:

                                         Did the event occur?
                                         Yes              No
  Did the receiver       Yes             Hit              False Alarm
  detect it?             No              Miss             Correct Rejection

Hit: A signal is present and the signal is detected (correct response)

False Alarm: No signal is present but a signal is detected (incorrect response)

Miss: A signal is present but no signal is detected (incorrect response)

Correct Rejection: No signal is present and no signal is detected (correct response)
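The four-way coding above is mechanical enough to express in a few lines of code. The sketch below, using invented trial data, tallies each (signal present, responded yes) pair into the four cells:

```python
# Tally the four signal-detection outcomes from (signal_present, responded_yes)
# pairs. The trial data are illustrative, not from any real experiment.
trials = [(True, True), (True, False), (False, True),
          (False, False), (True, True), (False, False)]

counts = {"hit": 0, "false_alarm": 0, "miss": 0, "correct_rejection": 0}
for signal, response in trials:
    if signal and response:
        counts["hit"] += 1                 # signal present, detected
    elif not signal and response:
        counts["false_alarm"] += 1         # no signal, but "detected"
    elif signal and not response:
        counts["miss"] += 1                # signal present, not detected
    else:
        counts["correct_rejection"] += 1   # no signal, none reported

print(counts)
# {'hit': 2, 'false_alarm': 1, 'miss': 1, 'correct_rejection': 2}
```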

If the signal is clear, like a bright light against a dark background, and the decision maker has good visual acuity and is motivated to watch for the signal, we should see a large number of Hits and Correct Rejections and very few False Alarms and Misses. As these properties change, so does the quality of the decision. Whether the stimulus is a physical one like a light or a sound, or a piece of information requiring an assessment of its truth, the data almost always deviate from this ideal.

It is under such ordinary conditions of uncertainty that signal detection theory yields a powerful way to assess the stimulus and respondent qualities, including the respondent's idiosyncratic criterion (or cutting score, "c") for decision-making. The criterion is the point along the distribution at which the respondent switches from saying "no" to saying "yes".
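In the standard equal-variance Gaussian version of the theory, sensitivity (d′) and the criterion (c) can both be computed from just the hit rate and the false-alarm rate via z-scores. A minimal sketch, with illustrative rates rather than real data:

```python
from statistics import NormalDist

# Equal-variance Gaussian signal detection model: sensitivity d' and
# criterion c from the hit rate and false-alarm rate. The rates below
# are made up for illustration.
z = NormalDist().inv_cdf  # inverse of the standard normal CDF

hit_rate = 0.84
false_alarm_rate = 0.16

d_prime = z(hit_rate) - z(false_alarm_rate)             # sensitivity
criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))  # response bias c

print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
```

Here the two rates are symmetric about 0.5, so the criterion comes out unbiased (c near zero); a cautious respondent who demands stronger evidence before saying "yes" would show a lower hit rate and a positive c.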

The applications of signal detection theory have been in areas as diverse as locating objects by sonar, the quality of remembering, the comprehension of language, visual perception, consumer marketing, jury decisions, price predictions in financial markets, and medical diagnoses.

The reason signal detection theory should be in the toolkit of every scientist is that it provides a mathematically rigorous framework for understanding the nature of decision processes. The reason its logic should be in the toolkit of every thinking person is that it forces a completion of the four cells when analyzing the quality of any statement such as "Good management positions await Sagittarius this week".

President Emeritus, The Royal Society; Professor of Cosmology & Astrophysics; Master, Trinity College, University of Cambridge; Author, Our Final Century: The 50/50 Threat to Humanity's Survival

"Deep Time" And The Far Future

We need to extend our time-horizons. Especially, we need deeper and wider awareness that far more time lies ahead than has elapsed up till now.

Our present biosphere is the outcome of more than four billion years of evolution; and we can trace cosmic history right back to a "big bang" that happened about 13.7 billion years ago. The stupendous time-spans of the evolutionary past are now part of common culture and understanding — even though the concept may not yet have percolated through all parts of Kansas and Alaska.

But the immense time-horizons that stretch ahead — though familiar to every astronomer — haven't permeated our culture to the same extent. Our Sun is less than halfway through its life. It formed 4.5 billion years ago, but it's got 6 billion more before the fuel runs out. It will then flare up, engulfing the inner planets and vaporising any life that might then remain on Earth. But even after the Sun's demise, the expanding universe will continue — perhaps for ever — destined to become ever colder, ever emptier. That, at least, is the best long-range forecast that cosmologists can offer, though few would lay firm odds on what may happen beyond a few tens of billions of years.

Awareness of the "deep time" lying ahead is still not pervasive. Indeed, most people — and not only those for whom this view is enshrined in religious beliefs — envisage humans as in some sense the culmination of evolution. But no astronomer could believe this; on the contrary, it would be equally plausible to surmise that we are not even at the halfway stage. There is abundant time for posthuman evolution, here on Earth or far beyond, organic or inorganic, to give rise to far more diversity, and even greater qualitative changes, than those that have led from single-celled organisms to humans. Indeed this conclusion is strengthened when we realise that future evolution will proceed not on the million-year timescale characteristic of Darwinian selection, but at the much accelerated rate allowed by genetic modification and the advance of machine intelligence (and forced by the drastic environmental pressures that would confront any humans who were to construct habitats beyond the Earth).

Darwin himself realised that "No living species will preserve its unaltered likeness into a distant futurity". We now know that "futurity" extends far further, and that alterations can occur far faster, than Darwin envisioned. And we know that the cosmos, through which life could spread, is far more extensive and varied than he envisaged. So humans are surely not the terminal branch of an evolutionary tree, but a species that emerged early in cosmic history, with special promise for diverse evolution. But this is not to diminish their status. We humans are entitled to feel uniquely important as the first known species with the power to mould its evolutionary legacy.
