Genome Scientist; Sequenced the first genome of a living species and the first human genome; created the first synthetic cell; Author, A Life Decoded

We Are Not Alone In The Universe

I cannot imagine any single discovery that would have more impact on humanity than the discovery of life outside of our solar system. There is a human-centric, Earth-centric view of life that permeates most cultural and societal thinking. Finding that there are multiple, perhaps millions of origins of life and that life is ubiquitous throughout the universe will profoundly affect every human.

We live on a microbial planet. There are a million microbial cells per cubic centimeter of water in our oceans, lakes, and rivers; microbes live deep within the Earth's crust and throughout our atmosphere. Each of us carries more than 100 trillion microbes on and in our bodies. The Earth's diversity of life would have seemed like science fiction to our ancestors. We have microbes that can withstand millions of rads of ionizing radiation, or acid and base strong enough to dissolve our skin; microbes that grow in ice, and microbes that grow and thrive at temperatures exceeding 100 degrees Celsius. We have life that lives on carbon dioxide, on methane, on sulfur, or on sugar. We have sent trillions of bacteria into space over the last few billion years, and we have exchanged material with Mars on a constant basis, so it would be very surprising if we do not find evidence of microbial life in our solar system, particularly on Mars.

The recent discoveries by Dimitar Sasselov and colleagues of numerous Earth-like and super-Earth-like planets outside our solar system, including water worlds, greatly increase the probability of finding life. Sasselov estimates approximately 100,000 Earths and super-Earths within our own galaxy. The universe is young, so wherever we find microbial life there will be intelligent life in the future.

Expanding our scientific reach further into the skies will change us forever.

Artist; Composer; Recording Producer: U2, Coldplay, Talking Heads, Paul Simon; Recording Artist; Author, A Year With Swollen Appendices

Ecology

That idea, or bundle of ideas, seems to me the most important revolution in general thinking in the last 150 years. It has given us a whole new sense of who we are, where we fit, and how things work. It has made commonplace and intuitive a type of perception that used to be the province of mystics — the sense of wholeness and interconnectedness.

Beginning with Copernicus, our picture of a semi-divine humankind perfectly located at the centre of the Universe began to falter: we discovered that we live on a small planet circling a medium-sized star at the edge of an average galaxy. And then, following Darwin, we stopped being able to locate ourselves at the centre of life. Darwin gave us a matrix upon which we could locate life in all its forms: and the shocking news was that we weren't at the centre of that either — just another species in the innumerable panoply of species, inseparably woven into the whole fabric (and not an indispensable part of it either). We have been cut down to size, but at the same time we have discovered ourselves to be part of the most unimaginably vast and beautiful drama called Life.

Before "ecology" we understood the world through the metaphor of a pyramid — a hierarchy with God at the top, Man a close second and, sharply separated, a vast mass of life and matter beneath. In that model, information and intelligence flowed in one direction only — from the intelligent top to the "base" bottom — and, as masters of the universe, we felt no misgivings about exploiting the lower reaches of the pyramid.

The ecological vision has changed that: we now increasingly view life as a profoundly complex weblike system, with information running in all directions, and instead of a single hierarchy we see an infinity of nested and co-dependent hierarchies — and the complexity of all this is such as to be in and of itself creative. We no longer need the idea of a superior intelligence outside the system — the dense field of intersecting intelligences is fertile enough to account for all the incredible beauty of "creation".

The "ecological" view isn't confined to the organic world. Along with it comes a new understanding of how intelligence itself comes into being. The classical picture saw Great Men with Great Ideas... but now we tend to think more in terms of fertile circumstances, where uncountable numbers of minds contribute to a river of innovation. It doesn't mean we cease to admire the most conspicuous of these — but we understand them as effects as much as causes. This has ramifications for the way we think about societal design, about crime and conflict, education, culture and science.

That in turn leads to a re-evaluation of the various actors in the human drama. When we realise that the cleaners and the bus drivers and the primary school teachers are as much a part of the story as the professors and the celebrities, we will start to accord them the respect they deserve.

The Father of Behavioral Economics; Director, Center for Decision Research at the University of Chicago Graduate School of Business; Coauthor, Nudge: Improving Decisions About Health, Wealth, and Happiness

Aether

I recently posted a question in this space asking people to name their favorite example of a wrong scientific belief. One of my favorite answers came from Clay Shirky. Here is an excerpt:

The existence of ether, the medium through which light (was thought to) travel. It was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.

It's also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn't exist. Ether was both required by 19th century theories and undetectable by 19th century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

Several other entries (such as the "force of gravity") shared the primary function of ether: they were convenient fictions that were able to "explain" some otherwise ornery facts. Consider this quote from Max von Pettenkofer, the German chemist and physician, disputing the role of bacteria as a cause of cholera: "Germs are of no account in cholera! The important thing is the disposition of the individual."

So, in answer to the current question, I propose that we change the usage of the word Aether (using the old spelling), since there is no need for a term that refers to something that does not exist. Instead, I suggest we use the term to describe the role of any free parameter used in a similar way: Aether is the thing that makes my theory work. Replace the word "disposition" with "Aether" in Pettenkofer's sentence above to see how it works.

Often Aetherists (theorists who rely on an Aether variable) think that their use of the Aether concept renders the theory untestable. This belief is often justified during their lifetimes, but then along come clever empiricists such as Michelson and Morley, and last year's tautology becomes this year's example of a wrong theory.

Aether variables are extremely common in my own field of economics. Utility is the thing you must be maximizing in order to render your choice rational.

Both risk and risk aversion are concepts that were once well defined, but are now in danger of becoming Aetherized. Stocks that earn surprisingly high returns are labeled as risky, because in the theory, excess returns must be accompanied by higher risk. If, inconveniently, the traditional measures of risk such as variance or covariance with the market are not high, then the Aetherists tell us there must be some other risk; we just don't know what it is.

Similarly, traditionally the concept of risk aversion was taken to be a primitive; each person had a parameter, gamma, that measured her degree of risk aversion. Now risk aversion is allowed to be time varying, and Aetherists can say with a straight face that the market crashes of 2001 and 2008 were caused by sudden increases in risk aversion. (Note the direction of the causation. Stocks fell because risk aversion spiked, not vice versa.)
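The mischief a free parameter can do is easy to see in a toy calculation. The sketch below is my own illustration, not a real asset-pricing model: if the predicted return is a fixed risk premium divided by a per-period "aversion" parameter that we are free to choose after seeing the data, the model reproduces any sequence of returns with zero residual — which is exactly why it explains nothing.

```python
# Toy "Aetherized" model (illustrative only, not an actual asset-pricing model):
#   predicted_return = risk_premium / aversion
# With one fixed aversion, the model makes testable predictions; with a free
# per-period aversion chosen after the fact, it fits anything.

risk_premium = 0.06
observed_returns = [0.12, -0.30, 0.05, 0.20, -0.45]  # made-up data, crashes included

# Solve for whatever "aversion" reproduces each observed return exactly.
# Nothing constrains the fitted values -- not even their sign.
fitted_aversion = [risk_premium / r for r in observed_returns]

reproduced = [risk_premium / a for a in fitted_aversion]
residuals = [obs - rep for obs, rep in zip(observed_returns, reproduced)]
print(residuals)  # all (essentially) zero: a perfect "fit" with zero content
```

The fitted parameter absorbs every surprise in the data, so no observation could ever contradict the theory — the hallmark of an Aether variable.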

So, the next time you are confronted with such a theory, I suggest substituting the word Aether for the offending concept. Personally, I am planning to refer to the time-varying variety of risk aversion as Aether aversion.

Neuroscientist; Director, Center for Brain and Cognition, University of California, San Diego; Author, The Tell-Tale Brain: Unlocking the Mystery of Human Nature

Chunks With "Handles"

Do you need language — including words — for sophisticated thinking, or does it merely facilitate thought? This question goes back to a debate between two Victorian scientists, Max Müller and Francis Galton.

A word that has made it into the common vocabulary of both science and pop culture is "paradigm" (and its converse, "anomaly"), introduced by the historian of science Thomas Kuhn. It is now widely used and misused, both in science and in other disciplines, almost to the point where the original meaning is starting to be diluted. (This often happens to "memes" of human language and culture, which don't enjoy the lawful, particulate transmission of genes.) The word "paradigm" is now often used inappropriately — especially in the US — to mean any experimental procedure, such as "the Stroop paradigm," "a reaction-time paradigm," or "an fMRI paradigm."

However, its appropriate use has shaped our culture in significant ways, even influencing the way scientists work and think. A more prevalent associated word is "skepticism", originating from the name of a Greek school of philosophy. It is used even more frequently and loosely than "anomaly" and "paradigm shift".

One can speak of reigning paradigms — what Kuhn calls normal science and what I cynically refer to as a "mutual admiration club trapped in a cul-de-sac of specialization". The club usually has its Pope(s), a hierarchical priesthood, acolytes, and a set of guiding assumptions and accepted norms that are zealously guarded almost with religious fervor. (Its members also fund each other, review each other's papers and grants, and give each other awards.)

This isn't entirely useless; it's called "normal science", which grows by progressive accretion and employs the bricklayers rather than the architects of science. If a new experimental observation (e.g., bacterial transformation, or ulcers cured by antibiotics) threatens to topple the edifice, it's called an anomaly, and the typical reaction of those who practice normal science is to ignore it or brush it under the carpet — a form of psychological denial surprisingly common among my colleagues.

This is not an unhealthy reaction, since most anomalies turn out to be false alarms; the baseline probability of their survival as "real" anomalies is small, and whole careers have been wasted pursuing them (think "polywater" or "cold fusion"). Yet even such false anomalies serve the useful purpose of jolting scientists from their slumber by calling into question the basic axioms that drive their particular area of science. Conformist science feels cozy, given the gregarious nature of humans, and anomalies force periodic reality checks, even if the anomaly turns out to be flawed.

More important, though, are genuine anomalies that emerge every now and then, legitimately challenging the status quo, forcing paradigm shifts and leading to scientific revolutions. Conversely, premature skepticism toward anomalies can lead to stagnation of science. One needs to be skeptical of anomalies but equally skeptical of the status quo if science is to progress.

I see an analogy between the process of science and evolution by natural selection. For evolution, too, is characterized by periods of stasis (= normal science) punctuated by brief periods of accelerated change (= paradigm shifts) based on mutations (= anomalies), most of which are lethal (= false theories) but some of which lead to the budding off of new species and new phylogenetic trends (= paradigm shifts).

Since most anomalies are false alarms (spoon bending, telepathy, homeopathy), one can waste a lifetime pursuing them. So how does one decide which anomalies to invest in? Obviously one can do so by trial and error, but that can be tedious and time-consuming.

Let's take four well-known examples: (1) continental drift; (2) bacterial transformation; (3) cold fusion; (4) telepathy. All of these were anomalies when first discovered, because they didn't fit the big picture of normal science at the time. The evidence that all the continents broke off and drifted away from a giant supercontinent was staring people in the face — as Wegener noted in the early 20th century. (The coastlines coincided almost perfectly; certain fossils found on the east coast of Brazil were exactly the same as ones on the west coast of Africa; etc.) Yet it took fifty years for the idea to be accepted by the skeptics.

The second anomaly — observed a decade before DNA and the genetic code — was that if you incubate one species of bacterium (pneumococcus A) with another species (pneumococcus B) in a test tube, then bacterium A becomes transformed into B! (Even the DNA-rich juice from B will suffice — leading Avery to suspect that heredity might have a chemical basis.) Others replicated this. It was almost like saying put a pig and a donkey into a room and two pigs emerge — yet the discovery was largely ignored for a dozen years, until Watson and Crick pointed out the mechanism of transformation. The fourth anomaly — telepathy — is almost certainly a false alarm.

You will see a general rule of thumb emerging here. Anomalies (1) and (2) were not ignored for lack of empirical evidence. Even a schoolchild can see the fit between the continental coastlines or the similarity of the fossils. Continental drift was ignored solely because it didn't fit the big picture — the notion of terra firma, a solid, immovable Earth — and there was no conceivable mechanism that would allow continents to drift (until plate tectonics was discovered). Likewise, (2) was repeatedly confirmed but ignored because it challenged a fundamental doctrine of biology — the stability of species. But notice that the fourth anomaly (telepathy) was rejected for two reasons: first, it didn't fit the big picture, and second, it was hard to replicate.

This gives us the recipe we are looking for: focus on anomalies that have survived repeated attempts to disprove them experimentally but are ignored by the establishment solely because no one can think of a mechanism. But don't waste time on ones that have not been empirically confirmed despite repeated attempts (or whose effect becomes smaller with each attempt — a red flag!).

"Paradigm" and "paradigm shift" have now migrated from science into pop culture (not always with good results) and I suspect many other words and phrases will follow suit — thereby enriching our intellectual and conceptual vocabulary and day-to-day thinking.

Indeed, words themselves are paradigms, or stable "species" of sorts, that evolve gradually with progressively accumulating penumbras of meaning, or sometimes mutate into new words to denote new concepts. These can then consolidate into chunks with "handles" (names) for juggling ideas around and generating novel combinations. As a behavioral neurologist I am tempted to suggest that such crystallization of words, and the juggling of them, is unique to humans and occurs in brain areas in and near the left TPO (temporo-parietal-occipital junction). But that's pure speculation.

Evolutionary Zoologist, University of Oxford; Author, The Blind Watchmaker; The Greatest Show on Earth

The Double-Blind Control Experiment

Not all concepts wielded by professional scientists would improve everybody's cognitive toolkit. We are here not looking for tools with which research scientists might benefit their science. We are looking for tools to help non-scientists understand science better, and equip them to make better judgments throughout their lives.

Why do half of all Americans believe in ghosts, three quarters believe in angels, a third believe in astrology, and three quarters believe in Hell? Why do a quarter of all Americans believe that the President of the United States was born outside the country and is therefore ineligible to be President? Why do more than 40 percent of Americans think the universe began after the domestication of the dog?

Let's not give the defeatist answer and blame it all on stupidity. That's probably part of the story, but let's be optimistic and concentrate on something remediable: lack of training in how to think critically, and how to discount personal opinion, prejudice and anecdote, in favour of evidence. I believe that the double-blind control experiment does double duty. It is more than just an excellent research tool. It also has educational, didactic value in teaching people how to think critically. My thesis is that you needn't actually do double-blind control experiments in order to experience an improvement in your cognitive toolkit. You only need to understand the principle, grasp why it is necessary, and revel in its elegance.

If all schools taught their pupils how to do a double-blind control experiment, our cognitive toolkits would be improved in the following ways:

1. We would learn not to generalise from anecdotes.

2. We would learn how to assess the likelihood that an apparently important effect might have happened by chance alone.

3. We would learn how extremely difficult it is to eliminate subjective bias, and that subjective bias does not imply dishonesty or venality of any kind. This lesson goes deeper. It has the salutary effect of undermining respect for authority, and respect for personal opinion.

4. We would learn not to be seduced by homeopaths and other quacks and charlatans, who would consequently be put out of business.

5. We would learn critical and sceptical habits of thought more generally, which not only would improve our cognitive toolkit but might save the world.
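Lesson 2 — assessing whether an apparently important effect could have happened by chance alone — can itself be demonstrated in a few lines of code. The sketch below is my illustration (the essay prescribes no code, and the numbers are made up): a simple permutation test shuffles the treatment/control labels many times and asks how often a label-blind world produces a difference as large as the one observed.

```python
import random

def permutation_test(treatment, control, n_perm=10_000, seed=0):
    """Estimate the probability that the observed difference in group means
    could arise if the group labels had been assigned purely at random."""
    rng = random.Random(seed)
    observed = sum(treatment) / len(treatment) - sum(control) / len(control)
    pooled = list(treatment) + list(control)
    n_t = len(treatment)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign the labels
        diff = sum(pooled[:n_t]) / n_t - sum(pooled[n_t:]) / (len(pooled) - n_t)
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_perm  # estimated p-value

# Made-up scores: the treatment group is clearly shifted upward.
treatment = [5.1, 6.0, 5.8, 6.3, 5.9, 6.1]
control = [4.8, 5.0, 4.9, 5.2, 5.1, 4.7]
p = permutation_test(treatment, control)
print(f"estimated p-value: {p:.4f}")
```

A small p-value says only that chance is an unlikely explanation; blinding is still needed to rule out the subjective biases of lesson 3.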

Psychologist, Princeton; Recipient, 2002 Nobel Prize in Economic Sciences

Focusing Illusion

"Nothing In Life Is As Important As You Think It Is, While You Are Thinking About It"

Education is an important determinant of income — one of the most important — but it is less important than most people think. If everyone had the same education, the inequality of income would be reduced by less than 10%. When you focus on education you neglect the myriad other factors that determine income. The differences of income among people who have the same education are huge.

Income is an important determinant of people's satisfaction with their lives, but it is far less important than most people think. If everyone had the same income, the differences among people in life satisfaction would be reduced by less than 5%.

Income is even less important as a determinant of emotional happiness. Winning the lottery is a happy event, but the elation does not last. On average, individuals with high income are in a better mood than people with lower income, but the difference is about 1/3 as large as most people expect. When you think of rich and poor people, your thoughts are inevitably focused on circumstances in which their income is important. But happiness depends on other factors more than it depends on income.

Paraplegics are often unhappy, but they are not unhappy all the time, because they spend most of the time experiencing and thinking about things other than their disability. When we think of what it is like to be a paraplegic, or blind, or a lottery winner, or a resident of California, we focus on the distinctive aspects of each of these conditions. The mismatch in the allocation of attention between thinking about a life condition and actually living it is the cause of the focusing illusion.

Marketers exploit the focusing illusion. When people are induced to believe that they "must have" a good, they greatly exaggerate the difference that the good will make to the quality of their life. The focusing illusion is greater for some goods than for others, depending on the extent to which the goods attract continued attention over time. The focusing illusion is likely to be more significant for leather car seats than for books on tape.

Politicians are almost as good as marketers in causing people to exaggerate the importance of issues on which their attention is focused. People can be made to believe that school uniforms will significantly improve educational outcomes, or that health care reform will hugely change the quality of life in the United States — either for the better or for the worse. Health care reform will make a difference, but the difference will be smaller than it appears when you focus on it.
