
Edge 333 — November 23, 2010
(13,900 words)

THE THIRD CULTURE

THALER'S QUESTION
An Edge Special Event!

EDGE IN THE NEWS

60 MINUTES, INSTITUTE FOR ETHICS AND EMERGING TECHNOLOGIES, N, DIE WELT, O ESTADO DE S.PAULO/CULTURE





THALER'S QUESTION
An Edge Special Event!

To selected Edge contributors:

I am doing research for a new book and hope to elicit informed responses to the following question:

The flat earth and geocentric world are examples of wrong scientific beliefs that were held for long periods. Can you name your favorite example and for extra credit why it was believed to be true?

Please note that I am interested in things we once thought were true and took forever to unlearn. I am looking for wrong scientific beliefs that we've already learned were wrong, rather than those the respondent is predicting will be wrong, which makes this different from the usual Edge prediction sort of question.

Several responders pointed out that the phrase "scientific belief" in my question was not well defined. Did I mean beliefs held by scientists, or beliefs held by the lay public about science? The answer is that I am interested in both, though I should stress that this is not at all what my next book will be about. I do not know enough about science to write anything about the subject. However, for the book I am thinking about stuff that we get wrong, often for long periods of time, and am wondering whether there are some principles defining when such mistakes are more likely to happen.

This exercise has been fantastically interesting, and if anyone is prompted by this to send in more ideas, please do. I am also interested if anyone has thoughts about what the principles might be, if indeed there are any.

Richard Thaler

RICHARD H. THALER, Director of the Center for Decision Research at the University of Chicago Graduate School of Business, is the father of Behavioral Economics. He is coauthor (with Cass Sunstein) of Nudge: Improving Decisions About Health, Wealth, and Happiness, and he writes a column that appears in the Business section of The Sunday New York Times.

Richard Thaler's Edge Bio Page

__

Thaler on Edge: "A Short Course In Behavioral Economics with Richard Thaler, Daniel Kahneman, Sendhil Mullainathan", The Edge Master Class 2008



THALER'S QUESTION





55 Contributors (to date): Neil Shubin, Garrett Lisi, Peter Schwartz, David Deutsch, Haim Harari, Alun Anderson, Irene Pepperberg, John Holland, Derek Lowe, Charles Simonyi, Nathan Myhrvold, Lawrence Krauss, Steven Strogatz, Cesar Hidalgo, Eric Topol, Christian Keysers, Simona Morini, Ross Anderson, James Croak, Rob Kurzban, Lewis Wolpert, Howard Gardner, Ed Regis, Robert Trivers, Frank Tipler, Joan Chiao, Jeremy Bernstein, Matthew Ritchie, Clay Shirky, Roger Schank, Gary Klein, Gregory Cochran, Eric Weinstein, Geoffrey Carr, James O'Donnell, Lane Greene, Jonathan Haidt, Juan Enriquez, Scott Atran, Rupert Sheldrake, Emanuel Derman, Charles Seife, Milford H. Wolpoff, Robert Shapiro, Judith Harris, Jordan Pollack, Sue Blackmore, Nicholas G. Carr, Lee Smolin, Marti Hearst, Gino Segre, Carl Zimmer, Gregory Paul, Alison Gopnik, George Dyson


NEIL SHUBIN
Evolutionary Biologist; Robert R. Bensley Distinguished Service Professor; University of Chicago; Author, Your Inner Fish

One wrong idea in my field was that the map of the earth was fixed...that the continents stayed in one place over time. This notion held despite the fact that anyone, including small children, could see that the coasts of Africa and South America (like many places) fit together like a jigsaw puzzle. Evidence for moving continents piled up (fossils from different places, similar rocks...), but still there was strong resistance. Part of the problem is that nobody could imagine a mechanism for the continents to move about...did they raft like icebreakers through the ocean mushing the sea bottom as they did so? Nobody saw how this could possibly happen.


GARRETT LISI
Independent Theoretical Physicist; Author, "An Exceptionally Simple Theory of Everything"

One wrong scientific belief held by cosmologists until recently was that the expansion of the universe would ultimately cease, or even that the universe would re-contract. Evidence now shows that the expansion of the universe is accelerating. This came as quite a shock, although the previous belief was held on scant evidence. Many physicists liked the idea of a closed universe, and expressed distaste at the idea of galaxies accelerating off to infinity, but nature often contradicts our intuition.


PETER SCHWARTZ
Futurist, Business Strategist; Cofounder, Global Business Network, a Monitor Company; Author, Inevitable Surprises

There are several things we once believed untrue that we now believe to be true. For example, prions were thought not to exist and are now a major field of study; and quantum entanglement was thought impossible, even by Einstein ("spooky action at a distance," he called it), yet it is now the basis of quantum computing.


DAVID DEUTSCH
Quantum Physicist, Oxford; Author, The Fabric of Reality

Surely the most extreme example is the existence of a force of gravity.

It's hard to say when this belief began but it surely predates Newton. It must have existed from the time when concepts such as force were first formulated until the general theory of relativity superseded it in 1915.

Why did scientists hold that belief for so long? Because it was a very good explanation and there was no rival theory to explain observations such as heavy objects exerting forces on whatever they were resting on. Since 1915 we have known the true explanation, namely that when you hold your arm out horizontally, and think you are feeling it being pulled downwards by a force of gravity, the only force you are actually feeling is the upward force exerted by your own muscles in order to keep your arm accelerating continuously away from a straight path in spacetime. Even today, it is hard to discipline our intuition to conceive of what is happening in that way, but it really is.


HAIM HARARI
Physicist, former President, Weizmann Institute of Science; Author, A View from the Eye of the Storm

The earth is flat and the sun goes around it for the same reason that an apple appears to be more strongly attracted by the earth than a leaf, the same reason that when you add 20% and then subtract 20% you return to the same value, and the same reason that the boat is heavier than water. All of these statements appear to be correct, at first sight, and all of them are wrong. The length of time it takes to figure it out is a matter of history and culture. Religion gets into it, psychology, fear of science, and many other factors. I do not believe that there is one parameter that determines how these things are found to be wrong.

The guy who sold me a carpet last month truly insisted that people in Australia are standing on their heads and could not understand how they manage to do it. He still believes that the earth is flat, is ashamed of his belief, but refuses to accept my explanations. I know a union that got a substantial pay raise because a politician did not understand that adding and then subtracting 20% lands you at a different result from the one you started with. Religious people of all religions believe even more ridiculous things than all of the above. These are examples from the last 10 years, not from the middle ages.

Part of the problem is that, in order to find the truth, in all of these cases, you need to ask the right question. This is more important, and often more difficult, than to find the answer. The right questions in the above cases are of different levels of complexity.
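[The 20% confusion Harari mentions is easy to verify with a couple of lines of arithmetic; the starting figure below is illustrative only. — Ed.]

```python
start = 100.0                    # any starting salary or price
after_raise = start * 1.20       # add 20%
after_cut = after_raise * 0.80   # then subtract 20% of the new, larger amount
print(after_cut)                 # 96.0: four percent short of where we began
```

The asymmetry arises because the second 20% is taken from a larger base than the first; only a subtraction of one-sixth (about 16.7%) would return the raised figure to its starting value.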


ALUN ANDERSON
Senior Consultant (and former Editor-in-Chief and Publishing Director of New Scientist); Author, After the Ice: Life, Death, and Geopolitics in the New Arctic

The Great Chain of Being is another great example of a long-held, still not fully displaced, false view, and it also stems from the same kind of "wrongly centered" thinking.

Essentially the view is that humans stand at the pinnacle of creation (or just below God) and all other life forms are less perfect to a varying degree.

Evolutionary theory teaches that all creatures are equally adapted to the niches in which they live; every branch of the tree is thus in a sense equally perfect.

There was a critical moment in the early 1970s when the new view swept into psychology. I was a student at the time, looking at so-called comparative psychology. The dominant view, put forward by M. E. Bitterman, was that you could classify "learning ability" and arrange animals according to the level they had reached, e.g. fish were incapable of "reversal learning" but rats were, or some such. A paper was then published (by Hodos and Campbell, 1969) on the false notion of the Great Chain of Being in psychology, arguing that every animal's learning ability fitted the particular use it made of it (e.g. honey bees are brilliant at learning the time of day at which particular flowers produce nectar, a subject I later researched). This change in thinking also reflects a move away from the US Skinnerian school of lab studies of animals to the European ethological school (pioneered by Nobel prize winner Niko Tinbergen, with whom I worked) of studying animals in their own environments.

The view also fits Native American conceptions of a Creator who does not favour any particular one of his creations but is at odds with the Christian view, which is why it lingers on in the US.


IRENE PEPPERBERG
Psychologist, Research Associate, Harvard University; Author, Alex and Me

That all birds were stupid.

It was believed to be true because (a) early neurobiologists couldn't find anything in the avian brain that looked like the primate cortex (although the more enlightened did argue that there was a 'striatal' area that seemed to work in a somewhat comparable manner for some birds) and (b) many studies on avian intelligence, using operant conditioning, focused on pigeons — which are not the most intelligent birds — and the pigeon generally never did even as well as the rat on the type of tasks used.

A corollary: That parrots were not only stupid, but also could never learn to do anything more than mimic human speech.

It was believed to be true because the training techniques initially used in laboratories were not appropriate for teaching heterospecific communication.


JOHN HOLLAND
Professor of Psychology, Computer Science and Engineering, University of Michigan, Ann Arbor; Author, Emergence: From Chaos to Order

From the time of Aristotle onward, natural philosophers believed that the basic law underlying motion was that all objects (eventually) come to rest. It took Newton to lay aside the myriad details (friction, etc.) in order to build an idealized model that requires 'forces' to change direction or velocity. Subsequently, everything from models of fluid flow to slinging satellites to the outer solar system used Newton's model as a starting point.


DEREK LOWE
Medicinal Chemist

My nominees are:

(1) The "four humours" theory of human physiology. That one, coming most prominently from Galen, persisted for centuries. Although the pure humoural theory gradually eroded, it lived on in the shape of contra-therapeutic bloodletting until the 19th century, and you'd have to think that in the vast majority of those cases it was harmful.

Why it persisted so long is the tough part. My guess is that it was hard (it still is!) for both physicians and patients to realize or admit that very little could be done for most physical ailments. Bloodletting might not always work, but it had to be better than just standing there doing nothing, right? And in those cases susceptible to it, bloodletting must have had a pretty strong placebo effect, dramatic as it is.

(2) The "bad air" theory of infectious disease. This is another one that you can find persisting for centuries. I'd say that it lasted for several factors: there were indeed such things as poisonous vapors which could make a person feel sick, for one thing. And the environments that were felt to have the worst vapors were often ones that had higher rates of disease due to the real factors (standing water, poor hygiene, overcrowded dwellings, and so on). Finally, there's the factor that's kept all sorts of erroneous beliefs alive — lack of a compelling alternative. The idea of strange-looking living creatures too small to see being the cause of infections wouldn't have gotten much of a hearing, not in the face of more tangible explanations.

That last point brings up another reason that error persists — the inability (or unwillingness) to realize that man is not the measure of all things. Unaided human perceptions on the human scale don't take you very far on the macro-scale of astronomy, or the micro-scale of cell biology (much less that of subatomic physics). To me, the story of science has been the story of augmenting our perceptions, and realizing that they had to be augmented in the first place.


CHARLES SIMONYI
Computer Scientist, Intentional Software; Former Chief Architect and Distinguished Engineer, Microsoft Corporation

One short answer is this: the Peripatetic mechanics of Aristotle was probably the longest-running wrong scientific idea, holding from Greek times until practically Newton. The reason for its longevity was that it corresponded well to the crude and complicated world around us: with two horses the heavy cart does indeed move faster (without careful measurements we could easily say: two times faster) than with just one horse. When you give a shove to something, it will start moving and then soon stop. Heavy things move down; light things (feathers, smoke) move up. The normal world is just not friendly to the kind of abstraction that allows the setting up of general natural laws like Newton's.

I am of course aware of the currently popular belief that the "flat earth" was somehow a widely held "scientific" idea, but I do not know what evidence supports this belief. It was certainly not part of the Antique inheritance (the ancients had pretty good estimates for the diameter of the earth and excellent estimates for the ratio of the Earth's and Moon's diameters); it was not part of Aristotle, or Aquinas, or any of the authorities that the Church relied on. No doubt there were some creation myths or fanciful publications that might have illustrated the world as being flat, but it is a stretch to call these "scientific" even by the standards of the age, when learned men would have been able to refute such a thesis easily — and probably did as part of their exams.

With the geocentric world it is a different matter — geocentrism was indeed scientifically held (with Ptolemy being the best proponent) and it is indeed false — but not to the same extent as the Peripatetic Mechanics. The real issue was precision of prediction — and the complicated system of Ptolemy gave excellent results, indeed better results than Copernicus (which made the breakthrough idea of Copernicus a difficult sell — just put yourself into the shoes of someone in his time.)

Real improvement in precision came only with Kepler and the elliptical orbits which were arrived at in part by scientific genius, by being a stickler for accuracy, and in part by mad superstition (music of the spheres, etc.) From his point of view, putting the coordinate system around the sun simplified his calculations. The final significance of putting the sun into the center was to be able to associate a physical effect — gravitation — with the cause of that effect, namely with the sun. But this did not really matter before Newton.

In all of these cases, a common thread seems to be that the "wrong" scientific ideas were held as long as the difference between "wrong" and "right" did not matter or was not even apparent at the achievable precision, or, in many cases, the differences actually favored the "wrong" theory — because of the complexity of the world, the nomenclature, the abstractions.

I think we are all too quick to label old theories "wrong", and with this we weaken the science of today. People say, with some justification from the facts as given to them, that since the old "right" is now "wrong", the "right" of today might also be tainted. I do not believe this: today's "right" is just fine, because yesterday's "wrong" was also much more nuanced, "more right", than we are often led to believe.


NATHAN MYHRVOLD
CEO, Managing Director, Intellectual Ventures; Former Director, Microsoft Research and Chief Technology Officer, Microsoft

Here is a short list:

1. Stress theory of ulcers — it turns out they are due to infection with Helicobacter pylori. Barry Marshall won the Nobel Prize for that.

2. Continental drift was proposed in the 1920s-30s by Alfred Wegener, but was totally dismissed until the 1960s, when it ushered in plate tectonics.

3. Conventional belief was the eye evolved many, many times. Then they discovered the PAX genes that regulate eyes and are found throughout the animal kingdom — eyes evolved ONCE.

4. Geoffroy Saint-Hilaire was a French scientist who had a theory that invertebrates and vertebrates shared a common body plan. He was widely dismissed until the HOX genes were discovered.


LAWRENCE KRAUSS
Physicist, Director, Origins Initiative, Arizona State University; Author, Hiding in the Mirror

Intelligent design... special creation... the reason: the age of the earth is so long that people didn't realize that evolution could occur.


STEVEN STROGATZ
Applied mathematician, Cornell University; Author, Sync

Another classic wrong belief is that light propagates through a medium, the "ether," that pervades the universe. This was believed to be true until the early 1900s because all other waves known at that time required a medium in which to propagate. Sound waves travel through air or water; the waves on a plucked guitar string travel down the string itself. Yet on the face of it, light seemed to need no medium — it could travel through seemingly empty space. Theorists concluded that empty space must not really be empty — it must contain a light-bearing medium, the "luminiferous ether".

But the ether was always a very problematic notion. For one thing, it had to be extremely stiff to propagate a wave as fast as light — yet how could empty space be "stiff"?

The existence of the ether was disproved experimentally by the Michelson-Morley experiment, and theoretically by Einstein's special theory of relativity.


CÉSAR A. HIDALGO
Assistant Professor, MIT Media Lab; Faculty Associate, Harvard Center for International Development

The age of the earth... which was believed to be only a few thousand years old, due to biblical calculations, until Charles Lyell (who was a good friend of Darwin) began to come up with estimates of millions of years based on erosion.... the advanced age of the world was heavily disputed by scientists, particularly by Lord Kelvin, who made calculations of the rate at which the earth must have cooled down and concluded that it could be at most some tens of millions of years old... he did not know about the radioactive decay taking place at the earth's core...

The model that was used to explain mountains was based not on tectonic plates but rather on a shrinking earth, assuming that as the earth cooled down it shrank and creased up....

The humors theory of disease vs. the germ theory of disease.

Basically... any change of paradigm that went on during the 19th century in England...


ERIC TOPOL
Cardiologist; Director, Scripps Translational Science Institute, La Jolla

In medicine there are many of these wrong scientific beliefs (so many it is frankly embarrassing). Here are a couple:

We were taught (in med school and as physicians) that when cells in the body differentiate to become heart muscle or nerve tissue/brain, they can never regenerate and there is no natural way for new cells/tissue to form. Wrong!! Enter the stem cell and regenerative medicine era.

Until the mid-1980s, a heart attack was thought to be a fait accompli: there was nothing that could ever be done to stop the damage from occurring... just give oxygen, morphine, and say prayers. Then we discovered that we could restore blood supply to the heart and abort the heart attack or prevent much of the damage. The same is now true for stroke. It took almost 80 years for that realization to be made!


CHRISTIAN KEYSERS
Neuroscientist; Scientific Director, Neuroimaging Center, University Medical Center Groningen

For a long time the brain was thought to contain separate parts designed for motor control and visual perception. Only in the 1990s, through the discovery of mirror neurons, did we start to understand that the brain does not work along such divisions, but instead uses motor areas also for perception and perceptual areas also for action.

I believe that this wrong belief was so deeply engrained because of AI, in which there is no link between what a computer sees a human do and the computer's routines for moving a robot. In the human brain the situation is different: the movements we program for our own body look exactly the same as those other humans perform. Hence, our motor programs and body are a match for those we observe, and so afford a strong system for simulating and perceiving the actions of others.

I call this the computer fallacy: thinking of the brain as a computer turned out to harm our understanding of the brain.


SIMONA MORINI
Philosopher; Dipartimento delle Arti e del Disegno Industriale, IUAV University Venice

My preference goes to Euclidean geometry. Its axioms were considered true for centuries on the basis of intuition (shall we say prejudice?) about space.


ROSS ANDERSON
FRS; Professor, Security Engineering, Cambridge Computer Laboratory; Researcher in Security Psychology

In the field of security engineering, a persistent flat-earth belief is 'security by obscurity': the doctrine that security measures should not be disclosed or even discussed.

When Bishop Wilkins wrote the first book on cryptography in English in 1641, he felt the need to justify himself: "If all those useful Inventions that are liable to abuse, should therefore be concealed, there is not any Art or Science which might be lawfully profest". In the nineteenth century, locksmiths objected to the publication of books on their craft; although villains already knew which locks were easy to pick, the locksmiths' customers mostly didn't. In the 1970s, the NSA tried to block academic research in cryptography; in the 1990s, big software firms tried to claim that proprietary software is more secure than its open-source competitors.

Yet we actually have some hard science on this. In the standard reliability growth model, it is a theorem that opening up a system helps attackers and defenders equally; there's an empirical question whether the assumptions of this model apply to a given system, and if they don't then there's a further empirical question of whether open or closed is better.

Indeed, in systems software the evidence supports the view that open is better. Yet the security-industrial complex continues to use the obscurity argument to prevent scrutiny of the systems it sells. Governments are even worse: many of them would still prefer that risk management be a matter of doctrine rather than of science.


JAMES CROAK
Artist

The first wrong notion that comes to mind, one that lasted centuries, is from Thales of Miletus, regarded as the "father of science" as he rejected mythology in favor of material explanations. He believed everything was water, a substance that in his experience could be viewed in all three forms: liquid, solid, gas. He further speculated that earthquakes were really waves and that the earth must be floating on water because of this.

The idea that matter is one thing in different appearances is regarded as true even today.


ROB KURZBAN
Psychologist, UPenn; Director, Penn Laboratory for Experimental Evolutionary Psychology (PLEEP); Author, Why Everyone (Else) is a Hypocrite

I'm guessing you'll get some of the more obvious ones, so I want to offer an instance a little off the beaten path. I came across it doing research of my own into this issue of closely held beliefs that turn out to be wrong.

There was a court case in New York in 1818 surrounding the question of whether a whale was a fish or a mammal. Obviously, we now know not only that there is a correct answer to this question (for a time this wasn't obvious) but also what that answer is (mammal, obviously). Even after some good work in taxonomy, the idea that a whale was a fish persisted. Why?

This one is probably reasonably clear. Humans assign animals to categories because doing so supports inferences. (There's great work by Ellen Markman and Frank Keil on this.) Usually, shared physical features support inferences about categorization, which then supports inferences about form and behavior. In this case, the phylogeny just happens to violate what is usually a very good way to group animals (or plants), leading to the persistence of the incorrect belief.


LEWIS WOLPERT
Biologist, University College; Author, Six Impossible Things to Do Before Breakfast

That force causes movement — in fact, it causes acceleration. That heavy bodies fall faster than lighter ones.


HOWARD GARDNER
Psychologist, Harvard University; Author, Changing Minds

Among cognitive psychologists, there is widespread agreement that people learn best when they are actively engaged with a topic, have to actively problem-solve, and, as we would put it, 'construct meaning.' Yet, among individuals young and old, all over the world, there is a view that is incredibly difficult to dislodge. To wit: education involves a transmission of knowledge/information from someone who is bigger and older (often called 'the sage on the stage') to someone who is shorter, younger, and lacks that knowledge/information. No matter how many constructivist examples and arguments are marshaled, this view — which I consider a misconception — bounces back. And it seems to be held equally by young and old, by individuals who succeeded in school as well as by individuals who failed miserably.

Now, this is not a scientific misconception in the sense of the flat earth or six days of creation, but it is an example of a conception that is extraordinarily robust, even though almost no one who has studied cognition seriously believes it holds water.

Let me take this opportunity to express my appreciation for your many contributions to our current thinking.


ED REGIS
Science Writer, Author, What Is Life?

Vitalism, the belief that living things embody a special, and not entirely natural, animating force or principle that makes them fundamentally different from nonliving entities. (Although rejected by scientists, I would hazard the guess that vitalism is not entirely dead today among many members of the general public.) This belief's persistence over the ages is explained by the obvious observable differences between life and nonlife.

Living things move about under their own power, they grow, multiply, and ultimately die. Nonliving objects like stones, beer bottles and grains of sand don't do any of that. It's the overwhelming nature of these perceptible differences that accounts for the belief's longevity. In addition, there is still no universally accepted scientific explanation of how life arose, which only adds to the impression that there's something scientifically unexplainable about life.


ROBERT TRIVERS
Evolutionary Biologist, Rutgers University; Coauthor, Genes In Conflict: The Biology of Selfish Genetic Elements

For more than 100 years after Darwin (1859), people believed that evolution favored what was good for the group or the species — even though Darwin explicitly rejected this error.

Probable cause: the false theory was just what you would expect people to propagate in a species whose members are concerned to increase the group-orientation of others.


FRANK TIPLER
Professor of Mathematical Physics, Tulane University; Author, The Physics of Christianity

I myself have been working on a book on precisely the same topic, but with a slightly different emphasis: why did scientists not accept the obvious consequences of their own theories?

Here are three examples of false beliefs long accepted:

(1) The false belief that stomach ulcers were caused by stress rather than bacteria. I have some information on this subject that has never been published anywhere. There is a modern Galileo in this story: a scientist convicted of a felony in criminal court in the 1960s because he thought that bacteria caused ulcers.

(2) The false belief that the continents do not move. The drifting continents were an automatic mathematical consequence of the fact that the Earth was at least 100 million years old, and the fact that the Earth formed by the gravitational collapse of a gas and dust cloud. One of Lord Kelvin's students pointed out the essential idea in a series of papers in Nature. This was long before Wegener.

(3) The false belief that energy had to be a continuous variable. James Clerk Maxwell, no less, realized that this was a false belief. The great mathematician Felix Klein, of Klein bottle fame, discussed with Erwin Schrödinger the question of why the fact of quantized energy was not accepted in the 19th century.


JOAN CHIAO
Assistant Professor, Brain, Behavior, and Cognition; Social Psychology; Northwestern University

Early pioneering cultural anthropologists, such as Lewis Morgan who penned the influential 1877 work Ancient Society and others, were heavily influenced by Darwinian notions of biological evolution to consider human culture as itself evolving linearly in stages.

Morgan in particular proposed the notion that all human cultures evolved in three basic stages, from savagery to barbarism to, finally, civilization, and that technological progress was the key to advancing from one stage to the next. Morgan was by no means an armchair academic; he lived with Native Americans and studied their culture extensively. Through these first-hand experiences, Morgan sought to reconcile what he observed to be vast diversity in human cultural practices, particularly between Native Americans and Europeans, with emerging ideas of Darwinian biological evolution.

Morgan was one of several anthropologists at the time who proposed various forms of unilinear cultural evolution, the idea that human culture evolved in stages from simple to more sophisticated and complex, which ultimately later became tied to colonialist ideology and social Darwinism.

Such dangerous ideas then became the catalyst for Franz Boas and other 20th century anthropologists to challenge ideas by Morgan with concepts such as ethnocentrism. By arguing how belief in the superiority of one's own culture guided anthropological theories of unilinear evolution, rather than scientific objectivity per se, Boas and his colleagues exposed an important human and scientific bias in the study of human culture that later gave way to revised theories of cultural evolution, namely multilinear evolution, and the emergence of cultural relativism.


JEREMY BERNSTEIN
Professor of Physics, Stevens Institute of Technology; Author, Nuclear Weapons: What You Need to Know

It was generally believed until the work of Hubble that the universe was static and that the Milky Way was everything.


MATTHEW RITCHIE
Artist

An example of a correct theory that was widely accepted by the public, then displaced by an alternate interpretation, which has since been problematized without resolution.

Although the 19th century idea that the fourth dimension was an extra dimension of space was in many senses correct, it was invalidated in the cultural imagination by Minkowski and Einstein's convincing and influential representation of time as the fourth dimension of space-time.

For example: the polychora in Picasso and Duchamp's early cubist works were far more directly influenced by Hinton's essays "What is the Fourth Dimension?" and "A Plane World" than by Minkowski and Einstein's work, but the general acceptance of Einstein's theory encouraged art historians to interpret cubist work as being directly influenced by the theory of relativity, which was entirely inaccurate. (This is discussed in depth in Henderson's definitive work The Fourth Dimension and Non-Euclidean Geometry in Modern Art.)

Overall, the cultural displacement of the theory of 4-D space has required a series of re-statements of the idea of the fourth dimension — which have so far failed to properly define the nature of the fourth dimension either in time or space to the larger public.


CLAY SHIRKY
Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Cognitive Surplus

The existence of ether, the medium through which light (was thought to) travel.

Extra credit: It was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.

It's also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn't exist. Ether was both required by 19th century theories and undetectable by 19th century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

When Michelson and Morley devised an apparatus sensitive enough to detect characteristic differences in the behavior of light based on the angle through which it traveled through the ether (relative to the earth's motion), and could detect no such differences, they spent considerable time and energy checking their equipment, so sure were they that ether's existence-by-analogy operated as something like proof. (Danny Kahneman calls this 'theory-induced blindness.')

And of course, the failure of ether to appear opened the intellectual space in which Einstein's work was recognized.


ROGER C. SCHANK
Psychologist; Computer Scientist; AI Researcher; Author, The Future of Decision-Making

The obvious candidate for failed theory in the world of learning is the stimulus-response theory (called behaviorism) that dominated psychology for many years. Yes, my dog gets excited when I make coffee because that is when he knows he will get a treat, but that kind of learned behavior is hardly all there is to learning.

In my first year on the faculty at Stanford, Ken Colby offered to share his time in front of the first-year computer science graduate students. At that time each professor in Artificial Intelligence would have a week of an introductory course in which he would say what he was doing. In their second quarter, students would choose one of those professors to work with in a seminar. Ken invited me to share his time and his seminar.

It took a while to get the "results." The next quarter we met with the students who had signed up for our seminar. While seminars given by other professors had attracted one or two students, we had gotten about 20. Boy, was my ego fired up. I was succeeding at this new game. At least that was what I thought until all the students went around the room to say why they were there. They were all there because of Ken — none were there because of me.

I wondered what had happened. Ken had given a very glib funny speech without much content. He seemed to be a lightweight, although I knew he wasn't. I, on the other hand, had given a technical speech about my ideas about how language worked and how to get computers to comprehend.

I asked Ken about this and he told me: if you can say everything you know in an hour, you don't know much.

It was some of the best advice I ever got. You can't tell people everything you know without talking way too fast and being incomprehensible. Ken was about hoping to be understood and to be listened to. I was about being serious and right. I never forgot his words of wisdom. These days I am much funnier.

And, I realize that I do know a lot more than can fit into an hour long speech. Maybe then I actually didn't know all that much.

So, I learned a great deal from Ken's just-in-time advice, which I then had to think about. That is one kind of learning. And then that experience became one of my stories and thus a memory (which is another aspect of learning). Learning is also about constructing explanations of events that were not predicted, so that you can predict them better next time. And learning is about constructing and trading stories with others to give you nuances of experience to ponder, which is a very important part of learning.

Learning has more aspects to it than just those things of course. Stimulus-response doesn't cover much of the turf.


GARY KLEIN
Research Psychologist; Founder, Klein Associates; Author, The Power of Intuition

Here are some of my favorites:

1. Ulcers are created by stress. Some research on monkeys seemed to bear this out, and it fit a comforting stereotype that Type A individuals were burning up inside.

2. Genes are made of protein. This was more reasonable — the complexity of protein molecules matched the complexity of the proteins that genes were building.

3. Yellow Fever is caused by miasma and filth. I think this was sustained by a natural repugnance when entering homes that smelled bad — a feeling of "wow, that can't be good — I need to get out of here as soon as possible." Plus a class judgment that poor people live in less sanitary conditions and are more susceptible. Plus a belief that the mosquito theory had been discredited. (In fact, the study on mosquitoes failed to take into account a 12-day incubation period.)

4. Cholera is caused by miasma and filth. Ditto. The part I can't really understand is why John Snow was so effective in changing this mindset about cholera in England, and why his views spread so quickly to the U.S., whereas 50 years later, even after Walter Reed and his staff eliminated Yellow Fever from Cuba, his subordinate Gorgas, who had been in charge of eliminating Yellow Fever in Havana, was so unsuccessful in convincing the authorities when he was subsequently posted to Panama to control Yellow Fever during the building of the canal.


GREGORY COCHRAN
Consultant, Adaptive Optics; Adjunct Professor of Anthropology, University of Utah; Coauthor, The 10,000 Year Explosion: How Civilization Accelerated Human Evolution

Educated types in the western world have known the shape of the Earth for a long time, about 2,500 years; the idea that they believed in a flat Earth is a canard. The notion that the Earth was the center of the universe was popular for much longer, largely because stellar parallax was too small to measure, since distances to the stars are enormous compared with the radius of the Earth's orbit. Many people will undoubtedly tell you this.

One favorite is Helicobacter pylori as the main cause of stomach ulcers. This was repeatedly discovered and then ignored and forgotten: doctors preferred 'stress' as the cause, not least because it was undefinable. Medicine is particularly prone to such shared mistakes. I would say this is the case because human biology is complex, experiments are not always permitted, and MDs are not trained to be puzzle-solvers — instead, to follow authority. A lot of this traces back to medical traditions, which developed over long periods during which medicine was an ineffective pseudoscience. Freudian analysis was another such madness of crowds.

I would guess that most basic anthropological doctrine is false — for example, the 'psychic unity of mankind'. But then most practitioners don't really pretend to do science.

One could go on and on!


ERIC R. WEINSTEIN
Mathematician and Economist; Principal, Natron Group

The modern textbook example of groupthink within fundamental physics is likely the so-called Tau-Theta puzzle of the 1950s. The Tau and Theta particles were seen to be as physically indistinguishable as Clark Kent and Superman, except for the ways in which they disintegrated. Yet to suggest that they were the same particle required the mental leap needed to assert that natural law carries a kind of asymmetric beauty mark which could be used to distinguish processes in the real world from their reflections in a pristine mirror. After experimenters at Columbia finally indicated in 1956 that the Tau and Theta were indeed the same particle, physicists came to see that for decades, no one had really bothered to check whether something as profoundly dramatic as an asymmetric universe was hiding within plain sight and easy reach.

An even more compelling example of group blindness drawn from engineering is the bizarre case of the Rollaboard suitcase. In the dawning age of jet travel, it seemed no one could find a way to create stable wheeled luggage. Typical early designs featured leashes and tiny external casters on which horizontal luggage would briefly roll before tipping over. It was only in 1989 that Northwest Airlines pilot Robert Plath solved this problem for travelers with the now ubiquitous vertical design of the Rollaboard, with built in wheels and telescoping handles. What is fascinating about this example of groupthink is that every recent scientific genius who struggled with luggage while on the lecture circuit had missed this simple elegant idea, as it required no modern technological advance or domain specific expertise.


LANE GREENE
Journalist, The Economist

I assume someone might already have written in to suggest "the belief that physical traits acquired during one's lifetime could be passed on to children" — e.g., that a person who became fat through overeating would thereby have fat children (and not because he had genes for obesity). This was apparently even believed by Darwin, I just read, before the discovery and understanding of genes.


JAMES O'DONNELL
Classicist; Provost, Georgetown University; Author, The Ruin of the Roman Empire

As a classicist, I feel I know too many examples! Ancient medicine and ancient astronomy in particular were full of truths, quite true and valid within the framework within which they were constructed, that now appear as utter nonsense. I would put at the top of my list, however, the science of astronomy — not for the Ptolemaic mathematical workings-out, but for the solid, serious, scientific astrological content. That is to say, it's a beautiful example of a paradigm, in Kuhnian terms, that made perfect sense at the time, that was the foundation for many further advances, that led to undoubtedly serious science, that validated itself by, e.g., the way it allowed you to predict eclipses (how could it not be science?), and that just fell apart at the touch of a serious thought. To compare large with small, I would put it next to the science of ulcer medicine 60 years ago, which made similar perfect sense, was all driven by diet and stress, and was a continually refining science — falling apart more or less instantaneously, what, 25 years ago, with the discovery of the link to H. pylori. What the two have in common is that the focus on phenomena (that is, the things that appear, the surface data) produces science, but each time you go a step beneath phenomena to mechanisms, new science happens. That's when the impossible becomes possible.


GEOFFREY CARR
Science Editor, The Economist

Believing that people believed the Earth was flat is a good example of a modern myth about ancient scientific belief. Educated people have known it was spherical (and also how big it was) since the time of Eratosthenes. That is pretty close to the beginning of any system of thought that could reasonably merit being called scientific...

One that was long thought to be true, but isn't, is the spontaneous generation of life. I've never quite understood how that squared with life being divinely created. But the whole pre-Pasteur thing was definitely a widely held, incorrect belief...


JONATHAN HAIDT
Psychologist, University of Virginia; Author, The Happiness Hypothesis

The closest thing to a persistent flat earth belief in psychology is probably the view that experiences in the first five years of life largely shape the personality of the adult. (The child is father to the man, as Freud said). It's now clear that experiences that affect brain development, such as some viral diseases or some head injuries, can indeed change adult personality. Also, extreme conditions that endure for years or that interfere with the formation of early attachments (e.g., an abusive parent) can also have lasting effects. But the idea that relatively short-lived experiences in the first few years — even traumatic ones, and even short-lived sexual abuse — will have powerful effects on adult personality... this just doesn't seem to be true. (Although such events can leave lasting traces on older children). Personality is shaped by the interaction of genes with experience; psychologists and lay people alike long underestimated the power of genes, and they spent too much time looking at the wrong phase of childhood (early childhood), instead of at the developmental phases that matter more (i.e., the prenatal period, and adolescence).

Why is early childhood such a draw when people try to explain adult personalities? I think it's because we think in terms of stories, and it's almost impossible for us NOT to look back from Act III (adulthood) to early childhood (Act I) when we try to explain how someone turned out to be a hero or a serial killer. In stories, there's usually some foreshadowing in Act I of events to come in Act III. But in real life there is almost never a connection.


JUAN ENRIQUEZ
Managing Director in Excel Medical Ventures; Chairman and CEO of Biotechonomy LLC; Author, As the Future Catches You

We have acted, with good reason, as if human beings are all alike. And given the history of eugenics, this has been a good and rational position and policy. But we are entering an era where we recognize that there are more and more differences in how a particular medicine affects particular groups of people. The same goes for foods, pollutants, viruses, and bacteria. We are beginning to recognize that we react differently to, and are at differential risk of catching, diseases like AIDS, malaria, and anemias. And just this month we began to get a glimpse of the first thousand human genomes. These will soon number in the hundreds of thousands. Are we ready should these initial gene maps show that there are real and significant differences between groups of human beings?


SCOTT ATRAN
Anthropologist; Visiting Professor of Psychology and Public Policy at the University of Michigan; Presidential Scholar in Sociology at the John Jay College of Criminal Justice, New York City; Author, Talking to the Enemy

Anglo-American empiricists and communists alike believed that human minds were almost infinitely malleable, and learned the structure and content of thoughts and ideas based on the frequency of events perceived and on the nearness of events to one another (if one kind of event frequently precedes a second kind of event then the first is likely the cause of the other). Rewards and punishments ("carrots and sticks") supposedly determine which events are attended to.

Many Continental thinkers and Fascists believed that fundamental ideas of science, art and the "higher thoughts" of European civilization were either innate or inherently easy to learn only for a biologically privileged set of human beings. As with most earlier views of human cognition and learning, both of these philosophies and their accompanying pseudo-sciences of the mind were based on social and political considerations that ignored, and indeed effectively banned, reasoned inquiry and evidence as to the nature of the human mind.

That is why, after centuries of science, study of the mind is still in a foetal stage, and actual progress has been limited to fundamental discoveries that can be counted on one hand (for example, that human linguistic competence — and thus perhaps other fundamental cognitive structures — is universally and innately fairly well-structured; or that human beings do not think like Markov processors, logic machines, or as rational economic and political actors ought to).


RUPERT SHELDRAKE
Developmental Biologist; Author, The Sense of Being Stared At

In the nineteenth century, many scientists were convinced that the course of nature was totally determinate and in principle predictable in every detail, as in Laplace's famous fantasy of scientific omniscience: "Consider an intelligence which, at any instant, could have a knowledge of all the forces controlling nature together with the momentary conditions of all the entities of which nature consists. If this intelligence were powerful enough to submit all these data to analysis it would be able to embrace in a single formula the movements of the largest bodies in the universe and those of the lightest atoms; for it nothing would be uncertain; the past and future would be equally present for its eyes."

T.H. Huxley even imagined that the course of evolution was predictable: "If the fundamental proposition of evolution is true, that the entire world living and not living, is the result of the mutual interaction, according to definite laws, of the forces possessed by the molecules of which the primitive nebulosity of the universe was composed, it is no less certain the existing world lay, potentially, in the cosmic vapour, and that a sufficient intellect could, from a knowledge of the properties of the molecules of that vapour, have predicted, say, the state of the fauna of Great Britain in 1869."

With the advent of quantum theory, indeterminacy rendered the belief in determinism untenable, and in the neo-Darwinian theory of evolution (which T.H. Huxley's grandson, Julian, did so much to promote) randomness plays a central role through the chance mutations of genes.


EMANUEL DERMAN
Professor in Columbia University's Industrial Engineering and Operations Research Department; Partner at Prisma Capital Partners; Author, My Life as a Quant

1. For years, running shoe companies have assumed without evidence that more is better, that thicker padded soles are better at preventing injuries in runners. In the 70s, shoe soles grew Brobdingnagian. Now, recent research confirms that running barefoot and landing on your forefoot, on any surface, even one as hard as the road to hell, produces less shock than running and unavoidably landing on your heels in rigid padded stabilized shoes.

2. For years optometrists have given small children spectacles at the first hint of nearsightedness. But ordinary unifocal lenses modify not only their accommodation to distance vision but to near vision too, where they don't need help. Now there is evidence that giving near-sighted kids bifocals that correct only their distance vision and not their close-up vision seems to make their nearsightedness progress less rapidly.


CHARLES SEIFE
Professor of Journalism at New York University; Author, Proofiness

Caloric, phlogiston, and ether immediately come to mind, but I'm particularly fond of one consequence of Aristotelian mechanics: the assertion that there is no such thing as a vacuum.

The concept of the void conflicted with the way that Aristotle conceived of motion; admitting a void into his universe quite simply broke all of his models about the nature of matter and the way objects move. (A rock, say, suspended in a vacuum, would not be able to fall to its proper place at the center of the world, as his laws said it must.)

In the West, the consequent misconception — that nature so abhors a vacuum that it cannot exist under any circumstance — lasted until Torricelli and Pascal disproved it in the 17th century.


MILFORD H. WOLPOFF
Professor of Anthropology and Adjunct Associate Research Scientist, Museum of Anthropology at the University of Michigan; Author, Race and Human Evolution

Creationism's stepsister, intelligent design, and allied beliefs have been held true for some time, even as the mountain of evidence supporting an evolutionary explanation for the history and diversity of life continues to grow. Why has this belief persisted? There are political and religious reasons, of course, but history shows that neither politics nor religion requires a creationist belief in intelligent design.

I think the deeper answer lies elsewhere, in the way children categorize the world into a hierarchy of types of inanimate and living things (and, for that matter, types of people), and in the rigid categorization this leaves in adults, which stands in the way of accepting biological explanations showing that the hierarchy can develop from natural laws including randomness, and that categories may originate and change by natural laws within a hierarchical structure. Could a draw poker hand improve without divine intervention? Could Plato's precept of ideals have survived a trip to Art Van's?


ROBERT SHAPIRO
Professor Emeritus of Chemistry and Senior Research Scientist at New York University; Author, Planetary Dreams

For many centuries, most scientists and philosophers believed that dead or inanimate matter could quickly transform itself into living beings, just as the reverse can occur quite rapidly. This belief, rapid spontaneous generation, was supported by simple observation of common events. Fireflies emerged from the morning dew, bacteria appeared in sterilized broths and small animals arose from mud at the bottom of streams and ponds.

In Shakespeare's "Antony and Cleopatra" Lepidus told Antony "Your serpent of Egypt is born of the mud, by the action of the Sun, and so is your crocodile." Among the notables who endorsed this theory were Aristotle, Thomas Aquinas, Francis Bacon, Galileo and Copernicus. Many carefully controlled experiments, culminating in the work of Louis Pasteur, were needed to negate this idea.


JUDITH HARRIS
Author, No Two Alike

The apple doesn't fall far from the tree. In other words, people tend to resemble their parents. They resemble their parents not only in physical appearance but also, to some degree, in psychological characteristics.

The question is: Why? Two competing answers have been offered: nature (the genes that people inherit from their parents) and nurture (the way their parents brought them up). Neither of these extreme positions stood up to scrutiny and they eventually gave way to a compromise solution: nature + nurture. Half nature, half nurture. This compromise is now an accepted belief, widely held by scientists and nonscientists alike.

But the compromise solution is wrong, too. Genes do indeed make people turn out something like their parents, but the way their parents brought them up does not. So nature + nurture is wrong: it's nature + something else.

The evidence has been piling up since the 1970s; by now it's overwhelming. And yet few people outside of psychology know about this evidence, and even within psychology only a minority have come to terms with it.

You asked for "examples of wrong scientific beliefs that we've already learned were wrong." But who is "we"? A few thousand people have learned that the belief in nature + nurture is wrong, but most people haven't.


JORDAN POLLACK
Computer Science and Complex Systems Professor at Brandeis University

A persistent belief is that human symbolic intelligence is the highest form of intelligence around. This leads directly to both creationism and good old-fashioned AI, which seeks to model cognition using Lisp programs.

Evolution can design machines of such great complexity that the space shuttle with half a million parts looks like a tinker toy construction. In order to explain the design intelligence of evolution, most Republicans are convinced that a superintelligent creator was involved. Developmental intelligence which manufactures machines with 10 billion moving parts without any factory supervisors is another area where nature outstrips the best human performance. Immunological Intelligence, telling self from non-self, is another AI-complete problem. And human intelligence itself is so vastly complex that we've made up stories of conscious symbol processing, like logic and grammar, to try to explain what goes on in our heads.

The mind, like the weather, envelops the brain like a planet and requires dynamical and integrated explanations rather than just-so stories.


SUE BLACKMORE
Psychologist and Ex-Parapsychologist; Author, Consciousness: An Introduction

My favourite example is the hunt for the "élan vital" or life force. People seemed to think that — given that living things behave so very differently from non-living things — there must be some special underlying force or substance or energy that explains the difference, something that animates a living body and leaves the body when it dies.

Of course many people still believe in various versions of this, such as spirits, souls, subtle energy bodies and astral bodies, but scientists long ago gave up the search once they realised that being alive is a process that we can understand and that needs no special force to make it work.

I think this was believed to be true for two reasons:

1. Explaining how living things work is not trivial — it has required understanding heredity, homeostasis, self-organisation and many other factors.

2. (perhaps more important) Human beings are natural dualists. From an early age children begin thinking of themselves not as a physical body but as something that inhabits a physical body or brain. We feel as though we are an entity that has consciousness and free will even though this is all delusion. I suggest that this delusion of duality is also the underlying cause of the hopeless hunt for the life force.


NICHOLAS G. CARR
Author, The Shallows

I think it's particularly fascinating to look at how scientific beliefs about the functioning of the human brain have progressed through a long series of misconceptions.

Aristotle couldn't believe that the brain, an inert grey mass, could have anything to do with thought; he assumed that the heart, hot and pulsing, must be the source of cognition, and that the brain's function was simply to cool the blood.

Descartes assumed that the brain, with its aperture-like "cavities and pores," was, along with the heart, part of an elaborate hydraulic system that controlled the flow of "animal spirits" through the "pipes" of the nerves. More recently, there was a longstanding belief that the cellular structure of the brain was essentially fixed by the time a person hit the age of 20 or so; we now know, through a few decades' worth of neuroplasticity research, that even the adult brain is quite malleable, adapting continually to shifts in circumstances and behavior.

Even more recently there's been a popular conception of the brain as a set of computing modules running, essentially, genetically determined software programs, an idea that is now also being chipped away by new research. Many of these misconceptions can be traced back to the metaphors human beings have used to understand themselves and the world (as Robert Martensen has described in his book The Brain Takes Shape).

Descartes' mechanistic "clockwork" metaphor for explaining existence underpinned his hydraulic brain system and also influenced our more recent conception of the brain as a system of fixed and unchanging parts.

Contemporary models of the brain's functioning draw on the popular metaphorical connection between the brain and the digital computer. My sense is that many scientific misconceptions have their roots in the dominant metaphors of the time. Metaphors are powerful explanatory tools, but they also tend to mislead by oversimplifying.


LEE SMOLIN
Founding and Senior Faculty member at Perimeter Institute for Theoretical Physics in Waterloo, Canada; Adjunct Professor of Physics at the University of Waterloo; Author, The Trouble With Physics

Perhaps the most embarrassing example from 20th-century physics of a false but widely held belief was the claim that von Neumann had proved, in his 1932 textbook on the mathematical foundations of quantum mechanics, that hidden variables theories are impossible. These would be theories that give a complete description of individual systems, rather than the statistical view of ensembles described by quantum mechanics. In fact de Broglie had written down a hidden variables theory in 1926 but abandoned work on it because of von Neumann's theorem. For the next two decades no one worked on hidden variables theories.

In the early 1950s David Bohm reinvented de Broglie's theory. When his paper was rejected on the grounds that von Neumann had proved what he claimed impossible, he read the proof and easily found a fallacy in von Neumann's reasoning. Indeed, there had been at least one paper pointing out the fallacy in the 1930s, but it was ignored. The result was that progress on hidden variables theories in general, and on de Broglie and Bohm's theory in particular, was delayed by several decades.

An example in economics is the notion that an economic market can usefully be described as having a single, unique, and stable equilibrium, to which it is driven by market forces. As described by neoclassical models of markets such as the Arrow-Debreu model of general equilibrium, equilibrium is defined as a set of prices at which demand for all goods equals supply, as a result of each consumer maximizing their utility and each producer maximizing their profit. A basic result is that such equilibria are Pareto efficient, which means no one's utility can be increased without decreasing somebody else's. Furthermore, if the economy is in equilibrium there are no path-dependent effects; moreover, it can be argued that market prices in equilibrium are perfectly rational and reflect all relevant information.

If equilibrium were unique, then one could argue that the most ethical thing to do is to leave markets free and unregulated so that they can find their points of equilibrium, where efficiency and utility are maximized. This kind of thinking to some extent motivated choices to leave financial markets under-regulated, resulting in the recent economic crisis and current difficulties.

However, it was learned in the 1970s that even if efficiency and equilibrium are useful notions, the idea that equilibria are unique is not true in generic general equilibrium models. The Sonnenschein-Mantel-Debreu theorem of 1972 implies that equilibria are in general highly non-unique, and it is not difficult to invent models in which the number of equilibria scales with the number of producers. But if there are multiple equilibria, most will not be stable. Moreover, supply and demand are balanced in each of the many equilibria, so market forces do not suffice to explain which equilibrium the market is in or to pick which would be preferred. The theoretical consequence is that path-dependent effects, which determine which of the many equilibria the market is in, must be important; the political consequence is that there is no ethical argument for leaving markets unregulated. Since then, some of the more interesting work in economics has studied issues of path dependence and multiple equilibria.
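[Editor's note: the non-uniqueness and instability Smolin describes can be made concrete with a toy model. This is an illustration, not Smolin's own example: a single-good market whose excess-demand curve crosses zero three times has three equilibria, and under simple tâtonnement price adjustment (price rises when excess demand is positive) the middle one is unstable. The cubic excess-demand function and the price range are arbitrary choices for the sketch.]

```python
# Toy market with multiple equilibria. Excess demand z(p) = demand(p) - supply(p);
# the cubic below is chosen so z(p) = -(p - 1)(p - 2)(p - 3) has roots at p = 1, 2, 3.

def excess_demand(p):
    return -(p - 1.0) * (p - 2.0) * (p - 3.0)

def find_equilibria(z, lo, hi, steps=1000, tol=1e-9):
    """Scan [lo, hi] for sign changes of z and bisect each bracket to a root."""
    roots = []
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    for a, b in zip(grid, grid[1:]):
        if z(a) == 0.0:
            roots.append(a)
        elif z(a) * z(b) < 0.0:
            while b - a > tol:
                m = 0.5 * (a + b)
                if z(a) * z(m) <= 0.0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
    return roots

def is_stable(z, p, h=1e-6):
    """Under tatonnement dp/dt = z(p), an equilibrium is stable iff z'(p) < 0."""
    return (z(p + h) - z(p - h)) / (2 * h) < 0

for p in find_equilibria(excess_demand, 0.0, 4.0):
    print(f"p* = {p:.4f}  stable = {is_stable(excess_demand, p)}")
```

Supply equals demand at all three prices, so "market forces" alone cannot say which equilibrium prevails; the path the price takes decides between the two stable ones, which is exactly the path dependence at issue.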

I cannot comment on why economists made the mistake of thinking about market equilibrium as if it were unique. I do think I have some insight into why a false belief about the possibility of alternatives to quantum mechanics could persist for more than two decades. During this period there was rapid progress in the application of quantum mechanics to a wide set of phenomena from astrophysics to nuclear and solid state physics.

Meanwhile the most popular interpretation of quantum mechanics was Bohr's, which is now hardly taken seriously by anyone. Those who concentrated on the foundations of the subject were left behind, especially as it was convenient, for the progress that was being made, to believe that the foundations were surer than in fact they were. Perhaps there are periods in science when it makes sense for most scientists to sweep foundational worries under the carpet and make progress on applications, postponing the inevitable reckoning with the inconsistencies to a time when there are better hints from experiment.


MARTI HEARST
Associate Professor in the School of Information at UC Berkeley, Affiliate appointment in Computer Science Division

As a computer scientist, there isn't all that much in my field that applies, but I do have one item (below). The real action, in my view, is in the many counter-intuitive findings in psychology (how memory works, what we perceive and don't perceive, findings on child rearing, etc.):

In the early days of the field of Artificial Intelligence, researchers thought that it would not be terribly difficult to implement a vision recognition or language understanding program. Although there is an apocryphal quote from Minsky saying he assigned solving vision as a summer research project, more reliable quotes, taken from a well-researched Wikipedia article, are below:

"AI's founders were profoundly optimistic about the future of the new field: Herbert Simon predicted in 1965 that "machines will be capable, within twenty years, of doing any work a man can do" and Marvin Minsky agreed, writing that "within a generation ... the problem of creating 'artificial intelligence' will substantially be solved".

What these misperceptions illustrate is how badly the complexity of the brain's workings was underestimated.


GINO SEGRE
Professor of Physics and Astronomy at the University of Pennsylvania; Author, Ordinary Geniuses

I would not count the flat earth as a wrong theory believed to be true by everybody, since the ancient Greeks, for example, thought the Earth was a sphere and had even measured its curvature.

A classic example of a wrong theory is that of phlogiston, namely a substance supposed to be released in combustion. There were also variations going by the name of caloric. A second wrong theory is that of a luminiferous aether, the substance through which light was thought to be transmitted. Chemical experiments disproved the first, and the Michelson-Morley experiment, among others, the second.

There are of course also numerous wrong theories/beliefs regarding spontaneous generation of life disproved in the 17th century by Francesco Redi and ultimately by Louis Pasteur in the 19th.

I have a small favorite, the belief that body temperature varied with climate, disproved by the invention in the early 17th century of the thermometer.


CARL ZIMMER
Science Writer; Author, Soul Made Flesh

"This laxe pithe or marrow in man's head shows no more capacity for thought than a Cake of Sewet or a Bowl of Curds."

This wonderful statement was made in 1652 by Henry More, a prominent seventeenth-century British philosopher. More could not believe that the brain was the source of thought. These were not the ravings of a medieval quack, but the argument of a brilliant scholar who was living through the scientific revolution. At the time, the state of science made it very easy for many people to doubt the capacity of the brain. And if you've ever seen a freshly dissected brain, you can see why. It's just a sack of custard. Yet now, in our brain-centered age, we can't imagine how anyone could think that way.


GREGORY PAUL
Independent Researcher; Author, Dinosaurs of the Air

Richard Thaler seems to think that the concept of a flat earth was widely held for a long time. This is not really correct. Mariners have long understood that the earth is strongly curved and possibly a sphere. Ships disappear down over the horizon (I once saw this effect on the Chesapeake Bay and was shocked how fast the hull of a giant container ship dropped out of sight while the top of the superstructure was still easily visible); Polaris gets lower on the horizon as one sails south and eventually disappears; and so on. Over 2000 years ago the circumference of the planet was pretty accurately calculated by Eratosthenes using some clever geometry and sun-angle measurements. This knowledge may have been lost in the West during the Dark Ages, but it was well known to the European elites once communications with Constantinople, Alexandria and other eastern cities improved after the Crusades.
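Eratosthenes' calculation reduces to a single proportion, sketched below using the traditionally reported figures (a 7.2-degree shadow angle at Alexandria when the sun was overhead at Syene, and 5,000 stadia between the two cities; the metric conversion assumes a stadion of roughly 185 m, one of several values scholars have proposed):

```python
# Eratosthenes' estimate of Earth's circumference from one sun-angle
# measurement. Because the sun's rays are effectively parallel, the shadow
# angle at Alexandria equals the arc of the Earth between the two cities:
#   circumference / arc_distance = 360 / shadow_angle

def circumference(arc_distance, shadow_angle_deg):
    """Full circumference implied by an arc of known length and angle."""
    return arc_distance * 360.0 / shadow_angle_deg

stadia = circumference(5000, 7.2)
print(stadia)           # 250000.0 stadia, the figure usually attributed to him
print(stadia * 0.185)   # 46250.0 km at ~185 m per stadion (modern: ~40,000 km)
```

Depending on which ancient stadion one assumes, the result lands within roughly 2 to 16 percent of the modern value, remarkable for two shadow sticks and a distance estimate.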

When Columbus was trying to get a government to cough up the money for his trip west, he was not trying to convince patrons that the planet was a sphere. The problem was that the experts told the people with the money that the distance from Europe to Asia across the super ocean separating them was 14,000 miles, with no visible means of logistical support during the voyage (the perfect Bible did not mention extra continents being in the way). However, some works had come out claiming that Eratosthenes had messed up and the planet was much smaller (I've heard this was based on Biblical passages, and Columbus was very devout, but am not sure about that). Columbus figured it was 3-4,000 miles to the west, a skip and a hop compared to the horrendous route around Africa. When the Spanish monarchs finally kicked the last Muslims out of Iberia and were having fun picking on Jews, they decided, what the heck, to see what this Columbus fellow could do; the cost was just three little cargo vessels and their crews.

The story about the crews getting upset about sailing off the edge of the earth is probably a myth, since they knew better. The notion that Columbus was fighting the false knowledge of the flat earth apparently was invented in the late 1800s, in an effort to make him a great American symbol of the progress of science over superstition, in association with the 1892 celebrations.

As for a distinct example of lots of people believing in something that is scientifically wrong, the best I can think of is the various creation myths. These arose not only before the advent of modern science but persist in the various forms of creationism.


ALISON GOPNIK
Psychologist, UC, Berkeley; Author, The Philosophical Baby

There is interesting evidence that many once popular and evidence-resistant scientific belief systems are also developed spontaneously by many children. For example, children seem to develop a "vitalistic" theory of intuitive biology, rather like the Chinese concept of "chi", at around age 5, independently of what they are taught in school. Similarly, school-age children, even those with an explicitly atheist upbringing, develop ideas about God as an explanatory force at about 7, as part of an "intuitive teleology" that explains events in terms of agency.

The psychologist Tania Lombrozo has shown that even Harvard undergraduates who endorse evolution consistently interpret evolutionary claims in a teleological rather than mechanistic way (e.g., giraffes try to reach the high leaves and so develop longer necks). And we have shown that six-year-olds develop a notion of fully autonomous "free will" that is notoriously difficult to overturn. There is also a lot of evidence that scientific theories are built out of these everyday intuitive theories.

If, as we think, children use Bayesian techniques to develop intuitive theories of the world, based on the evidence they see around them, then it might, in some sense, be rational to hold on to these beliefs, which have the weight of accumulated prior experience. Other scientific beliefs, without a history of everyday confirmation, might be easier to overturn on scientific evidence alone.
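The intuition that heavily confirmed everyday theories should be hard to overturn can be made concrete with a simple Bayesian update. In the sketch below (all numbers invented for illustration), a theory that has quietly accumulated many weakly confirming observations builds up posterior odds so strong that a handful of sharply contrary demonstrations barely dents it:

```python
# Toy Bayesian illustration: two competing theories, "intuitive" vs
# "scientific". Each observation carries a likelihood ratio
# P(data | intuitive) / P(data | scientific); posterior odds are the
# prior odds multiplied by the product of the likelihood ratios.

def posterior(prior_p, likelihood_ratios):
    """Updated P(intuitive theory) after a sequence of observations."""
    odds = prior_p / (1.0 - prior_p)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Years of everyday experience: 200 observations, each only mildly
# favoring the intuitive theory (likelihood ratio 1.1).
p = posterior(0.5, [1.1] * 200)
print("belief after everyday experience:", p)        # very close to 1

# A science lesson: 10 demonstrations strongly favoring the scientific
# theory (likelihood ratio 0.2), applied on top of the accumulated prior.
p2 = posterior(p, [0.2] * 10)
print("belief after ten counter-demonstrations:", p2)  # still above 0.9
```

On these numbers the learner is behaving perfectly rationally; the stickiness comes entirely from the weight of prior evidence, not from any failure of updating.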


GEORGE DYSON
Science Historian; Author, Darwin Among the Machines

Many (but not all) scientists assumed the far side of the moon would turn out to look much the same as the side we are familiar with. "I was very enthusiastic about getting a picture of the other side of the moon," Herbert York, former advisor to President Eisenhower, told me in 1999. "And there were various ways of doing it, sooner or later. And I argued with Hornig [Donald Hornig, Chairman of the President's Science Advisory Committee] about it and he said, 'Why? It looks just like this side.' And it turned out it didn't."




60 MINUTES
November 18, 2010

J. CRAIG VENTER: DESIGNING LIFE

Dr. Craig Venter talks to Steve Kroft and takes him on a tour of his lab on "60 Minutes," Video

(CBS) The microbiologist whose scientists have already mapped the human genome and created what he calls "the first synthetic species" says the next breakthrough could be a flu vaccine that takes hours rather than months to produce.


...KROFT: "There are a lot of people in this country who don't think that you ought to screw around with nature."

VENTER: "We don't have too many choices now. We are a society that is one hundred percent dependent on science. We're going to go up in our population in the next 40 years; we can't deal with the population we have without destroying our environment."

KROFT: "But aren't you playing God?"

VENTER: "We're not playing anything. We're understanding the rules of life."

KROFT: "But that's more than studying life, that's changing life".

VENTER: "Well, domesticating animals was changing life, domesticating corn. When you do cross-breeding of plants, you're doing this blind experiment where you're just mixing DNA of different types of cells and just seeing what comes out of it."

KROFT: "This is a little different though, this is another step, isn't it?"

VENTER: "Yeah, now we're doing it in a deliberate design fashion with tiny bacteria. I think it's much healthier to do it based on some knowledge and a better understanding of life than to do it blindly and randomly." .

KROFT: "You know, I've asked two or three times, 'Do you think you're playing God?' I mean, do you believe in God?"

VENTER: "No. I believe the universe is far more wonderful than just assuming it was made by some higher power. I think the fact that these cells are software-driven machines and that software is DNA and that truly the secret of life is writing software, is pretty miraculous. Just seeing that process in the simplest forms that we're just witnessing is pretty stunning."

[...continue]
__

[CRAIG VENTER on Edge: A SHORT COURSE ON SYNTHETIC GENOMICS [7.30.09] The Edge Master Class 2009;LIFE: A GENE-CENTRIC VIEW [1.23.08] Craig Venter & Richard Dawkins: A Conversation in Munich; LIFE: WHAT A CONCEPT! [8.27.07] An Edge Special Event at Eastover Farm]



Nov 22, 2010

Ethical Technology

THE THIRD CULTURE

Curtis D. Carbonell

...the conversation about the role of the two great branches of learning is important, and still topical. C.P. Snow crystallized this debate in the mid-twentieth century with his suggestion that two cultures existed in the British academy, the literary and the scientific, and that they were at odds.

In his quest to argue that science will solve global social problems, why, Snow asked, should one be held responsible for knowing the works of Shakespeare but not for understanding the Second Law of Thermodynamics? His insight gave steam to an internecine intellectual fight that had surfaced a number of times in the past. (Today, one can chart the most recent "science wars" all the way back through Snow to Arnold and Huxley, on to the Romantic critiques of the Enlightenment Project, to the debate between the Ancients and Moderns, which revolved around the new science's assault on both Aristotelianism and Renaissance Humanism.)

However, what shouldn't be forgotten in the admission that the humanities will resist ultimate reduction is that there are those in the humanities who suffer from science envy. This envy was given impetus by entomologist and evolutionary theorist E.O. Wilson in his monograph, Consilience, wherein Wilson suggests that the humanities must move toward the methods of the sciences to remain relevant.

While this gentle shimmy sounds harmless enough, there are those in humanistic disciplines like literary studies who have taken such a move to heart. For example, any cursory examination into the nascent approach called Literary Darwinism reveals a loose confederacy of individuals who think literary texts are best read within Darwinian contexts (think, reading Jane Austen to understand how her characters represent universals in human behavior related to, say, their inclusive fitness). ...

...That term, 'third culture', was popularized by literary agent and Edge founder John Brockman in a response to Snow to suggest that a new culture is emerging that is displacing the traditional, literary intellectual with thinkers in the sciences who are addressing key concepts and categories found in the humanities; Richard Dawkins writing about religion or Carl Sagan expounding about the awe of the cosmos both come to mind as quintessential examples. One need only browse through the science section of the book store to see that a bridge between the two cultures has been built, with much of the traffic in the popular sphere going one way. ...



19 November 2010

QUÉ ES UN TECNOESCRITOR? [WHAT IS A TECHNOLOGY WRITER?]
By FK

[Translation:] ...[Kevin] Kelly and [Steven] Johnson (along with Nicholas Carr, author of "The Shallows," which explores what the Internet is doing to our brains) are clear examples of a profession lacking in these latitudes: the writer dedicated to thinking about our new environment of technology and science. They are "tecnoescritores," or science writers, such as the British biologist Richard Dawkins ("The Selfish Gene"), Daniel Dennett ("Darwin's Dangerous Idea"), psychologist Steven Pinker ("The Blank Slate"), Matt Ridley ("Genome"), Malcolm Gladwell ("Blink"), Bill Bryson ("A Short History of Nearly Everything"), Brian Greene ("The Elegant Universe"), Michio Kaku ("Physics of the Impossible"), Paul Davies ("The Last Three Minutes"), and many more, such as the hypermedia Stephen Hawking ("A Brief History of Time"). Direct descendants of Carl Sagan, Richard Feynman and Stephen Jay Gould, this is a breed of authors who write and release science from the laboratories. And, oddly enough (attention, Argentine publishers), they sell many books. It is true that in recent years some very interesting collections have appeared here, such as Ciencia que Ladra (Siglo XXI), led by the biologist Diego Golombek, who trains scientists and science communicators to tell stories that go beyond a cryptic paper or a forgettable news article. But one has to admit that, compared to the international market, this "literature" is still in the Primera B. Each in his own way, and located in what C.P. Snow called the "third culture" (that bridge between science and literature currently represented by the site Edge.org), the great science writers take a scientific publication, link it with literature, and in doing so take it one floor up. ...

[Spanish language original | Google Translation]



DIE WELT
15. November 2010

BUCH-TIPP [BOOK TIP]

Wolfgang W. Merkel

[Translation:] What idea will change everything? "Prediction is difficult, especially when it concerns the future." The quote about the difficulty of foreseeing the future is sometimes attributed to Karl Valentin, sometimes even to Mark Twain. How wrong technological visions can be is shown by the forecasts once made about the year 2000: from nuclear-powered cars to settlements on Mars. In this book, however, the cream of the global research community ventures forth with its outlooks in brief essays. A good book by sober scientists, not by technology dreamers. [EDITOR'S NOTE: See: This Will Change Everything]

[German language original | Amazon.de]



O Estado de S.Paulo/Culture
14 Novembro 2010 Domingo

REPÚBLICA SEM CIÊNCIA (REPUBLIC WITHOUT SCIENCE)

Daniel Piza

[Translation:] We were on the subject of the "third culture," and Brazil ignores its history of science, its scientists, and the links between the arts, sciences and humanities. It is true that Brazil has never received a Nobel Prize (neither in the sciences nor in the other areas, though some writers have deserved one), but it came close a few times (with Carlos Chagas and Jayme Tiomno, for example, and not counting Peter Medawar, the British laureate born in Petrópolis, who received the prize in Medicine and Physiology more than 50 years ago). And it may not have gotten there precisely because of the neglect dedicated to the subject. When one sees the vast literature on science that has emerged in the last 40 years and thinks of the Brazilian case, where only recently have names like Marcelo Gleiser and Fernando Reinach appeared, the melancholy of the comparison is inevitable.

The habit is worldwide, I know: it is like reading histories of modernism at the turn of the 20th century that do not talk about Einstein and Bohr, at most about Freud (who was a doctor and, contrary to what many think, quite familiar with brain physiology as a route to understanding mental complexity; today he would be impressed with the new scanning technologies). But Brazilian historians and sociologists not only ignore science in their portraits of the era, they also ignore its achievements. In the U.S., where there is so much religious conservatism, science has always had a sharp emphasis. And there initiatives appear such as the Edge website (www.edge.org), made to house the thoughts of authors such as Damasio, Dennett and Dawkins (to stay with the letter D) who try to connect scientific culture with humanistic culture. ...

...Art, by the way, has almost always been in dialogue with science, since each can add to the other's knowledge, or at least offer a healthy mutual defense. It has been so from the painting of Leonardo da Vinci (who developed glazing to be more faithful to visual perception) to the fiction of Ian McEwan (though I liked Solar, about a physicist involved with the issue of global warming, less than Saturday), by way of the refraction of light in Rembrandt and the angular concurrences of Picasso.

We need to rescue the role of scientists in Brazil's history and treat the subject much better in schools and publications. Although there are a few initiatives here and there, and despite the importance of a FAPESP, there is still much room for improvement in the production of, and thinking about, science. Consider the recent obituaries in local newspapers for names like Martin Gardner (who knocked down so many defenders of the existence of the paranormal and of UFOs) and Benoit Mandelbrot (inventor of the concept of fractals, whom some people took for a kind of mystic of irregularity, an intuitive holist) to see how small the emphasis on science is in our culture. The republic would gain much from the knowledge and method it could convey to its students.

[Portuguese language original | Google Translation]


subscribe

THE EDGE ANNUAL QUESTION BOOK SERIES
Edited by John Brockman

"An intellectual treasure trove"
San Francisco Chronicle


THIS WILL CHANGE EVERYTHING: IDEAS THAT WILL SHAPE THE FUTURE
(*)
Edited by John Brockman

Harper Perennial

NOW IN BOOKSTORES AND ONLINE!

[click to enlarge]

Contributors include: RICHARD DAWKINS on cross-species breeding; IAN McEWAN on the remote frontiers of solar energy; FREEMAN DYSON on radiotelepathy; STEVEN PINKER on the perils and potential of direct-to-consumer genomics; SAM HARRIS on mind-reading technology; NASSIM NICHOLAS TALEB on the end of precise knowledge; CHRIS ANDERSON on how the Internet will revolutionize education; IRENE PEPPERBERG on unlocking the secrets of the brain; LISA RANDALL on the power of instantaneous information; BRIAN ENO on the battle between hope and fear; J. CRAIG VENTER on rewriting DNA; FRANK WILCZEK on mastering matter through quantum physics.


"a provocative, demanding clutch of essays covering everything from gene splicing to global warming to intelligence, both artificial and human, to immortality... the way Brockman interlaces essays about research on the frontiers of science with ones on artistic vision, education, psychology and economics is sure to buzz any brain." (Chicago Sun-Times)

"11 books you must read — Curl up with these reads on days when you just don't want to do anything else: 5. John Brockman's This Will Change Everything: Ideas That Will Shape the Future" (Forbes India)

"Full of ideas wild (neurocosmetics, "resizing ourselves," "intuit[ing] in six dimensions") and more close-to-home ("Basketball and Science Camps," solar technology"), this volume offers dozens of ingenious ways to think about progress" (Publishers Weekly — Starred Review)

"A stellar cast of intellectuals ... a stunning array of responses...Perfect for: anyone who wants to know what the big thinkers will be chewing on in 2010. " (New Scientist)

"Pouring over these pages is like attending a dinner party where every guest is brilliant and captivating and only wants to speak with you—overwhelming, but an experience to savor." (Seed)

* Based on The Edge Annual Question — 2009: "What Will Change Everything?"

Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.