Edge in the News
The distance between a neurone and a human mind seems very great, and to many philosophers and scientists quite impossible for science to cross. Even if minds are made from brains, and brains are made from billions of neurones, there seems no way to get from one sort of thing to the other.
Nicholas Humphrey's whole life as a scientist has been spent on that journey: in the 1960s he was part of the first team to discover how to record the activity of single neurones in a monkey's visual cortex; nearly 40 years later, he has reached a grand theory of how consciousness might have arisen in a Darwinian world, and why it might give us reasons to live.
The journey has been like the path of a neurone, full of twists and branchings and decisive contacts that altered its course. He has worked with monkeys in laboratories and in the wild. He has been a media don, a campaigner against nuclear weapons and the holder of a chair in parapsychological research who was dedicated to debunking even the possibility of telepathy or survival after death. He is an atheist, and the man who suggested to Richard Dawkins the analogy of viruses of the mind for religions; yet nowadays he talks as if spirituality were the thing that makes us human.
There is a self-confidence to this rather headlong life which stems, he thinks, in part from his background in the aristocracy of Cambridge. His father was an immunologist and FRS, his mother a psychiatrist and niece of John Maynard Keynes. In all, six of his relatives were fellows of the Royal Society, and one of his grandfathers, AV Hill, had won a Nobel prize. He never doubted he wanted to be a scientist: "It was what everyone around me was doing; the idea that I could have been professional at any other thing never really crossed my mind. I have to say there was a certain snobbishness about our attitudes. Anyone who didn't live in a large house didn't really count. Anyone who didn't have 15 cousins didn't count, and anyone who didn't have tea with a Nobel-winning grandpa wasn't really worth talking to either."
This sounds arrogant, but it is arrogance recollected after chastening. His career, which started out with great promise, has not run entirely smoothly. At first he wanted to be a physicist. At Westminster School, where he was educated, there was an inspired science teacher who devised a way for his pupils to measure the speed of light as it travelled the length of a London street and back. But when Humphrey went up to Trinity in 1961 on a scholarship to read mathematics and physics, he was disappointed in the course. He began to be fascinated by biology instead.
Though to many scientists biology feels messy and incoherent, to Humphrey it was much more logical and elegant than chemistry or physics: "Once I got into biology my eyes were open to a world of phenomena, a world of explanations, which had a kind of perfection I hadn't found before. There is no unifying theory in chemistry like evolutionary explanations in biology." As an ambitious young man, he set his sights on the biggest biological mystery he could find - human consciousness - so he switched to psychology, and began to work with monkeys under Larry Weiskrantz.
Humphrey was part of the team that first discovered how to record the activity of single nerve cells in a monkey's brain. Two other members later got Nobel prizes for this work, which underlies an enormous amount of subsequent research, since it made it possible to trace the ways in which the visual cortex receives and processes signals from the eyes. It was known in principle what was happening, but now the exact brain cells involved in image processing could be found and monitored.
His next discovery was wholly unexpected and is still hard to believe. In the laboratory was a monkey named Helen, who had been blinded when her visual cortex was cut with a scalpel. Humphrey decided to see what contact he could establish with the monkey, and got enough reaction to keep going. Over a period of seven years, he managed to coax out a sort of sense of sight. He played with the monkey, took her for walks, and did everything to persuade her that she could see: "Through this very intense and personal relationship - daily wondering what it was like to be her, and trying to get inside her mind - I began to get, I think, some insights into the general nature of consciousness. It was like being part of a miracle. It wasn't really as if I had touched her with a healing hand, and made the blind see, but there are all those parables and models - and it was a bit like that."
Even four decades later, his excitement and pain are evident when he thinks of this. "It was a very sad moment when the monkey was killed. Of course she had to be. It was very important to know exactly what the lesion was. So [they] did it while I was away. I found it quite disturbing, though I think the research was interesting and important ... I wouldn't want to criticise anyone else who'd want to do it."
His next project was even more ambitious: to work on the aesthetic senses of a monkey. "It wasn't - not exactly - to make amends, but something like that was on my mind when I decided to work on aesthetics. I thought I would find out what monkeys would like doing if they had the choice."
This work was, very largely, a failure. He found that monkeys were strongly affected by colour, but shapes and sounds meant little to them. His first marriage was breaking up (he is now married, with two children, to an American psychologist), so in 1972 he went off to Rwanda for three months, to study mountain gorillas with Dian Fossey. Again, the question of what made us different arose: what had been the spur, or the reward, for human evolution, for our language and our consciousness. The answer he then came up with has been very influential. Variously known as "Machiavellian" or "social" intelligence, it is the idea that our brains evolved to cope not with the world around us, but with the people - or proto-people - of our ancestors' social groups.
Consciousness, in this theory, is a knowledge of what is going on in our own minds, and we have it so that we can better understand what is going on in the minds of those around us, so that we can manipulate them and avoid being manipulated in our turn. This fits human consciousness into a normal biological framework: it offers the possessor of bigger and better brains the kind of advantage that natural selection can see and work on.
For most of the 20th century consciousness had been out of bounds for scientists, and even for behavioural psychologists. Humphrey's original theory was one of the first signs that it could become a legitimate and fruitful area of scientific study. By the late 1970s he was a rather glamorous figure, living with the actress Susannah York, agitating against nuclear weapons - "We were always up on plinths in Trafalgar Square" - and in 1982 he was invited by Channel 4 to write and present a 10-part series on his theory. So he asked for leave of absence from the university and, when it was refused, resigned to make the programmes.
"I have tended to think that life's there as an exploration - don't pass up opportunities, whatever they are - and to have a certain sense that I'll be OK. At certain points I haven't. I've taken risks and then I'm very nearly not OK." He likes to quote Lord Byron: "The great object of life is sensation - to feel that we exist, even though in pain."
When the television series was finished, he could not get another academic job in England. Margaret Thatcher had come to power and the universities were shrinking. He was rescued by his friend Daniel Dennett, who found him a job at Tufts University, near Boston, and the two men worked closely together for years. In the mid-1990s he was able to move back to Cambridge, to a chair devoted to parapsychological research: since the whole burden of his interest in the subject was that he did not believe in it, he wrote Soul Searching, a book arguing that telepathy must be in principle impossible, and that Jesus was a conjuring charlatan like Uri Geller.
Yet, at the same time, he was developing a new and more complex theory of consciousness, which puts something like the soul at the centre of human existence. In his new theory the clue to the "hard problem" of consciousness - the problem of why and how minds appear from matter - is attacked head-on. The fact that we find it so difficult and so threatening to believe, as he says, "that there is nothing more to human experience than the churning of chemicals and electrons within the brain" seems to him to contain the kernel of the solution to the hard problem. If it is so difficult for us to think that way, then the difficulty might in some sense have been designed by natural selection.
Human beings, he writes, "have a self that seems to inhabit a separate universe of spiritual being. As the subjects of something so mysterious and strange, we humans gain new confidence and interest in our own survival, a new interest in other people, too. This feeds right back to our biological fitness, in both obvious and subtle ways. It makes us more fascinating and more fascinated, more determined to pursue lives wherever they will take us. In short, more like the amazing piece of work that humans are."
The theory is, like every other theory of consciousness, extremely controversial. After 200 years in which science has appeared to dethrone God and deny the possibility of the soul, Humphrey is the first man to claim that science can agree that we have souls - but that it was natural selection, not God, which gave us them.
DESIGN FLAWS
John Tyler Bonner reviews Intelligent Thought: Science Versus the Intelligent Design Movement edited by John Brockman
Editor's Summary
27 July 2006
For the defence
In his book Intelligent Thought: Science Versus the Intelligent Design Movement, John Brockman marshals the case for evolutionary science against its 'ID' detractors.
Contributors include Richard Dawkins, saying among other things that "The supernatural explanation fails to explain because it ducks the responsibility to explain itself". And Steven Pinker: "An evolutionary understanding of the human condition, far from being incompatible with a moral sense, can explain why we have one." This book should draw the fire of the ID web sites for a while.
Design flaws
John Tyler Bonner
Destroying the argument that intelligent design has a scientific basis.
John Brockman's edited volume Intelligent Thought is largely a series of essays by scientists that make clear, often eloquently, how untenable the scientific basis of intelligent design really is. ...
If intelligent design has anything to say in its favour, it is that it spawned this book. Many of the essays are fascinating and fun to read, and tell us something new.
Intelligent Thought is a book for scientists; that is, for those who see evolutionary biology as a science. If you are a creationist you will be unmoved; there is no point in looking at the evidence.
The new book My Einstein: Essays by Twenty-Four of the World’s Leading Thinkers on the Man, His Work, and His Legacy (Pantheon) attempts the difficult task of putting a totally unique figure from a highly specialized world into some type of recognizable, easily discerned perspective. Editor John Brockman and his staff mostly succeed in making their arguments cogent, their analysis straightforward and their assessments presented in a fashion that won’t embarrass or anger the scientifically literate, but will also hold the attention of readers who normally avoid books containing discussions about quantum physics and relativity.
My Einstein doesn’t oversimplify nor unnecessarily complicate its views, opinions and feelings regarding Einstein’s impact and life. But it does offer those of us in the non-scientific community a means for better understanding and appreciating both his incomparable intellect and the practical effect of his contributions.
My Einstein: Essays by 24 of the World's Leading Thinkers, edited by John Brockman (Pantheon, 261 pages, $25). Now that jokes about Einstein's appeal to the opposite sex have become Letterman monologue staples (as if it were news that genius might not preclude other more sanguine enthusiasms) we can see that, in the year following the centennial of his most ground-breaking work, Albert Einstein remains our culture's folk paradigm of genius. (Newton, his predecessor, was, by comparison, magnificently eloquent but pugnacious and almost no fun at all — a prig who needed falling apples to humanize him.)
These essays are irresistible ... the charm of the book is that its often star-struck writers so freely wanted to be connected to entirely non-theoretical humanity, their own and Einstein's.
PICK OF THE PAPERBACKS
By Michael Bhaskar
What We Believe But Cannot Prove
ed by John Brockman (Pocket Books, £7.99)
Scientists occasionally give the impression that belief is something best left to other people. Scientists know, and, what's more, they can prove it. In this refreshing anthology, a litany of heavyweight names abandon any such pretence and let rip with startling speculations on everything from the size of the universe to the consciousness of cockroaches.
In a collection deftly introduced by Ian McEwan, we find Richard Dawkins musing on a universal principle of evolution, Martin Rees postulating the existence of aliens, and Jared Diamond discussing when humans first arrived in the Americas. By unleashing scientists from the rigours of established method, we gain fascinating glimpses into the future of arcane disciplines few fully understand. Even if there is considerable overlap in several of the entries, there is a strangely addictive quality to the clipped essay format.
He was a sexy flirt. He admitted to having difficulties with mathematics. He was only 12 when he decided that "the stories of the Bible could not be true and became a fanatical freethinker." His theory of relativity, which changed the way we view the world, "came from thinking about what it would be like to ride along on a beam of light." "The story goes that [he] liked to sleep ten hours a night -- unless he was working very hard on an idea; then it was eleven."
All these observations appear in My Einstein: Essays by Twenty-four of the World's Leading Thinkers on the Man, His Work, and His Legacy, edited by John Brockman (Pantheon, $25), whose own devotion to "relative" thinking can be discerned in the title of his previous book, By the Late John Brockman. The essayists include Jeremy Bernstein, Gino C. Segrè and Maria Spiropulu, and the titles of their pieces range from the vaudevillian ("Einstein, Moe, and Joe") to the tantalizing ("The Greatest Discovery Einstein Didn't Make").
My Einstein delivers even more than its lengthy title promises. Philosopher Marcelo Gleiser's contribution helps explain why Einstein's ideas "became an obsession to so many. . . . In a world torn apart by the bloodiest war of all time, this Jewish scientist was proclaiming the existence of a reality wherein space and time are unified in a four-dimensional space-time, where space may contract and time may slow down, where matter is nothing but lumped-up energy. Who wouldn't want to step out of the miserable state that Europe was in in the early 1920s and into the rarefied atmosphere of a world beyond the senses?"
-- Dennis Drabelle
...But pride has always been haunted by fear that public acknowledgment of Jewish achievement could fuel the perception of "Jewish domination" of institutions. And any characterization of Jews in biological terms smacks of Nazi pseudoscience about "the Jewish race." A team of scientists from the University of Utah recently strode into this minefield with their article "Natural History of Ashkenazi Intelligence," which was published online in the Journal of Biosocial Science a year ago, and was soon publicized in The New York Times, The Economist, and on the cover of New York magazine.
The Utah researchers Gregory Cochran, Jason Hardy, and Henry Harpending (henceforth CH&H) proposed that Ashkenazi Jews have a genetic advantage in intelligence, and that the advantage arose from natural selection for success in middleman occupations (moneylending, selling, and estate management) during the first millennium of their existence in northern Europe, from about 800 C.E. to 1600 C.E. Since rapid selection of a single trait often brings along deleterious byproducts, this evolutionary history also bequeathed the genetic diseases known to be common among Ashkenazim, such as Tay-Sachs and Gaucher's.
The CH&H study quickly became a target of harsh denunciation and morbid fascination. It raises two questions. How good is the evidence for this audacious hypothesis? And what, if any, are the political and moral implications?
According to critics, ID is neither observable nor repeatable.
ID or 'intelligent design' is a movement that has been in the news recently for its alternative views about evolution. ID proponents allege that science shouldn't be limited to naturalism, and shouldn't demand the adoption of a naturalistic philosophy that dismisses out of hand any explanation containing a supernatural cause, explains an entry for the phrase in Wikipedia.
ID has been the focus of lawsuits, with controversy revolving around issues such as whether ID can be defined as science, and taught in schools. According to critics, ID is neither observable nor repeatable, thus violating 'the scientific requirement of falsifiability'.
Pitching science against the ID movement, John Brockman has edited Intelligent Thought, from Vintage (www.vintagebooks.com). The collection of 16 essays from experts begins with Jerry A. Coyne's piece about evidence of evolution buried in our DNA.
"Our genome is a veritable farrago of non-functional DNA, including many inactive `pseudogenes' that were functional in our ancestors," he notes. "Why do humans, unlike most mammals, require vitamin C in their diet? Because primates cannot synthesise this essential nutrient from simpler chemicals."
It seems we still carry all the genes for synthesising vitamin C though the gene used for the last step in this pathway "was inactivated by mutations 40 million years ago, probably because it was unnecessary in fruit-eating primates."
Tim D. White's piece takes one through volcanic rock samples 'fingerprinted at the Los Alamos National Laboratory', and fossils aged millions of years. "Today, evolution is the bedrock of biology, from medicine to molecules, from AIDS to zebras," declares White.
"Biologists can't afford to ignore the interconnectedness of living things, much as politicians can't understand people, institutions or countries without understanding their histories.
'Intelligent aliens' is the focus of Richard Dawkins. How would we recognise intelligence in a pattern of radio waves picked up by a giant parabolic dish, and say it is from deep space and not a hoax, asks Dawkins.
The universe can perform approximately 10^105 elementary operations per second on about 10^90 bits, writes Seth Lloyd in a chapter titled 'How smart is the universe?' One learns that over the 13.8 billion years since the Big Bang, the universe has performed about 10^122 operations.
He looks closely at how the universe processes information and states that atoms register bits the same way the magnetic bits in a computer's hard drive do. With magnets flipping directions and changing bit values, "every atom and elementary particle in the universe registers and processes information."
Most bits are humble, explains Lloyd. "But some bits lead more interesting lives. Every time a neuron fires in your brain, for example, it lets loose a torrent of bits. The cascade of bits in neural signals is the information processing that underlies your thoughts." To him, "Sex is a glorious burst of information processing designed to pass on and transform" the billions of bits of genetic information locked in the nuclei of the cells. "The more microscopic the form of information processing, the longer it has been going on."
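As a rough consistency check on the figures quoted above (a reader's back-of-envelope sketch, not anything taken from Lloyd's chapter itself), multiplying the quoted rate of elementary operations per second by the elapsed time since the Big Bang lands in the same ballpark as the quoted total:

    import math

    # Figures as quoted in the review above (taken at face value for this check)
    ops_per_second_exponent = 105     # ~10^105 elementary operations per second
    age_of_universe_years = 13.8e9    # ~13.8 billion years since the Big Bang

    seconds_per_year = 365.25 * 24 * 3600
    age_in_seconds = age_of_universe_years * seconds_per_year   # ~4.4e17 seconds

    # Total operations ~ rate x elapsed time; add exponents (log10) so the
    # numbers stay representable as ordinary floats.
    total_ops_exponent = ops_per_second_exponent + math.log10(age_in_seconds)
    print(f"total operations ~ 10^{total_ops_exponent:.1f}")
    # prints roughly 10^122.6 -- the same order of magnitude as the quoted 10^122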
Worth a read for the defence of science it puts up bravely.
Tailpiece
"What's the moral of the Gates story?"
"That we should do charity?"
"No. You should first gross a few billions."
EVER SINCE musician, writer, and technological visionary Jaron Lanier coined the term "virtual reality" in the early 1980s, and headed up efforts to implement the idea, he's been a member of the digerati in excellent standing. But he's an anxious member, known to raise alarms about just those big ideas and grand ambitions of the computer revolution that happen to excite the most enthusiasm among his peers. That was the case with his contrarian essay, "One Half of a Manifesto," in 2000. He's done it again in a new piece, "Digital Maoism," which has roiled the Internet since it was posted at edge.org on May 30.
In "One Half of a Manifesto," Lanier attacked what he dubbed "cybernetic totalism," an overweening intellectual synthesis in which mind, brain, life itself, and the entire physical universe are viewed as machines of a kind, controlled by processes not unlike those driving a computer. This digital-age "dogma," he argued, got a boost from the era's new and "overwhelmingly powerful technologies," which also obscured the dangers inherent in totalist thinking. People who would steer clear of Marxism, for example, might fall for an even more grandiose world view if it had digital cachet.
The former options trader, now 39 and living in St. Petersburg, Florida, was so fascinated by the free-software movement that in 2001 he used his private fortune to found Wikipedia, the freely accessible Internet encyclopedia that anyone can help write as a "Wikipedian".
In 2004 Wales also founded the commercial company Wikia, which runs an advertising-financed hosting service and several commercial community services. The name Wikipedia comes from the Hawaiian word "wiki wiki" and translates roughly as "quick encyclopedia". The project is run by the international non-profit Wikimedia Foundation, chaired by Jimmy Wales, and is continually developed by volunteer authors, organisers and software specialists all over the world. The foundation's goal is to make humanity's knowledge accessible to everyone in the world through Wikipedia and other projects. A German chapter of Wikimedia has existed since 2004.
Wikipedia currently has more than four million entries in around 200 languages. The German-language site (http://de.wikipedia.org), with more than 400,000 entries, is the second largest after the English-language one and, per head of the German-speaking population, the most heavily used.
A scientific study by the British journal "Nature" last December concluded that, with an average of four errors per science article, Wikipedia plays in the same league as the renowned "Encyclopaedia Britannica" (three errors per article). In both encyclopedias, errors are the rule rather than the exception.
At almost the same time, Wikipedia came under criticism because its open standard makes it possible to falsify entries. Most recently, the web forum www.edge.org hosted a debate, sparked by Wikipedia, about the limits of collectivism on the Internet.
Jimmy Wales spoke on 21 June in Königswinter as guest speaker at the 5th Petersberg Forum on the theme of "power", at the invitation of the Verlag für die Deutsche Wirtschaft.
It's fairly safe to say that most Canadians couldn't tell a wormhole from a doughnut hole, nor explain the basic mechanics of global warming, nor distinguish between Fermat and Fibonacci.
It's all too easy to put this down to simple fear of science, but that doesn't exculpate us from attempting to understand at least some of what is the best existing explanation -- pace various fundamentalisms -- for the workings of the universe and its contents. Of course, science has its enemies -- not just among the hyper-religious, but also many postmodernists, who see it as simply one among a competing array of equally valid master narratives. But at least ever since Aristotle, mankind has been consumed by a desire to understand the universe and our place in it. So why should Globe Books be any different? Our commitment to reviewing science books is part curiosity, part missionary. But we don't get to nearly as many as we'd like, so I offer a breathless roster of new titles well worth your consideration.
What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Uncertainty. More than 100 minds, some doubtless great (including Ian McEwan, Robert Sapolsky, Steven Pinker, Jared Diamond and Rebecca Goldstein), ponder the question: What do you believe to be true even though you cannot prove it? For me, the answer is Sherlock Holmes, case proved in . . .
Copernicus' dangerous idea, rejected by the Catholic Church, had seven parts: 1) There is no one center in the universe; 2) The Earth's center is not the center of the universe; 3) The center of the universe is near the sun; 4) The distance from the Earth to the sun is imperceptible compared with the distance to the stars; 5) The rotation of the Earth accounts for the apparent daily rotation of the stars; 6) The apparent annual cycle of movements of the sun is caused by the Earth revolving around the sun; and 7) The apparent retrograde motion of the planets is caused by the motion of the Earth, from which one observes. ...
In January, GMLc mentioned Edge Foundation Inc., an online group of scholars and scientists, and its annual Big Question. Answers to last year's Big Question, 'What do you believe but cannot prove?' have been published this year in book form. ... The 2006 question was 'What is your dangerous idea?'
Intelligent designer? No: we have a bungling consistent evolver. Or maybe an adaptive changer. Rather an odd chap, that God...
Science journalism is a demanding profession, and the list of its great practitioners is not long. Even shorter, however, is the list of professional scientists who write engaging and accessible prose - who write, in short, excellent popular science. The literary agent for a large subset of that group is John Brockman, himself an author as well as literary entrepreneur. In "Intelligent Thought" (Vintage, 272 pages, $14), he has assembled a set of 16 essays, each responding to the current, anti-evolution Intelligent Design Movement (IDM), and the authors include some of the best-known science writers.
The war (it must be so named) between science and the fundamentalist faith-driven IDM is of deeply troubling import for science education, and for science itself - thus inevitably for contemporary culture. How serious the implications are has only recently been recognized, probably too late for a reasonable cessation of hostilities. The wake-up call seems to have been national coverage, in all the media, of the "Dover" trial, which ended in December, 2005. In it, the plaintiffs - parents and teachers in the Dover, Penn., school district - sought relief from an action of the district's Board of Education, which had in effect mandated the addition of Intelligent Design Theory (so-called) to the public school biology curriculum and classrooms. Presiding over the lengthy trial was U.S. District Judge John E. Jones, III. An extract from his painstaking and scholarly opinion is an appendix to this book. It is perhaps its most immediately valuable contribution. What are these often eloquent essays about, are they needed, and are they helpful?
The contributors represent a broad range of scientific disciplines. Richard Dawkins, for example, is a noted evolutionary biologist, as are Jerry Coyne and Neil Shubin. Leonard Susskind is a theoretical physicist; so is Lee Smolin. Greatly respected are philosopher-cognitive scientist Daniel Dennett; paleontologists Tim White and Scott Sampson; psychologists Steven Pinker, Nicholas Humphrey, and Marc Hauser; physicists Seth Lloyd and Lisa Randall; mathematical biologist Stuart Kauffman; anthropologist Scott Atran, and historian of science and behaviorist Frank Sulloway.
In the opening essay, "Intelligent Design: The Faith That Dare Not Speak Its Name," Mr. Coyne sets forth the argument that the IDM is motivated by religion and is, rather than serious scholarship, a faith-based attack on the architecture and trustworthiness of natural science. This is a strong but by now routine presentation of the case, and Mr. Coyne's expert treatments of it have appeared elsewhere, for example in the New Republic. The prolific Mr. Dennett writes on "The Hoax of Intelligent Design and How It Was Perpetrated." Hoax is a belligerent word, but the argument supporting it is solid.
Mr. Dennett's essay is not a paper-trail of the IDM: There is no such thing in this book - a significant lack. But a rich paper trail certainly exists. The IDM's history - with documentation - was presented in Harrisburg, Penn., by plaintiffs' witness Barbara Forrest. It was eye-opening and central to the Dover outcome. In the trial, the IDM's attempt on the science curriculum was ruled unconstitutional. Mr. Dennett's contribution is a sharp exposé of the IDM's logical and epistemological blunders.
Mr. Humphrey, examining the certainty that consciousness itself is a product of evolution, explains why it must be that, and presents a delicious paradox of consciousness research: An evolving consciousness among higher animals must have produced the insistent denial in us - conscious animals - that consciousness has evolved. Mr. White offers a short but authoritative review of hominid paleontology. We have today an embarrassment of riches in what were once called "missing links": our own, non-human ancestors, as well as those of many other contemporary vertebrates. There is no longer any question that our species had ancestors.
Mr. Dawkins dissects with eloquence the illusion of intelligent-agent design in natural objects. Mr. Sulloway's contribution is a short but incisive account of Darwin's initial failure to understand what he saw and collected in the Galapagos, and his subsequent epiphany on the meaning of those observations for "the species question," that is, for belief in the immutability of the biblical "kinds."
Steven Pinker addresses the common fear underlying most forms of resistance to evolution. It gives rise to the ancient claim that without revealed religion and its key principle - that humankind is of special concern to and under continuous observation by a powerful God - the moral order would collapse; we would succumb to a destructive anarchy. But the evidence is clear that all humans possess a moral sense independently of the details of their religion, if any, and that religion in us is a plausible, indeed an inevitable, consequence of evolutionary history.
This volume has other pleasures, including Lee Smolin on several forms of the Anthropic Principle and the relevance thereto of recent cosmology requiring a multiverse, rather than "the universe"; Stuart Kauffman, whose mathematics of self-organization is often misunderstood as a denial of Darwinism, clarifying his position in his essay in no uncertain terms; and Lisa Randall, who offers a theoretical physicist's view of the facts of evolution and the "theory" of intelligent design, from which she derives the conclusion that
Whoever is responsible [for the history of life] is just trying out various possibilities. We don't have an intelligent designer (ID), we have a bungling consistent evolver (BCE). Or maybe an adaptive changer (AC). In fact, what we have in the most economical interpretation is, of course, evolution.
This collection is helpful but not because it provides the primary knowledge base for the current effort to limit the impact of the IDM - a politically potent hoax with an excellent public relations machine and adequate funding. The necessary primary sources on the IDM and on the relevant science are already available in excellent recent books and in a rising stream of papers in the relevant scientific literature and on the Internet. Nothing coming from these reliable scientific sources constitutes or implies the existence of a "conflict" of "theories."
There is no scientific conflict. ID is not a theory in the ordinary sense of science, and it is certainly not a reputable "alternate view" of the planet's life. It has no unique content other than its claim for the existence of a designer. It is not worthy of the time it would take away from real science in the schools, where the time is already far too short. It is in fact the denial of theory, supported only by unsupported claims of flaws in Darwinism. No positive scientific evidence has ever been offered for ID.
We need this book because its authors have name recognition with the general reading public, because they write well, and because the fight will not end any time soon. Humanity needs to come to grips, sooner rather than later, with its biological meanings, and with the values and anti-values of its religious belief systems. The fight is just beginning. If the real values of religion and spirituality, which include humility before the wonders of nature, are to survive our rising tastes for religious war and destruction, then more than just an elite among us must understand science - and what it yields as description of physical reality through deep time. The more often the small faction of us who read can pause to browse engaging books like "Intelligent Thought," the better is the chance that we can stop the impetus of Homo sapiens toward self-destruction.
Mr. Gross last wrote for these pages about Charles Darwin.
prostoalex writes: The New York Times Technology section this weekend is running an extensive article on Wikipedia and recent changes to the editorial policy. ...
by ryrivard: First, it wasn't just the "technology" section; it was on the front page of the National Edition.
Second, Wikipedia is damned in both directions by the media: They are either too open and so all sorts of loonies can post whatever they want. Or, when they close up a bit, they are abandoning their own principles.
Anyone who hasn't read it needs to read DIGITAL MAOISM: The Hazards of the New Online Collectivism by Jaron Lanier [edge.org] and the spirited reply [edge.org]...
(Translation and Introduction by Andrian Kreye):
In the early '90s, computer scientist and musician Jaron Lanier was one of the first visionaries of a digital culture. He has taught computer science at universities including Columbia, Yale and NYU. At the end of the '90s he led the work on the academic Internet2. As a musician he has worked with people like Philip Glass, Ornette Coleman and George Clinton. Jaron Lanier wrote the following essay, 'Digital Maoism', for the series 'Original Edge Essays' on the online forum of the same name (www.edge.org), where the text launched a heated debate about the cultural qualities of the Internet, with the participation of Wikipedia founders Larry Sanger and Jimmy Wales, computer expert Esther Dyson and media thinker Douglas Rushkoff.
To wiki or not to wiki? That is the question.
Whether ‘tis nobler to plunge in and write a few Wikipedia entries on subjects regarding which one has some expertise; and also, p'raps, to revise some of the weaker articles already available there...
Or rather, taking arms against a sea of mediocrity, to mock the whole concept of an open-source, online encyclopedia -- that bastard spawn of “American Idol” and a sixth grader’s report copied word-for-word from the World Book....
Hamlet, of course, was nothing if not ambivalent – and my attitude towards how to deal with Wikipedia is comparably indecisive. Six years into its existence, there are now something in the neighborhood of 2 million entries, in various languages, ranging in length from one sentence to thousands of words.
They are prepared and edited by an ad hoc community of contributors. There is no definitive iteration of a Wikipedia article: It can be added to, revised, or completely rewritten by anyone who cares to take the time.
Strictly speaking, not all wiki pages are Wikipedia entries. As this useful item explains, a wiki is a generic term applying to a Web page format that is more or less open to interaction and revision. In some cases, access to the page is limited to the members of a wiki community. With Wikipedia, only a very modest level of control is exercised by administrators. The result is a wiki-based reference tool that is open to writers putting forward truth, falsehood, and all the shades of gray in between.
In other words, each entry is just as trustworthy as whoever last worked on it. And because items are unsigned, the very notion of accountability is digitized out of existence.
Yet Wikipedia now seems even more unavoidable than it is unreliable. Do a search for any given subject, and chances are good that one or more Wikipedia articles will be among the top results you get back.
Nor is use of Wikipedia limited to people who lack other information resources. My own experience is probably more common than anyone would care to admit. I have a personal library of several thousand volumes (including a range of both generalist and specialist reference books) and live in a city that is home to at least three universities with open-stack collections. And that’s not counting access to the Library of Congress.
The expression “data out the wazoo” may apply. Still, rare is the week when I don’t glance over at least half a dozen articles from Wikipedia. (As someone once said about the comic strip “Nancy,” reading it usually takes less time than deciding not to do so.)
Basic cognitive literacy includes the ability to evaluate the strengths and the limitations of any source of information. Wikipedia is usually worth consulting simply for the references at the end of an article -- often with links to other online resources. Wikipedia is by no means a definitive reference work, but it’s not necessarily the worst place to start.
Not that everyone uses it that way, of course. Consider a recent discussion between a reference librarian and a staff member working for an important policy-making arm of the U.S. government. The librarian asked what information sources the staffer relied on most often for her work. Without hesitation, she answered: “Google and Wikipedia.” In fact, she seldom used anything else.
Coming from a junior-high student, this would be disappointing. From someone in a position of power, it is well beyond worrisome. But what is there to do about it? Apart, that is, from indulging in Menckenesque ruminations about the mule-like stupidity of the American booboisie?
Sure, we want our students, readers, and fellow citizens to become more astute in their use of the available tools for learning about the world. (Hope springs eternal!) But what is to be done in the meantime?
Given the situation at hand, what is the responsibility of people who do have some level of competence? Is there some obligation to prepare adequate Wikipedia entries?
Or is that a waste of time and effort? If so, what’s the alternative? Or is there one? Luddism is sometimes a temptation – but, as solutions go, not so practical.
I throw these questions out without having yet formulated a cohesive (let alone cogent) answer to any of them. At one level, it is a matter for personal judgment. An economic matter, even. You have to decide whether improving this one element of public life is a good use of your resources.
At the same time, it’s worth keeping in mind that Wikipedia is not just one more new gizmo arriving on the scene. It is not just another way to shrink the American attention span that much closer to the duration of a subatomic particle. How you relate to it (whether you chip in, or rail against it) is even, arguably, a matter of long-term historical consequence. For in a way, Wikipedia is now 70 years old.
It was in 1936 that H.G. Wells, during a lecture in London, began presenting the case for what he called a “world encyclopedia” – an international project to synthesize and make readily available the latest scientific and scholarly work in all fields. Copies would be made available all over the planet. To keep pace with the constant growth of knowledge, it would be revised and updated constantly. (An essay on the same theme that Wells published the following year is available online.)
A project on this scale would be too vast for publication in the old-fashioned format of the printed book. Besides, whole sections of the work would be rewritten frequently. And so Wells came up with an elegant solution. The world encyclopedia would be published and distributed using a technological development little-known to his readers: microfilm.
Okay, so there was that slight gap between the Wellsian conception and the Wikipedian consummation. But the ambition is quite similar -- the creation of “the largest encyclopedia in history, both in terms of breadth and depth” (as the FAQ describes Wikipedia’s goal).
Yet there are differences that go beyond the delivery system. Wells believed in expertise. He had a firm faith in the value of exact knowledge, and saw an important role for the highly educated in creating the future. Indeed, that is something of an understatement: Wells had a penchant for creating utopian scenarios in which the best and the brightest organized themselves to take the reins of progress and guide human evolution to a new level.
Sometimes that vision took more or less salutary forms. After the first World War, he coined a once-famous saying that our future was a race between education and disaster. In other moods, he was prone to imagining the benefits of quasi-dictatorial rule by the gifted. What makes Wells a fascinating writer, rather than just a somewhat scary one, is that he also had a streak of fierce pessimism about whether his projections would work out. His final book, published a few months before his death in 1946, was a depressing little volume called The Mind at the End of Its Tether, which was a study in pure worry.
The title Wells gave to his encyclopedia project is revealing: when he pulled his various essays on the topic together into a book, he called it World Brain. The researchers and writers he imagined pooling their resources would be the faculty of a kind of super-university, with the globe as its campus. But it would do even more than that. The cooperative effort would effectively mean that humanity became a single gigantic organism -- with a brain to match.
You don’t find any of Wells’s meritocracy at work in Wikipedia. There is no benchmark for quality. It is an intellectual equivalent of the Wild West, without the cows or the gold.
And yet, strangely enough, you find imagery very similar to that of Wells’s “world brain” emerging in some of the more enthusiastic claims for Wikipedia. As the computer scientist Jaron Lanier noted in a recent essay, there is now an emergent sensibility he calls “a new online collectivism” – one for which “something like a distinct kin to human consciousness is either about to appear any minute, or has already appeared.” (Lanier offers a sharp criticism of this outlook. See also the thoughtful responses to his essay assembled by John Brockman.)
From the “online collectivist’ perspective, the failings of any given Wikipedia entry are insignificant. “A core belief in the wiki world,” writes Lanier, “is that whatever problems exist in the wiki will be incrementally corrected as the process unfolds.”
The problem being, of course, that it does not always work out that way. In 2004, Robert McHenry, the former editor-in-chief of the Encyclopedia Britannica, pointed out that, even after 150 edits, the Wikipedia entry on Alexander Hamilton would earn a high school student a C at best.
“The earlier versions of the article,” he noted, “are better written over all, with fewer murky passages and sophomoric summaries.... The article has, in fact, been edited into mediocrity.”
It is not simply proof of the old adage that too many cooks will spoil the broth. “However closely a Wikipedia article may at some point in its life attain to reliability,” as McHenry puts it, “it is forever open to the uninformed or semiliterate meddler.”
The advantage of Wikipedia’s extreme openness is that people are able to produce fantastically thorough entries on topics far off the beaten path. The wiki format creates the necessary conditions for nerd utopia. As a fan of the new “reimagined” "Battlestar Galactica," I cannot overstate my awe at the fan-generated Web site devoted to the show. Participants have created a sort of mini-encyclopedia covering all aspects of the program, with a degree of thoroughness and attention to accuracy matched by few entries at Wikipedia proper.
At the same time, Wikipedia is not necessarily less reliable than more prestigious reference works. A study appearing in the journal Nature found that Wikipedia entries on scientific topics were about as accurate as corresponding articles in the Encyclopedia Britannica.
And in any case, the preparation of reference works often resembles a sausage factory more than it does a research facility. As the British writer Joseph McCabe pointed out more than 50 years ago in a critique of the Columbia Encyclopedia, the usual procedure is less meritocratic than one might suppose. “A number of real experts are paid handsomely to write and sign lengthy articles on subjects of which they are masters,” noted McCabe, “and the bulk of the work is copied from earlier encyclopedias by a large number of penny-a-liners.”
Nobody writing for Wikipedia is “paid handsomely,” of course. For that matter, nobody is making a penny a line. The problems with it are admitted even by fans like David Shariatmadari, whose recent article on Wikipedia ended with an appeal to potential encyclopedists “to get your ideas together, get registered, and contribute.”
Well, okay ... maybe. I’ll think about it at least. There’s still something appealing about Wells’s vision of bringing people together “into a more and more conscious co-operating unity and a growing sense of their own dignity” – through a “common medium of expression” capable of “informing without pressure or propaganda, directing without tyranny.”
If only we could do this without all the semi-mystical globaloney (then and now) about the World Brain. It would also be encouraging if there were a way around certain problems -- if, say, one could be sure that different dates wouldn’t be given for the year that Alexander Hamilton ended his term as Secretary of the Treasury.
The Wikipedia principle is digital Maoism, claims Jaron Lanier in Edge. In L'Express, Eric Hobsbawm and Jacques Attali celebrate Karl Marx as a thinker of globalisation. Segolene Royal presumably sees things somewhat differently, we learn from the Weltwoche. The Economist doesn't trust any robot. The New York Review of Books sees the opium industry in Afghanistan growing and thriving. The Spectator reports from Darfur. DU devotes itself to the people of the "Kritische Wälder". In Le Point, Bernard-Henri Levy celebrates Angela Merkel as living proof of the continuing relevance of Simone de Beauvoir's work.
Edge.org | L`Express | The Economist | Die Weltwoche | The New York Review of Books | The Spectator | Il Foglio | Nepszabadsag | Folio | DU | Le point | Elsevier | The New York Times Book Review
Edge.org, 30.05.2006
The best essays about the disconcerting media revolution called the Internet still come from the USA. A few weeks ago in the New York Times Magazine, Kevin Kelly (more here) set out the euphoric vision of a collective and infinite book created by the Internet. Almost simultaneously, and without responding to Kelly directly, Jaron Lanier (more here) strikes a sharp counter-note, criticising a collective spirit, fanned by projects such as Wikipedia, which believes that a world-spirit will aggregate on the net all by itself, without responsible authors. Lanier speaks of a "new online collectivism", a resurgence of the idea that the collective is all-wise: "This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous." Lanier does not believe in abolishing authorship: "The beauty of the Internet is that it connects people. The value is in the other people. If we start to believe that the Internet itself is an entity that has something to say, we're devaluing those people and making ourselves into idiots."
Lanier's essay is the subject of intensive debate at edge.org. Among those responding is Kevin Kelly.
Edge.org |The Spectator | Il Foglio| Nepszabadsag | DU | The Economist | L'Express | Die Weltwoche | Folio | Le point | The New York Review of Books
Edge.org, 30.05.2006 (USA)
The best essays about the disconcerting media revolution known as the Internet continue to come from the USA. A fortnight ago in the New York Times Magazine, Kevin Kelly (more here) set out his euphoric vision of the Internet-based collective and the universal book. Almost immediately, although without direct reference to Kelly, Jaron Lanier (more here) penned an acerbic counter argument, criticising the collective spirit kindled by projects such as Wikipedia which believes a collective intelligence will aggregate by itself on the net without responsible authors. Lanier talks of a "new online collectivism" and the "resurgence of the idea that the collective is all-wise". "This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous." Lanier does not believe in erasing authorship: "The beauty of the Internet is that it connects people. The value is in the other people. If we start to believe that the Internet itself is an entity that has something to say, we're devaluing those people and making ourselves into idiots."
Lanier's essay provoked many people to enter into the debate at edge.org, Kevin Kelly among them.
The Spectator, 12.06.2006 (UK)
Peter Oborne reports from Darfur: "When we visited the scene of the battle we found that bodies had been shoved hastily into mass graves. An arm stuck out from under one bush, and the flesh had been eaten by wild animals. A human foot obtruded from another grave. Dried pools of blood stained the ground. The stench of human putrefaction was heavy in the air. Bits and pieces of clothing, spent bullets and the protective amulets used by African fighters lay scattered on the ground. One body still lay exposed. The dead man had evidently climbed a tree to escape his attackers, but been shot down from his hiding place."
Il Foglio, 10.06.2006 (Italy)
The Golf GTI was, sociologically speaking at least, the forerunner of the now controversial SUV, writes Maurizio Crippa, and also the perfect symbol of the 80s. "If cars have a spirit, then it is certainly an evil one, demonic. The enemy is inside them, a man like in Stephen King's 'Christine' of 1983. Christine might have been a Plymouth Fury of 1958, but its cursed spirit uncovered the ghastly depths of the GT decade and all the souped-up, turbo-boosted and drilled-out engines. That all came to an end in 1989, famously the year of salvation. The Golf, in particular the GTI, the black one - and we are not talking about the one with rabbit's foot in the back – was aggressive, demanding, loud."
Nepszabadsag, 10.06.2006 (Hungary)
After substantial renovation, the legendary New York coffee house, one of the most important literary coffee houses of the Danube monarchy, has reopened. The writer Ivan Bächer recollects: "Once upon a time, not only the coffee house but the whole palace, even every room, every corner, every nook and cranny of the entire block of the surrounding houses was full of journalists, writers, publishing houses and editorial offices." The new Italian owners have redeveloped the literary spirit to death, Bächer states disappointedly: "On the wall is a box of reinforced glass in which a dozen beautiful old books are hermetically sealed. A book safe. At the opening celebrations in 1895 the playwright Ferenc Molnar threw the keys to the coffee house into the Danube so that the splendid institution could never be closed again. After the reopening, perhaps someone should take the precaution of throwing the keys to the reinforced glass box into the Danube to prevent anyone from entertaining the idea of ever opening a book in these rooms." (Here and here photos of the coffee house in its heyday; here, here and here after the renovation.)
DU magazine focusses on Germany for the World Cup and has its correspondents report from every corner of the Bundesrepublik. As usual only a very small selection is available online, but Albrecht Tübke's photographic portraits which accompany the pieces of writing can be viewed here.
The lengthy discursive essays are less illuminating than the small atmospheric pieces such as the one by Svenja Leibe on the village where she grew up. "Drive off the motorway, on and on through the scattered settlements, none of which you will find surprising. Drive through them, but do not hope to see anything through the panorama windows of the bungalows, drive on down the curvy streets, past the pig farms, past the silver bunting of the car show rooms. Follow the neon coloured invitations to 'foam parties and barn raves'. Look out for people, you won't see many of them. Don't think the red lantern in front of the family house is a forgotten Christmas decoration. Drive. Drive down the pretty hill, on past the hidden building sites in the garden of the old pheasantry, down to the 'tank resistant' bridge that stoutly spans a tiny stream. The road runs directly into the heart of the village and to a little house behind a metre-long curve sign where it turns very sharply to the left. Don't look out of the window with too much interest here, you will only make them suspicious. There is nothing to buy any more. Leave them in peace. Let them file away at their gardens, take that seriously."
The Economist, 09.06.2006 (UK)
Inspired by Isaac Asimov's futuristic vision "I, Robot", The Economist asks in its Technology Quarterly how secure our future will be among robots. Do Asimov's three laws for the protection of humans hold today? "Regulating the behaviour of robots is going to become more difficult in the future, since they will increasingly have self-learning mechanisms built into them, says Gianmarco Veruggio, a roboticist at the Institute of Intelligent Systems for Automation in Genoa, Italy. As a result, their behaviour will become impossible to predict fully, he says, since they will not be behaving in predefined ways but will learn new behaviour as they go."
Other articles dealing with new fuel cells, artificial neural networks in car motors and the victory march of Bluetooth (wireless personal area networks) are unfortunately not online. Not in the magazine but also topical here is Robocup, the world robot football championships taking place this week in Bremen.
L'Express, 09.06.2006 (France)
Does globalisation make Karl Marx a "pioneer of modern thinking"? The question is tossed around in this issue by two individuals who are convinced the answer is yes: English historian Eric Hobsbawm and Jacques Attali, economist and former advisor to Francois Mitterrand, whose book "Karl Marx ou l'esprit du monde" was published last year. Hobsbawm finds a renewed interest in Marx entirely natural: "Today we are seeing the globalised economy that Marx anticipated. Still, he didn't foresee all of its repercussions. For example, the Marxist prophesy whereby an increasingly numerous proletariat topples capitalism in the industrial countries did not come about." Attali comments: "The Socialist International was a remarkable attempt on Marx's part to think the world in its entirety. Marx is an extraordinarily modern thinker, because rather than sketching the outlines of a socialist state, his writings describe the capitalism of the future."
Die Weltwoche, 08.06.2006 (Switzerland)
Daniel Binswanger portrays Segolene Royal, the promising presidential candidate whose conservative views are pushing French socialists into an identity crisis. "Re-education camps for criminal youths controlled by the army, state paternalism of parents with authority problems, cutbacks in funding for people with delinquent children: for the last week people in France have been discussing a whole catalogue of measures aimed at coming to grips with youth violence in the banlieues. But for once the debate has not been set off by the hyperactive Minister of the Interior Nicolas Sarkozy. The French are rubbing their eyes in disbelief: as if in a political mirage, the discourse on law-and-order has changed camps."
Folio, 06.06.2006 (Switzerland)
What's become of lunch? A sandwich gulped down while you're walking. Folio presents this rule and exceptions to it.
Stephan Israel visits Michel Addons, cook for the Italian EU Commission: "Today there's lobster tails on spring rolls with ginger and oyster sauce. For the main course there's veal sweetbreads with new potatoes and green asparagus from Provence. For dessert there's strawberries on creme brulee. Today is the yearly visit of the much-feared auditors from Luxembourg."
Italian author Andrea Camilleri commiserates with those who have to swallow down a hamburger on the street, reminiscing about how his grandmother used to cook at noon. "As primo there was mostly pasta, as a gratin or with meat sauce, sometimes there was also melanzane alla parmigiana. As secondo there was poultry, lamb or fish, then cheese and sausages. Of course a lunch like that took its time. No one went back to work before four in the afternoon."
In his "Duftnote" column on fragrances, Luca Turin voices his amazement at a new summer perfume: "This is so wretched that it almost sets new standards in the matter." (Here the English version.)
Bernard-Henri Levy is up in arms that no one in France has said a word about Simone de Beauvoir, who died 20 years ago. In his "notebook" column, he pays homage to seven women, all of whom are "proof of the timelessness of de Beauvoir's tremendous work": Hillary Clinton, Condoleezza Rice, Chilean president Michelle Bachelet, French politician Segolene Royal, women's rights activist Fadela Amara, Burmese Nobel Peace Prize laureate Aung San Suu Kyi and – German chancellor Angela Merkel. Levy writes: "Angela Merkel, 'that woman' as Gerhard Schröder, Putinist and world record holder in matters of corruption under a democracy, called her; that 'girl' who peeved him no end at the time of his election defeat... She, the specialist in quantum physics (elementary particles are not Michel Houellebecq's terrain, but hers), enjoys a popularity that has her predecessor, and all of Europe's heads of government, green with envy. And on top of that she's rehabilitating the finances of an economy that thanks to her is once more becoming what it always has been and should definitely be once more: the moving force in the European equation."
The New York Review of Books, 22.06.2006 (USA)
Five years after the American victory over the Taliban, Ahmed Rashid sees Afghanistan once more on the verge of collapse: "A revived Taliban movement has made a third of the country ungovernable. Together with al-Qaeda, Taliban leaders are trying to carve out new bases on the Afghanistan–Pakistan border. They are aided by Afghanistan's resurgent opium industry, which has contributed to widespread corruption and lawlessness, particularly in the south. The country's huge crop of poppies is processed into opium and refined into heroin for export, now accounting for close to 90 percent of the global market."
Further articles: Alan Ryan presents three books in which renowned philosophers – Kwame Anthony Appiah, Amartya Sen and Martha Nussbaum – address concepts of cultural diversity and cosmopolitanism. Freeman J. Dyson reviews Daniel C. Dennett's philosophical treatise on religion, "Breaking the Spell", in which Dennett pinpoints the real problem as "belief in belief": "He finds evidence that large numbers of people who identify themselves as religious believers do not in fact believe the doctrines of their religions but only believe in belief as a desirable goal."