...But pride has always been haunted by the fear that public acknowledgment of Jewish achievement could fuel the perception of "Jewish domination" of institutions. And any characterization of Jews in biological terms smacks of Nazi pseudoscience about "the Jewish race." A team of scientists from the University of Utah recently strode into this minefield with their article "Natural History of Ashkenazi Intelligence," which was published online in the Journal of Biosocial Science a year ago and was soon publicized in The New York Times, The Economist, and on the cover of New York magazine.
The Utah researchers Gregory Cochran, Jason Hardy, and Henry Harpending (henceforth CH&H) proposed that Ashkenazi Jews have a genetic advantage in intelligence, and that the advantage arose from natural selection for success in middleman occupations (moneylending, selling, and estate management) during the first millennium of their existence in northern Europe, from about 800 C.E. to 1600 C.E. Since rapid selection of a single trait often brings along deleterious byproducts, this evolutionary history also bequeathed the genetic diseases known to be common among Ashkenazim, such as Tay-Sachs and Gaucher's.
The CH&H study quickly became a target of harsh denunciation and morbid fascination. It raises two questions. How good is the evidence for this audacious hypothesis? And what, if any, are the political and moral implications?
ID, or "intelligent design," is a movement that has been in the news recently for its alternative views about evolution. ID proponents allege that science shouldn't be limited to naturalism, and shouldn't demand the adoption of a naturalistic philosophy that dismisses out of hand any explanation containing a supernatural cause, explains an entry for the phrase in Wikipedia.
ID has been the focus of lawsuits, with controversy revolving around issues such as whether ID can be defined as science, and taught in schools. According to critics, ID is neither observable nor repeatable, thus violating "the scientific requirement of falsifiability."
Pitting science against the ID movement, John Brockman has edited Intelligent Thought, from Vintage (www.vintagebooks.com). The collection of 16 essays by experts begins with Jerry A. Coyne's piece about the evidence of evolution buried in our DNA.
"Our genome is a veritable farrago of non-functional DNA, including many inactive `pseudogenes' that were functional in our ancestors," he notes. "Why do humans, unlike most mammals, require vitamin C in their diet? Because primates cannot synthesise this essential nutrient from simpler chemicals."
It seems we still carry all the genes for synthesising vitamin C, though the gene used for the last step in this pathway "was inactivated by mutations 40 million years ago, probably because it was unnecessary in fruit-eating primates."
Tim D. White's piece takes one through volcanic rock samples "fingerprinted at the Los Alamos National Laboratory," and fossils aged millions of years. "Today, evolution is the bedrock of biology, from medicine to molecules, from AIDS to zebras," declares White.
"Biologists can't afford to ignore the interconnectedness of living things, much as politicians can't understand people, institutions or countries without understanding their histories.
"Intelligent aliens" is the focus of Richard Dawkins. How would we recognise intelligence in a pattern of radio waves picked up by a giant parabolic dish, and say it is from deep space and not a hoax, asks Dawkins.
The universe can perform approximately 10^105 elementary operations per second on about 10^90 bits, writes Seth Lloyd in a chapter titled "How smart is the universe?" One learns that over the 13.8 billion years since the Big Bang, the universe has performed about 10^122 operations.
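Lloyd's two figures are consistent with each other, as a quick back-of-envelope check shows (a sketch only; the variable names are mine, and the year length is rounded):

```python
import math

# Multiply Lloyd's rate (~10^105 elementary operations per second) by the
# age of the universe in seconds and compare with his running total.
SECONDS_PER_YEAR = 3.156e7                      # ~365.25 days
age_of_universe_s = 13.8e9 * SECONDS_PER_YEAR   # ~4.4e17 seconds
total_ops = 1e105 * age_of_universe_s           # ops/s x seconds elapsed

print(f"~10^{math.log10(total_ops):.1f} operations since the Big Bang")
```

The product comes out at a few times 10^122, the same order of magnitude as the total Lloyd cites.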
He looks closely at how the universe processes information and states that atoms register bits the same way the magnetic bits in a computer's hard drive do. With magnets flipping directions and changing bit values, "every atom and elementary particle in the universe registers and processes information."
Most bits are humble, explains Lloyd. "But some bits lead more interesting lives. Every time a neuron fires in your brain, for example, it lets loose a torrent of bits. The cascade of bits in neural signals is the information processing that underlies your thoughts." To him, "Sex is a glorious burst of information processing designed to pass on and transform" the billions of bits of genetic information locked in the nuclei of the cells. "The more microscopic the form of information processing, the longer it has been going on."
Worth a read for the brave defence of science it puts up.
"What's the moral of the Gates story?"
"That we should do charity?"
"No. You should first gross a few billions."
EVER SINCE musician, writer, and technological visionary Jaron Lanier coined the term "virtual reality" in the early 1980s, and headed up efforts to implement the idea, he's been a member of the digerati in excellent standing. But he's an anxious member, known to raise alarms about just those big ideas and grand ambitions of the computer revolution that happen to excite the most enthusiasm among his peers. That was the case with his contrarian essay, "One Half of a Manifesto," in 2000. He's done it again in a new piece, "Digital Maoism," which has roiled the Internet since it was posted at edge.org on May 30.
In "One Half of a Manifesto," Lanier attacked what he dubbed "cybernetic totalism," an overweening intellectual synthesis in which mind, brain, life itself, and the entire physical universe are viewed as machines of a kind, controlled by processes not unlike those driving a computer. This digital-age "dogma," he argued, got a boost from the era's new and "overwhelmingly powerful technologies," which also obscured the dangers inherent in totalist thinking. People who would steer clear of Marxism, for example, might fall for an even more grandiose world view if it had digital cachet.
The former options trader, now 39 and living in St. Petersburg, Florida, founded the freely accessible Internet encyclopedia Wikipedia in 2001 with his private fortune, out of fascination with the free-software movement; anyone can help write it as a "Wikipedian."

In 2004 Wales also founded the commercial company Wikia, which runs an ad-financed hosting service and several commercial community services. The name Wikipedia comes from the Hawaiian "wiki wiki" and translates roughly as "quick encyclopedia." The project is operated by the international non-profit Wikimedia Foundation, chaired by Jimmy Wales, and is continuously developed by volunteer authors, organizers, and software specialists around the world. The foundation's goal is to make the knowledge of humanity accessible to everyone in the world through Wikipedia and further projects. A German chapter of Wikimedia has existed since 2004.

Wikipedia currently has more than four million entries in around 200 languages. The German-language edition (http://de.wikipedia.org), with more than 400,000 entries, is the second-largest after the English-language one, and the most heavily used per capita of the German-speaking population.

Last December, a scholarly study by the British journal Nature certified that Wikipedia, with an average of four errors per science article, plays in the same league as the renowned Encyclopaedia Britannica (three errors per article). In both encyclopedias, errors were the rule rather than the exception.

Almost simultaneously, Wikipedia came under criticism for its open standard, which makes it possible to falsify entries. Most recently, a debate about the limits of collectivism on the Internet unfolded around Wikipedia in the web forum www.edge.org.

Jimmy Wales spoke on June 21 in Königswinter as guest speaker at the 5th Petersberger Forum, on the topic of "power," at the invitation of the Verlag für die Deutsche Wirtschaft.
It's fairly safe to say that most Canadians couldn't tell a wormhole from a doughnut hole, nor explain the basic mechanics of global warming, nor distinguish between Fermat and Fibonacci.
It's all too easy to put this down to simple fear of science, but that doesn't excuse us from attempting to understand at least some of what is the best existing explanation -- pace various fundamentalisms -- for the workings of the universe and its contents. Of course, science has its enemies -- not just among the hyper-religious, but also many postmodernists, who see it as simply one among a competing array of equally valid master narratives. But at least since Aristotle, mankind has been consumed by a desire to understand the universe and our place in it. So why should Globe Books be any different? Our commitment to reviewing science books is part curiosity, part missionary. But we don't get to nearly as many as we'd like, so I offer a breathless roster of new titles well worth your consideration.
What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Uncertainty. More than 100 minds, some doubtless great (including Ian McEwan, Robert Sapolsky, Steven Pinker, Jared Diamond and Rebecca Goldstein), ponder the question: What do you believe to be true even though you cannot prove it? For me, the answer is Sherlock Holmes, case proved in . . .
Copernicus' dangerous idea, rejected by the Catholic Church, had seven parts: 1) there is no one center in the universe; 2) the Earth's center is not the center of the universe; 3) the center of the universe is near the sun; 4) the distance from the Earth to the sun is imperceptible compared with the distance to the stars; 5) the rotation of the Earth accounts for the apparent daily rotation of the stars; 6) the apparent annual cycle of movements of the sun is caused by the Earth revolving around the sun; and 7) the apparent retrograde motion of the planets is caused by the motion of the Earth, from which one observes. ...
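Point 7 - that retrograde loops are an artifact of observing from a moving Earth - can be reproduced with a toy two-planet simulation. This is only a sketch assuming circular, coplanar orbits; the function name and the Mars orbit constants are mine, not from the text:

```python
import math

def apparent_direction(t_years):
    """Direction (radians) from Earth to Mars against the stars.
    Earth: 1.0 AU, 1-year period. Mars: 1.52 AU, 1.88-year period."""
    ex, ey = math.cos(2 * math.pi * t_years), math.sin(2 * math.pi * t_years)
    mx = 1.52 * math.cos(2 * math.pi * t_years / 1.88)
    my = 1.52 * math.sin(2 * math.pi * t_years / 1.88)
    return math.atan2(my - ey, mx - ex)

# Sample the apparent direction over two years and track its drift.
angles = [apparent_direction(i / 100) for i in range(201)]
drifts = [(a1 - a0 + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
          for a0, a1 in zip(angles, angles[1:])]

# Mars mostly drifts forward (prograde), but near opposition the faster
# Earth overtakes it and the drift briefly reverses sign (retrograde) --
# no epicycles needed, just two circular orbits seen from one of them.
prograde = any(d > 0 for d in drifts)
retrograde = any(d < 0 for d in drifts)
```

Both drift directions appear in the sample, which is exactly Copernicus's point: the loop lives in the observer's motion, not in the planet's.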
In January, GMLc mentioned Edge Foundation Inc., an online group of scholars and scientists, and its annual Big Question. Answers to last year's Big Question, 'What do you believe but cannot prove?' have been published this year in book form. ... The 2006 question was 'What is your dangerous idea?'
Intelligent designer? No: we have a bungling consistent evolver. Or maybe an adaptive changer. Rather an odd chap, that God...
Science journalism is a demanding profession, and the list of its great practitioners is not long. Even shorter, however, is the list of professional scientists who write engaging and accessible prose - who write, in short, excellent popular science. The literary agent for a large subset of that group is John Brockman, himself an author as well as literary entrepreneur. In "Intelligent Thought" (Vintage, 272 pages, $14), he has assembled a set of 16 essays, each responding to the current, anti-evolution Intelligent Design Movement (IDM), and the authors include some of the best-known science writers.
The war (it must be so named) between science and the fundamentalist faith-driven IDM is of deeply troubling import for science education, and for science itself - thus inevitably for contemporary culture. How serious the implications are has only recently been recognized, probably too late for a reasonable cessation of hostilities. The wake-up call seems to have been national coverage, in all the media, of the "Dover" trial, which ended in December 2005. In it, the plaintiffs - parents and teachers in the Dover, Penn., school district - sought relief from an action of the district's Board of Education, which had in effect mandated the addition of Intelligent Design Theory (so-called) to the public school biology curriculum and classrooms. Presiding over the lengthy trial was U.S. District Judge John E. Jones III. An extract from his painstaking and scholarly opinion is an appendix to this book. It is perhaps its most immediately valuable contribution. What are these often eloquent essays about, are they needed, and are they helpful?
The contributors represent a broad range of scientific disciplines. Richard Dawkins, for example, is a noted evolutionary biologist, as are Jerry Coyne and Neil Shubin. Leonard Susskind is a theoretical physicist; so is Lee Smolin. Greatly respected are philosopher-cognitive scientist Daniel Dennett; paleontologists Tim White and Scott Sampson; psychologists Steven Pinker, Nicholas Humphrey, and Marc Hauser; physicists Seth Lloyd and Lisa Randall; mathematical biologist Stuart Kauffman; anthropologist Scott Atran, and historian of science and behaviorist Frank Sulloway.
In the opening essay, "Intelligent Design: The Faith That Dare Not Speak Its Name," Mr. Coyne sets forth the argument that the IDM is motivated by religion and is, rather than serious scholarship, a faith-based attack on the architecture and trustworthiness of natural science. This is a strong but by now routine presentation of the case, and Mr. Coyne's expert treatments of it have appeared elsewhere, for example in the New Republic. The prolific Mr. Dennett writes on "The Hoax of Intelligent Design and How It Was Perpetrated." Hoax is a belligerent word, but the argument supporting it is solid.
Mr. Dennett's essay is not a paper-trail of the IDM: There is no such thing in this book - a significant lack. But a rich paper trail certainly exists. The IDM's history - with documentation - was presented in Harrisburg, Penn., by plaintiffs' witness Barbara Forrest. It was eye-opening and central to the Dover outcome. In the trial, the IDM's attempt on the science curriculum was ruled unconstitutional. Mr. Dennett's contribution is a sharp exposé of the IDM's logical and epistemological blunders.
Mr. Humphrey, examining the certainty that consciousness itself is a product of evolution, explains why it must be that, and presents a delicious paradox of consciousness research: An evolving consciousness among higher animals must have produced the insistent denial in us - conscious animals - that consciousness has evolved. Mr. White offers a short but authoritative review of hominid paleontology. We have today an embarrassment of riches in what were once called "missing links": our own, non-human ancestors, as well as those of many other contemporary vertebrates. There is no longer any question that our species had ancestors.
Mr. Dawkins dissects with eloquence the illusion of intelligent-agent design in natural objects. Mr. Sulloway's contribution is a short but incisive account of Darwin's initial failure to understand what he saw and collected in the Galapagos, and his subsequent epiphany on the meaning of those observations for "the species question," that is, for belief in the immutability of the biblical "kinds."
Steven Pinker addresses the common fear underlying most forms of resistance to evolution. It gives rise to the ancient claim that without revealed religion and its key principle - that humankind is of special concern to and under continuous observation by a powerful God - the moral order would collapse; we would succumb to a destructive anarchy. But the evidence is clear that all humans possess a moral sense independently of the details of their religion, if any, and that religion in us is a plausible, indeed an inevitable, consequence of evolutionary history.
This volume has other pleasures, including Lee Smolin on several forms of the Anthropic Principle and the relevance thereto of recent cosmology requiring a multiverse, rather than "the universe"; Stuart Kauffman, whose mathematics of self-organization is often misunderstood as a denial of Darwinism, clarifies in his essay the position in no uncertain terms; Lisa Randall offers a theoretical physicist's view of the facts of evolution and the "theory" of intelligent design, from which she derives the conclusion that
Whoever is responsible [for the history of life] is just trying out various possibilities. We don't have an intelligent designer (ID), we have a bungling consistent evolver (BCE). Or maybe an adaptive changer (AC). In fact, what we have in the most economical interpretation is, of course, evolution.
This collection is helpful but not because it provides the primary knowledge base for the current effort to limit the impact of the IDM - a politically potent hoax with an excellent public relations machine and adequate funding. The necessary primary sources on the IDM and on the relevant science are already available in excellent recent books and in a rising stream of papers in the relevant scientific literature and on the Internet. Nothing coming from these reliable scientific sources constitutes or implies the existence of a "conflict" of "theories."
There is no scientific conflict. ID is not a theory in the ordinary sense of science, and it is certainly not a reputable "alternate view" of the planet's life. It has no unique content other than its claim for the existence of a designer. It is not worthy of the time it would take away from real science in the schools, where the time is already far too short. It is in fact the denial of theory, supported only by unsupported claims of flaws in Darwinism. No positive scientific evidence has ever been offered for ID.
We need this book because its authors have name recognition with the general reading public, because they write well, and because the fight will not end any time soon. Humanity needs to come to grips, sooner rather than later, with its biological meanings, and with the values and anti-values of its religious belief systems. The fight is just beginning. If the real values of religion and spirituality, which include humility before the wonders of nature, are to survive our rising tastes for religious war and destruction, then more than just an elite among us must understand science - and what it yields as description of physical reality through deep time. The more often the small faction of us who read can pause to browse engaging books like "Intelligent Thought," the better is the chance that we can stop the impetus of Homo sapiens toward self-destruction.
Mr. Gross last wrote for these pages about Charles Darwin.
prostoalex writes: The New York Times Technology section this weekend is running an extensive article on Wikipedia and recent changes to the editorial policy. ...
by ryrivard First, it wasn't just the "technology" section, it was on the front page of the National Edition.
Second, Wikipedia is damned in both directions by the media: They are either too open, and so all sorts of loonies can post whatever they want. Or, when they close up a bit, they are abandoning their own principles.
Anyone who hasn't read it needs to read DIGITAL MAOISM: The Hazards of the New Online Collectivism by Jaron Lanier [edge.org] and the spirited reply [edge.org]...
(Translation and Introduction by Andrian Krey):
In the early '90s, computer scientist and musician Jaron Lanier was one of the first visionaries of a digital culture. He taught computer science at universities including Columbia, Yale and NYU. At the end of the '90s he led the work on the academic Internet 2. As a musician he has worked with people like Philip Glass, Ornette Coleman and George Clinton. Jaron Lanier wrote the following essay, 'Digital Maoism,' for the series 'Original Edge Essays' of the online forum of the same name (www.edge.org), where the text launched a heated debate about the cultural qualities of the Internet, with the participation of Wikipedia founders Larry Sanger and Jimmy Wales, computer expert Esther Dyson and media thinker Douglas Rushkoff.
To wiki or not to wiki? That is the question.
Whether ‘tis nobler to plunge in and write a few Wikipedia entries on subjects regarding which one has some expertise; and also, p'raps, to revise some of the weaker articles already available there...
Or rather, taking arms against a sea of mediocrity, to mock the whole concept of an open-source, online encyclopedia -- that bastard spawn of “American Idol” and a sixth grader’s report copied word-for-word from the World Book....
Hamlet, of course, was nothing if not ambivalent -- and my attitude towards how to deal with Wikipedia is comparably indecisive. Six years into its existence, there are now something in the neighborhood of 2 million entries, in various languages, ranging in length from one sentence to thousands of words.
They are prepared and edited by an ad hoc community of contributors. There is no definitive iteration of a Wikipedia article: It can be added to, revised, or completely rewritten by anyone who cares to take the time.
Strictly speaking, not all wiki pages are Wikipedia entries. As this useful item explains, a wiki is a generic term applying to a Web page format that is more or less open to interaction and revision. In some cases, access to the page is limited to the members of a wiki community. With Wikipedia, only a very modest level of control is exercised by administrators. The result is a wiki-based reference tool that is open to writers putting forward truth, falsehood, and all the shades of gray in between.
In other words, each entry is just as trustworthy as whoever last worked on it. And because items are unsigned, the very notion of accountability is digitized out of existence.
Yet Wikipedia now seems even more unavoidable than it is unreliable. Do a search for any given subject, and chances are good that one or more Wikipedia articles will be among the top results you get back.
Nor is use of Wikipedia limited to people who lack other information resources. My own experience is probably more common than anyone would care to admit. I have a personal library of several thousand volumes (including a range of both generalist and specialist reference books) and live in a city that is home to at least three universities with open-stack collections. And that’s not counting access to the Library of Congress.
The expression “data out the wazoo” may apply. Still, rare is the week when I don’t glance over at least half a dozen articles from Wikipedia. (As someone once said about the comic strip “Nancy,” reading it usually takes less time than deciding not to do so.)
Basic cognitive literacy includes the ability to evaluate the strengths and the limitations of any source of information. Wikipedia is usually worth consulting simply for the references at the end of an article -- often with links to other online resources. Wikipedia is by no means a definitive reference work, but it’s not necessarily the worst place to start.
Not that everyone uses it that way, of course. Consider a recent discussion between a reference librarian and a staff member working for an important policy-making arm of the U.S. government. The librarian asked what information sources the staffer relied on most often for her work. Without hesitation, she answered: “Google and Wikipedia.” In fact, she seldom used anything else.
Coming from a junior-high student, this would be disappointing. From someone in a position of power, it is well beyond worrisome. But what is there to do about it? Apart, that is, from indulging in Menckenesque ruminations about the mule-like stupidity of the American booboisie?
Sure, we want our students, readers, and fellow citizens to become more astute in their use of the available tools for learning about the world. (Hope springs eternal!) But what is to be done in the meantime?
Given the situation at hand, what is the responsibility of people who do have some level of competence? Is there some obligation to prepare adequate Wikipedia entries?
Or is that a waste of time and effort? If so, what’s the alternative? Or is there one? Luddism is sometimes a temptation – but, as solutions go, not so practical.
I throw these questions out without having yet formulated a cohesive (let alone cogent) answer to any of them. At one level, it is a matter for personal judgment. An economic matter, even. You have to decide whether improving this one element of public life is a good use of your resources.
At the same time, it’s worth keeping in mind that Wikipedia is not just one more new gizmo arriving on the scene. It is not just another way to shrink the American attention span that much closer to the duration of a subatomic particle. How you relate to it (whether you chip in, or rail against it) is even, arguably, a matter of long-term historical consequence. For in a way, Wikipedia is now 70 years old.
It was in 1936 that H.G. Wells, during a lecture in London, began presenting the case for what he called a “world encyclopedia” – an international project to synthesize and make readily available the latest scientific and scholarly work in all fields. Copies would be made available all over the planet. To keep pace with the constant growth of knowledge, it would be revised and updated constantly. (An essay on the same theme that Wells published the following year is available online.)
A project on this scale would be too vast for publication in the old-fashioned format of the printed book. Besides, whole sections of the work would be rewritten frequently. And so Wells came up with an elegant solution. The world encyclopedia would be published and distributed using a technological development little-known to his readers: microfilm.
Okay, so there was that slight gap between the Wellsian conception and the Wikipedian consummation. But the ambition is quite similar -- the creation of “the largest encyclopedia in history, both in terms of breadth and depth” (as the FAQ describes Wikipedia’s goal).
Yet there are differences that go beyond the delivery system. Wells believed in expertise. He had a firm faith in the value of exact knowledge, and saw an important role for the highly educated in creating the future. Indeed, that is something of an understatement: Wells had a penchant for creating utopian scenarios in which the best and the brightest organized themselves to take the reins of progress and guide human evolution to a new level.
Sometimes that vision took more or less salutary forms. After the First World War, he coined a once-famous saying that our future was a race between education and disaster. In other moods, he was prone to imagining the benefits of quasi-dictatorial rule by the gifted. What makes Wells a fascinating writer, rather than just a somewhat scary one, is that he also had a streak of fierce pessimism about whether his projections would work out. His final book, published a few months before his death in 1946, was a depressing little volume called The Mind at the End of Its Tether, which was a study in pure worry.
The title Wells gave to his encyclopedia project is revealing: when he pulled his various essays on the topic together into a book, he called it World Brain. The researchers and writers he imagined pooling their resources would be the faculty of a kind of super-university, with the globe as its campus. But it would do even more than that. The cooperative effort would effectively mean that humanity became a single gigantic organism -- with a brain to match.
You don’t find any of Wells’s meritocracy at work in Wikipedia. There is no benchmark for quality. It is an intellectual equivalent of the Wild West, without the cows or the gold.
And yet, strangely enough, you find imagery very similar to that of Wells’s “world brain” emerging in some of the more enthusiastic claims for Wikipedia. As the computer scientist Jaron Lanier noted in a recent essay, there is now an emergent sensibility he calls “a new online collectivism” – one for which “something like a distinct kin to human consciousness is either about to appear any minute, or has already appeared.” (Lanier offers a sharp criticism of this outlook. See also the thoughtful responses to his essay assembled by John Brockman.)
From the “online collectivist” perspective, the failings of any given Wikipedia entry are insignificant. “A core belief in the wiki world,” writes Lanier, “is that whatever problems exist in the wiki will be incrementally corrected as the process unfolds.”
The problem being, of course, that it does not always work out that way. In 2004, Robert McHenry, the former editor-in-chief of the Encyclopedia Britannica, pointed out that, even after 150 edits, the Wikipedia entry on Alexander Hamilton would earn a high school student a C at best.
“The earlier versions of the article,” he noted, “are better written over all, with fewer murky passages and sophomoric summaries.... The article has, in fact, been edited into mediocrity.”
It is not simply proof of the old adage that too many cooks will spoil the broth. “However closely a Wikipedia article may at some point in its life attain to reliability,” as McHenry puts it, “it is forever open to the uninformed or semiliterate meddler.”
The advantage of Wikipedia’s extreme openness is that people are able to produce fantastically thorough entries on topics far off the beaten path. The wiki format creates the necessary conditions for nerd utopia. As a fan of the new “reimagined” "Battlestar Galactica," I cannot overstate my awe at the fan-generated Web site devoted to the show. Participants have created a sort of mini-encyclopedia covering all aspects of the program, with a degree of thoroughness and attention to accuracy matched by few entries at Wikipedia proper.
At the same time, Wikipedia is not necessarily less reliable than more prestigious reference works. A study appearing in the journal Nature found that Wikipedia entries on scientific topics were about as accurate as corresponding articles in the Encyclopedia Britannica.
And in any case, the preparation of reference works often resembles a sausage factory more than it does a research facility. As the British writer Joseph McCabe pointed out more than 50 years ago in a critique of the Columbia Encyclopedia, the usual procedure is less meritocratic than one might suppose. “A number of real experts are paid handsomely to write and sign lengthy articles on subjects of which they are masters,” noted McCabe, “and the bulk of the work is copied from earlier encyclopedias by a large number of penny-a-liners.”
Nobody writing for Wikipedia is “paid handsomely,” of course. For that matter, nobody is making a penny a line. The problems with it are admitted even by fans like David Shariatmadari, whose recent article on Wikipedia ended with an appeal to potential encyclopedists “to get your ideas together, get registered, and contribute.”
Well, okay ... maybe. I’ll think about it at least. There’s still something appealing about Wells’s vision of bringing people together “into a more and more conscious co-operating unity and a growing sense of their own dignity” – through a “common medium of expression” capable of “informing without pressure or propaganda, directing without tyranny.”
If only we could do this without all the semi-mystical globaloney (then and now) about the World Brain. It would also be encouraging if there were a way around certain problems -- if, say, one could be sure that different dates wouldn’t be given for the year that Alexander Hamilton ended his term as Secretary of the Treasury.
New York City literary agent and head of the Third Culture movement John Brockman knows how to start a debate. He also knows which debates to avoid, which is why he and his like-minded authors have always stayed away from politics. Brockman and leading scientific thinkers like Pinker, Diamond and Dennett had set out to challenge the humanities by leading intellectual debates with the arguments of science. All the same, they had avoided the debate about intelligent design and the forays of Christian fundamentalists to get the American public to doubt Darwin's theory of evolution. In the past centuries there had rarely been grounds for debate between faith and science. . . .
Briefly after the symposium (he staged at Harvard this spring) Brockman had to deal with the tar pits of intelligent design debates after all and published the anthology of essays 'Intelligent Thought'. The book features some of the best science writers who are writing against the folly of creationsim with a passion, as if their life was at stake. Brockman remembers, when he decided to meddle in this debate: "Last fall the president, the majority leader of the Senate and Senator McCain all publicly declared their support to teach Intelligent Design alongside evolution in public schools."
The Wikipedia principle is digital Maoism, claims Jaron Lanier in Edge. In L'Express, Eric Hobsbawm and Jacques Attali celebrate Karl Marx as a thinker of globalisation. Segolene Royal sees things somewhat differently, we learn from Die Weltwoche. The Economist doesn't trust any robot. The New York Review of Books watches the opium industry in Afghanistan grow and thrive. The Spectator reports from Darfur. DU devotes itself to the people of the Critical Forests. In Le Point, Bernard-Henri Levy celebrates Angela Merkel as living proof of the continuing relevance of Simone de Beauvoir's work.
Edge.org | L`Express | The Economist | Die Weltwoche | The New York Review of Books | The Spectator | Il Foglio | Nepszabadsag | Folio | DU | Le point | Elsevier | The New York Times Book Review
Edge.org, 30.05.2006 (USA)
The best essays about the disconcerting media revolution known as the Internet continue to come from the USA. A fortnight ago in the New York Times Magazine, Kevin Kelly (more here) set out his euphoric vision of the Internet-based collective and the universal book. Almost immediately, although without direct reference to Kelly, Jaron Lanier (more here) penned an acerbic counter argument, criticising the collective spirit kindled by projects such as Wikipedia which believes a collective intelligence will aggregate by itself on the net without responsible authors. Lanier talks of a "new online collectivism" and the "resurgence of the idea that the collective is all-wise". "This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous." Lanier does not believe in erasing authorship: "The beauty of the Internet is that it connects people. The value is in the other people. If we start to believe that the Internet itself is an entity that has something to say, we're devaluing those people and making ourselves into idiots."
Lanier's essay provoked many people to enter into the debate at edge.org, Kevin Kelly among them.
Peter Oborne reports from Darfur: "When we visited the scene of the battle we found that bodies had been shoved hastily into mass graves. An arm stuck out from under one bush, and the flesh had been eaten by wild animals. A human foot obtruded from another grave. Dried pools of blood stained the ground. The stench of human putrefaction was heavy in the air. Bits and pieces of clothing, spent bullets and the protective amulets used by African fighters lay scattered on the ground. One body still lay exposed. The dead man had evidently climbed a tree to escape his attackers, but been shot down from his hiding place."
Il Foglio, 10.06.2006 (Italy)
The Golf GTI was, sociologically speaking at least, the forerunner of the now controversial SUV, writes Maurizio Crippa, and also the perfect symbol of the 80s. "If cars have a spirit, then it is certainly an evil one, demonic. The enemy is inside them, as with the man in Stephen King's 'Christine' of 1983. Christine might have been a 1958 Plymouth Fury, but its cursed spirit uncovered the ghastly depths of the GT decade and all the souped-up, turbo-boosted and drilled-out engines. That all came to an end in 1989, famously the year of salvation. The Golf, in particular the GTI, the black one – and we are not talking about the one with the rabbit's foot in the back – was aggressive, demanding, loud."
After substantial renovation, the legendary New York coffee house, one of the most important literary coffee houses of the Danube monarchy, has reopened. The writer Ivan Bächer recollects: "Once upon a time, not only the coffee house but the whole palace, even every room, every corner, every nook and cranny of the entire block of surrounding houses was full of journalists, writers, publishing houses and editorial offices." The new Italian owners have redeveloped the literary spirit to death, Bächer states disappointedly: "On the wall is a box of reinforced glass in which a dozen beautiful old books are hermetically sealed. A book safe. At the opening celebrations in 1895 the playwright Ferenc Molnar threw the keys to the coffee house into the Danube so that the splendid institution could never be closed again. After the reopening, perhaps someone should take the precaution of throwing the keys to the reinforced glass box into the Danube to prevent anyone from entertaining the idea of ever opening a book in these rooms." (Here and here photos of the coffee house in its heyday; here, here and here after the renovation.)
DU magazine focusses on Germany for the World Cup and has its correspondents report from every corner of the Bundesrepublik. As usual only a very small selection is available online, but Albrecht Tübke's photographic portraits which accompany the pieces of writing can be viewed here.
The lengthy discursive essays are less illuminating than the small atmospheric pieces such as the one by Svenja Leibe on the village where she grew up. "Drive off the motorway, on and on through the scattered settlements, none of which you will find surprising. Drive through them, but do not hope to see anything through the panorama windows of the bungalows, drive on down the curvy streets, past the pig farms, past the silver bunting of the car show rooms. Follow the neon coloured invitations to 'foam parties and barn raves'. Look out for people, you won't see many of them. Don't think the red lantern in front of the family house is a forgotten Christmas decoration. Drive. Drive down the pretty hill, on past the hidden building sites in the garden of the old pheasantry, down to the 'tank resistant' bridge that stoutly spans a tiny stream. The road runs directly into the heart of the village and to a little house behind a metre-long curve sign where it turns very sharply to the left. Don't look out of the window with too much interest here, you will only make them suspicious. There is nothing to buy any more. Leave them in peace. Let them file away at their gardens, take that seriously."
Inspired by Isaac Asimov's futuristic vision "I, Robot", The Economist asks in its Technology Quarterly how secure our future will be among robots. Do Asimov's three laws for the protection of humans hold today? "Regulating the behaviour of robots is going to become more difficult in the future, since they will increasingly have self-learning mechanisms built into them, says Gianmarco Veruggio, a roboticist at the Institute of Intelligent Systems for Automation in Genoa, Italy. As a result, their behaviour will become impossible to predict fully, he says, since they will not be behaving in predefined ways but will learn new behaviour as they go."
Other articles, dealing with new fuel cells, artificial neural networks in car motors and the victory march of Bluetooth (wireless personal area networks), are unfortunately not online. Not in the magazine but topical nonetheless: RoboCup, the world robot football championships taking place this week in Bremen.
Does globalisation make Karl Marx a "pioneer of modern thinking"? The question is tossed around in this issue by two individuals who are convinced the answer is yes: English historian Eric Hobsbawm and Jacques Attali, economist and former advisor to Francois Mitterrand, whose book "Karl Marx ou l'esprit du monde" was published last year. Hobsbawm finds a renewed interest in Marx entirely natural: "Today we are seeing the globalised economy that Marx anticipated. Still, he didn't foresee all of its repercussions. For example, the Marxist prophecy whereby an increasingly numerous proletariat topples capitalism in the industrial countries did not come about." Attali comments: "The Socialist International was a remarkable attempt on Marx's part to think the world in its entirety. Marx is an extraordinarily modern thinker, because rather than sketching the outlines of a socialist state, his writings describe the capitalism of the future."
Daniel Binswanger portrays Segolene Royal, the promising presidential candidate whose conservative views are pushing French socialists into an identity crisis. "Re-education camps for criminal youths controlled by the army, state paternalism of parents with authority problems, cutbacks in funding for people with delinquent children: for the last week people in France have been discussing a whole catalogue of measures aimed at coming to grips with youth violence in the banlieues. But for once the debate has not been set off by the hyperactive Minister of the Interior Nicolas Sarkozy. The French are rubbing their eyes in disbelief: as if in a political mirage, the discourse on law-and-order has changed camps."
What's become of lunch? A sandwich gulped down while you're walking. Folio presents this rule and exceptions to it.
Stephan Israel visits Michel Addons, cook for the Italian EU Commission: "Today there's lobster tails on spring rolls with ginger and oyster sauce. For the main course there's veal sweetbreads with new potatoes and green asparagus from Provence. For dessert there's strawberries on creme brulee. Today is the yearly visit of the much-feared auditors from Luxembourg."
Italian author Andrea Camilleri commiserates with those who have to swallow down a hamburger on the street, reminiscing about how his grandmother used to cook at noon. "As primo there was mostly pasta, as a gratin or with meat sauce, sometimes there was also melanzane alla parmigiana. As secondo there was poultry, lamb or fish, then cheese and sausages. Of course a lunch like that took its time. No one went back to work before four in the afternoon."
Bernard-Henri Levy is up in arms that no one in France has said a word about Simone de Beauvoir, who died 20 years ago. In his "notebook" column, he pays homage to seven women, all of whom are "proof of the timelessness of de Beauvoir's tremendous work": Hillary Clinton, Condoleezza Rice, Chilean president Michelle Bachelet, French politician Segolene Royal, women's rights activist Fadela Amara, Burmese Nobel Peace Prize laureate Aung San Suu Kyi and – German chancellor Angela Merkel. Levy writes: "Angela Merkel, 'that woman' as Gerhard Schröder, Putinist and world record holder in matters of corruption under a democracy, called her; that 'girl' who peeved him no end at the time of his election defeat... She, the specialist in quantum physics (elementary particles are not Michel Houellebecq's terrain, but hers), enjoys a popularity that has her predecessor, and all of Europe's heads of government, green with envy. And on top of that she's rehabilitating the finances of an economy that thanks to her is once more becoming what it always has been and should definitely be once more: the moving force in the European equation."
Five years after the American victory over the Taliban, Ahmed Rashid sees Afghanistan once more on the verge of collapse: "A revived Taliban movement has made a third of the country ungovernable. Together with al-Qaeda, Taliban leaders are trying to carve out new bases on the Afghanistan–Pakistan border. They are aided by Afghanistan's resurgent opium industry, which has contributed to widespread corruption and lawlessness, particularly in the south. The country's huge crop of poppies is processed into opium and refined into heroin for export, now accounting for close to 90 percent of the global market."
Further articles: Alan Ryan presents three books in which renowned philosophers – Kwame Anthony Appiah, Amartya Sen and Martha Nussbaum – address concepts of cultural diversity and cosmopolitanism. Freeman J. Dyson reviews Daniel C. Dennett's philosophical treatise on religion, "Breaking the Spell", in which Dennett pinpoints the real problem as "belief in belief": "He finds evidence that large numbers of people who identify themselves as religious believers do not in fact believe the doctrines of their religions but only believe in belief as a desirable goal."
The Web has given airline customers more convenience and more power. The ability to compare prices instantly at several airlines — something that was previously available only to travel agents — can't help but keep prices down.
But the airlines are still in control. The complicated algorithms they employ to analyze demand, competitors' prices and other data are the reason the same flight costs $350 one day and $550 the next. Here, online travel sites like Expedia and Travelocity aren't much help. That's because "Expedia's real customers are the travel companies — not you," writes John Battelle of Searchblog (battellemedia.com).
Farecast, a new Web service still being tested, monitors and analyzes price data and gives probabilities on when and by how much future fares might rise or fall. The "when" is crucial. Fares tend to fluctuate, but the trick is to know when they will hit their low point. Farecast is designed to predict it for you.
For Mr. Battelle, Farecast represents a potential return to the Web's early promise of shifting power to consumers. That was thwarted, he writes, when merchants began to collude with one another and with aggregators. "If you think AutoByTel or Expedia is on your side, you're kidding yourself."
One problem with Farecast is that it doesn't include Southwest Airlines, which doesn't supply information to aggregators. For Michael Arrington, that's a deal breaker. "Farecast is a nice solution that distills useful information from complete pricing chaos by the airlines," Mr. Arrington writes in his TechCrunch blog. But without Southwest, "the lowest and most understandable prices are excluded from the service."
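Farecast's actual forecasting models are proprietary and far more sophisticated, but the kind of question the service answers — "how likely is a lower fare in the next few days?" — can be sketched as a simple historical-frequency estimate. Everything below, including the function name and the sample fares, is an invented illustration, not Farecast's method:

```python
def drop_probability(history, horizon=7):
    """Estimate the chance that a fare drops below today's price within
    `horizon` days, as the historical frequency of such drops.
    A toy stand-in for the statistical models a real service would use."""
    drops = 0
    windows = 0
    for i in range(len(history) - horizon):
        windows += 1
        # Did any fare in the next `horizon` days beat day i's price?
        if min(history[i + 1 : i + 1 + horizon]) < history[i]:
            drops += 1
    return drops / windows if windows else 0.0

# Invented daily fares in dollars for one route.
fares = [350, 380, 360, 550, 410, 390, 420, 370, 450, 430, 400]
p = drop_probability(fares, horizon=3)  # chance a lower fare appears within 3 days
```

On the invented history above the estimate comes out above one half, so a buyer quoted a high fare might wait; a constant fare history yields a probability of zero. The crucial "when" in the article corresponds to the `horizon` parameter.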
The Trouble With Wikis
There is nothing wrong, per se, with Wikipedia, writes Jaron Lanier, the computer scientist, artist and author, in a provocative essay on the Web site Edge: The Third Culture (edge.org). Rather, he says, the problem is how Wikipedia is used and the way it has been elevated to such importance so quickly.
Is it a good idea to rely on an encyclopedia that can be changed on a whim by any number of anonymous users? Is relying on the "hive mind" envisioned by the former Wired magazine editor Kevin Kelly the way to go about using the Web?
Usually not, Mr. Lanier writes. Doing so amounts to taking techno-utopianism to its extreme — favoring the tool over the worker, and the collective over the individual.
The kind of "foolish collectivism" represented by Wikipedia — as well as "meta" sites like Digg, Reddit and popurls, which aggregate sites based on popularity-driven algorithms — grinds away the Web's edges and saps it of its humanity, he argues. "The fallacy of the infallible collective" gives such sites more credibility than they deserve, he writes.
Often, he acknowledges, the hive mind is smarter than any individual — in determining prices, for example. "The collective is good at solving problems which demand results that can be evaluated by uncontroversial performance parameters," he writes, "but it is bad when taste and judgment matter." Often, he says, it is best to combine the strengths of the hive mind with those of the individual — as with open-source software.
"The best guiding principle is to always cherish individuals first," he concludes.
Luminaries like Mr. Kelly, Douglas Rushkoff, Esther Dyson, Howard Rheingold and Jimmy Wales, a founder of Wikipedia, reacted to the essay, "Digital Maoism," on Edge.
On Wikipedia, Mr. Kelly said, there is "far more deliberate design management going on than first appears."
The "bottom-up hive mind will never take us to our end goal," he adds. "We are too impatient. So we add design and top-down control to get where we want to go."
Monkey Chow Diaries
"Imagine going to the grocery store only once every six months," paying less than $1 a meal, Adam Scott, a blogger, writes. His imaginings led him to experiment with a diet of nothing but Monkey Chow. It is "a complete and balanced diet for the nutrition of primates," he says. Track his — um, progress? — at angryman.ca/monkey.html. DAN MITCHELL
Two weeks ago, Edge.org published Jaron Lanier's essay "Digital Maoism: The Hazards of the New Online Collectivism," critiquing the importance people are now placing on Wikipedia and other examples of the "hive mind," as people called it in the cyberdelic early 1990s. It's an engaging essay to be sure, but much more thought-provoking to me are the responses from the likes of Clay Shirky, Dan Gillmor, Howard Rheingold, our own Cory Doctorow, Douglas Rushkoff, and, of course, Jimmy Wales.
From Douglas Rushkoff:
I have a hard time fearing that the participants of Wikipedia or even the call-in voters of American Idol will be in a position to remake the social order anytime soon. And I'm concerned that any argument against collaborative activity look fairly at the real reasons why some efforts turn out the way they do. Our fledgling collective intelligences are not emerging in a vacuum, but on media platforms with very specific biases.
First off, we can't go on pretending that even our favorite disintermediation efforts are revolutions in any real sense of the word. Projects like Wikipedia do not overthrow any elite at all, but merely replace one elite — in this case an academic one — with another: the interactive media elite...
While it may be true that a large number of current websites and group projects contain more content aggregation (links) than original works (stuff), that may as well be a critique of the entirety of Western culture since post-modernism. I'm as tired as anyone of art and thought that exists entirely in the realm of context and reference — but you can't blame Wikipedia for architecture based on winks to earlier eras or a music culture obsessed with sampling old recordings instead of playing new compositions.
Honestly, the loudest outcry over our Internet culture's inclination towards re-framing and the "meta" tend to come from those with the most to lose in a society where "credit" is no longer a paramount concern. Most of us who work in or around science and technology understand that our greatest achievements are not personal accomplishments but lucky articulations of collective realizations. Something in the air... Claiming authorship is really just a matter of ego and royalties.
From Cory Doctorow:
Wikipedia isn't great because it's like the Britannica. The Britannica is great at being authoritative, edited, expensive, and monolithic. Wikipedia is great at being free, brawling, universal, and instantaneous.
From Jimmy Wales (italics indicate quotes from Jaron's original essay):
"A core belief of the wiki world is that whatever problems exist in the wiki will be incrementally corrected as the process unfolds."
My response is quite simple: this alleged "core belief" is not one which is held by me, nor as far as I know, by any important or prominent Wikipedians. Nor do we have any particular faith in collectives or collectivism as a mode of writing. Authoring at Wikipedia, as everywhere, is done by individuals exercising the judgment of their own minds.
"The best guiding principle is to always cherish individuals first."
UPDATE: Jaron Lanier writes us that he's received a lot of negative feedback from people who he thinks may not have actually read his original essay:
In the essay i criticized the desire (that has only recently become influential) to create an "oracle effect" out of anonymity on the internet - that's the thing i identified as being a new type of collectivism, but i did not make that accusation against the wikipedia - or against social cooperation on the net, which is something i was an early true believer in- if i remember those weird days well, i think i even made up some of the rhetoric and terminology that is still associated with net advocacy today- anyway, i specifically exempted many internet gatherings from my criticism, including the wikipedia, boingboing, google, cool tools... and also the substance of the essay was not accusatory but constructive- the three rules i proposed for creating effective feedback links to the "hive mind" being one example.
Collectives have their uses, but writing encyclopedias? With no firm editorial hand? Call it the Wikipedia problem...
More than a dozen years ago I was involved in a project to build an internet-delivered encyclopedic reference source. Those of us who worked on it were dazzled by the potential that seemed to be opening up before us. There was a worldwide communication network that anyone could use; here in hand was the most comprehensive and authoritative general reference work in the English language; and in between us and the goal that grew more ambitious each day were only some technical challenges and the limits of our imaginations. It was a wonderful time to be an encyclopedia editor.
Well, things didn't work out just as we hoped, for reasons too numerous to mention here. I recall this episode mainly to make the point that I understand the enthusiasm, the evangelism, that Wikipedia evokes in many, many people. I wish I could share it with them now. But, as David Shariatmadari's openDemocracy article "The sultan and the glamour model" (25 May 2006) shows once again, Wikipedia's most eloquent advocates fail, or refuse, to acknowledge certain issues.
Bias and imbalance
Shariatmadari's article praises the work of a group calling itself by the unfortunately self-congratulatory label Wikiproject: Countering Systemic Bias and ends with a call for more such efforts to improve the coverage of the encyclopedia. Certainly such work is needed. I would suggest that it needs to begin with a clear distinction between "bias" and "imbalance", terms that Shariatmadari uses interchangeably but that to an editor mean quite different things. The Wikiproject seems to concern itself with topics that are treated in insufficient detail or not at all; to me, this is addressing imbalance. "Bias" denotes a lack of objectivity or fairness in the treatment of topics. Thus, when a writer called Joseph McCabe alleged in a widely distributed pamphlet that certain articles in the Encyclopedia Britannica had been unduly influenced by the Catholic church, he was charging bias. (That was in 1947, and he was quite wrong, by the way.)
Is imbalance in Wikipedia "systemic"? I should rather say that it results inevitably from a lack of system. Given the method by which Wikipedia articles are created, for there to be any semblance of balance in the overall coverage of subject-matter would be miraculous. Balance results from planning. As an example, the planning of the coverage of the fifteenth edition of Britannica took an in-house staff and dozens of advisers several years to complete. That was forty years ago; it would be harder now.
It is unremarkable that the topics covered at present in Wikipedia reflect the interests of those who contribute to it, and that these contributors represent a relatively narrow, self-selected segment of society. In the absence of planning and some degree of central direction, how else could it have been?
It is well to bear in mind also that imbalance is a judgment, not a fact, and that it cannot be reduced to numbers. To say that article A is longer than article B is not to show that B has not been given its due. Some subjects require more background, more context, more sheer wordage to convey a sense of understanding to the reader. Are 260 lines too much to devote to the Scots language? Clearly, someone does not think so. Someone else might well feel that there ought to be much more. Three lines for the language of the Yi is almost certainly too few, but what is the right number? Who – I'm asking for a showing of hands here – knows? What is lacking is not some numerical standard but editorial standards: a set of principles that define what constitutes adequate treatment of various kinds of topics for an intended audience.
Truth and openness
David Shariatmadari writes that the situation is "uncannily like free market economics applied to knowledge." This is quite inapt. I suppose it is meant to shock; what could be worse than, you know, capitalism? I'll just point out that another shocking word that might properly be applied to Wikipedia is "globalist." Sorry, but I calls 'em as I sees 'em.
More seriously, a better analogy might be a children's soccer team. It is notorious that, in the United States, at least, a game involving the youngest children will consist of a swarm of twenty or so players buzzing ineffectively about the ball. As the children grow older, however, they will develop individual skills and learn to play positions and to execute strategies. Just so, traditionally, have editors honed skills, learned appropriate methods and processes, and developed the synoptic view required by the job.
No complex project can be expected to yield satisfactory results without a clear vision of what the goal is – and here I mean what a worthy internet encyclopedia actually looks like – and a plan to reach that goal, which will include a careful inventory of the needed skills and knowledge and some meaningful measures of progress. To date, the "hive mind" of Wikipedia's "digital Maoism" (as Jaron Lanier's vigorous critique on edge.org calls it) displays none of these.
That vision of the goal must do something that Wikipedia and Wikipedians steadfastly decline to do today, and that is to consider seriously the user, the reader. What is the user meant to take away from the experience of consulting a Wikipedia article? The most candid defenders of the encyclopedia today confess that it cannot be trusted to impart correct information but can serve as a starting-point for research. By this they seem to mean that it supplies some links and some useful search terms to plug into Google. This is not much. It is a great shame that some excellent work – and there is some – is rendered suspect both by the ideologically required openness of the process and by association with much distinctly not excellent work that is accorded equal standing by that same ideology.
One simple fact that must be accepted as the basis for any intellectual work is that truth – whatever definition of that word you may subscribe to – is not democratically determined. And another is that talent, whether for soccer or for exposition, is not equally distributed across the population, while a robust confidence in one's own views apparently is. If there is a systemic bias in Wikipedia, it is to have ignored so far these inescapable facts.
"powerful and persuasive essays"
In this paperback original, 16 noted scientists, including Steven Pinker and Richard Dawkins, refute the "intelligent design" movement in powerful and persuasive essays.