Two weeks ago, Edge.org published Jaron Lanier's essay "Digital Maoism: The Hazards of the New Online Collectivism," critiquing the importance people are now placing on Wikipedia and other examples of the "hive mind," as people called it in the cyberdelic early 1990s. It's an engaging essay to be sure, but much more thought-provoking to me are the responses from the likes of Clay Shirky, Dan Gillmor, Howard Rheingold, our own Cory Doctorow, Douglas Rushkoff, and, of course, Jimmy Wales.
From Douglas Rushkoff:
I have a hard time fearing that the participants of Wikipedia or even the call-in voters of American Idol will be in a position to remake the social order anytime soon. And I'm concerned that any argument against collaborative activity look fairly at the real reasons why some efforts turn out the way they do. Our fledgling collective intelligences are not emerging in a vacuum, but on media platforms with very specific biases.
First off, we can't go on pretending that even our favorite disintermediation efforts are revolutions in any real sense of the word. Projects like Wikipedia do not overthrow any elite at all, but merely replace one elite — in this case an academic one — with another: the interactive media elite...
While it may be true that a large number of current websites and group projects contain more content aggregation (links) than original works (stuff), that may as well be a critique of the entirety of Western culture since post-modernism. I'm as tired as anyone of art and thought that exists entirely in the realm of context and reference — but you can't blame Wikipedia for architecture based on winks to earlier eras or a music culture obsessed with sampling old recordings instead of playing new compositions.
Honestly, the loudest outcry over our Internet culture's inclination towards re-framing and the "meta" tends to come from those with the most to lose in a society where "credit" is no longer a paramount concern. Most of us who work in or around science and technology understand that our greatest achievements are not personal accomplishments but lucky articulations of collective realizations. Something in the air... Claiming authorship is really just a matter of ego and royalties.
From Cory Doctorow:
Wikipedia isn't great because it's like the Britannica. The Britannica is great at being authoritative, edited, expensive, and monolithic. Wikipedia is great at being free, brawling, universal, and instantaneous.
From Jimmy Wales (italics indicate quotes from Jaron's original essay):
"A core belief of the wiki world is that whatever problems exist in the wiki will be incrementally corrected as the process unfolds."
My response is quite simple: this alleged "core belief" is not one which is held by me, nor as far as I know, by any important or prominent Wikipedians. Nor do we have any particular faith in collectives or collectivism as a mode of writing. Authoring at Wikipedia, as everywhere, is done by individuals exercising the judgment of their own minds.
"The best guiding principle is to always cherish individuals first."
UPDATE: Jaron Lanier writes us that he's received a lot of negative feedback from people who he thinks may not have actually read his original essay:
In the essay i criticized the desire (that has only recently become influential) to create an "oracle effect" out of anonymity on the internet - that's the thing i identified as being a new type of collectivism, but i did not make that accusation against the wikipedia - or against social cooperation on the net, which is something i was an early true believer in- if i remember those weird days well, i think i even made up some of the rhetoric and terminology that is still associated with net advocacy today- anyway, i specifically exempted many internet gatherings from my criticism, including the wikipedia, boingboing, google, cool tools... and also the substance of the essay was not accusatory but constructive- the three rules i proposed for creating effective feedback links to the "hive mind" being one example.
More than a dozen years ago I was involved in a project to build an internet-delivered encyclopedic reference source. Those of us who worked on it were dazzled by the potential that seemed to be opening up before us. There was a worldwide communication network that anyone could use; here in hand was the most comprehensive and authoritative general reference work in the English language; and in between us and the goal that grew more ambitious each day were only some technical challenges and the limits of our imaginations. It was a wonderful time to be an encyclopedia editor.
Well, things didn't work out just as we hoped, for reasons too numerous to mention here. I recall this episode mainly to make the point that I understand the enthusiasm, the evangelism, that Wikipedia evokes in many, many people. I wish I could share it with them now. But, as David Shariatmadari's openDemocracy article "The sultan and the glamour model" (25 May 2006) shows once again, Wikipedia's most eloquent advocates fail, or refuse, to acknowledge certain issues.
Bias and imbalance
Shariatmadari's article praises the work of a group calling itself by the unfortunately self-congratulatory label Wikiproject: Countering Systemic Bias and ends with a call for more such efforts to improve the coverage of the encyclopedia. Certainly such work is needed. I would suggest that it needs to begin with a clear distinction between "bias" and "imbalance", terms that Shariatmadari uses interchangeably but that to an editor mean quite different things. The Wikiproject seems to concern itself with topics that are treated in insufficient detail or not at all; to me, this is addressing imbalance. "Bias" denotes a lack of objectivity or fairness in the treatment of topics. Thus, when a writer called Joseph McCabe alleged in a widely distributed pamphlet that certain articles in the Encyclopedia Britannica had been unduly influenced by the Catholic church, he was charging bias. (That was in 1947, and he was quite wrong, by the way.)
Is imbalance in Wikipedia "systemic"? I should rather say that it results inevitably from a lack of system. Given the method by which Wikipedia articles are created, for there to be any semblance of balance in the overall coverage of subject-matter would be miraculous. Balance results from planning. As an example, the planning of the coverage of the fifteenth edition of Britannica took an in-house staff and dozens of advisers several years to complete. That was forty years ago; it would be harder now.
It is unremarkable that the topics covered at present in Wikipedia reflect the interests of those who contribute to it, and that these contributors represent a relatively narrow, self-selected segment of society. In the absence of planning and some degree of central direction, how else could it have been?
It is well to bear in mind also that imbalance is a judgment, not a fact, and that it cannot be reduced to numbers. To say that article A is longer than article B is not to show that B has not been given its due. Some subjects require more background, more context, more sheer wordage to convey a sense of understanding to the reader. Are 260 lines too much to devote to the Scots language? Clearly, someone does not think so. Someone else might well feel that there ought to be much more. Three lines for the language of the Yi is almost certainly too few, but what is the right number? Who – I'm asking for a showing of hands here – knows? What is lacking is not some numerical standard but editorial standards: a set of principles that define what constitutes adequate treatment of various kinds of topics for an intended audience.
Truth and openness
David Shariatmadari writes that the situation is "uncannily like free market economics applied to knowledge." This is quite inapt. I suppose it is meant to shock; what could be worse than, you know, capitalism? I'll just point out that another shocking word that might properly be applied to Wikipedia is "globalist." Sorry, but I calls 'em as I sees 'em.
More seriously, a better analogy might be a children's soccer team. It is notorious that, in the United States, at least, a game involving the youngest children will consist of a swarm of twenty or so players buzzing ineffectively about the ball. As the children grow older, however, they will develop individual skills and learn to play positions and to execute strategies. Just so, traditionally, have editors honed skills, learned appropriate methods and processes, and developed the synoptic view required by the job.
No complex project can be expected to yield satisfactory results without a clear vision of what the goal is – and here I mean what a worthy internet encyclopedia actually looks like – and a plan to reach that goal, which will include a careful inventory of the needed skills and knowledge and some meaningful measures of progress. To date, the "hive mind" of Wikipedia's "digital Maoism" (as Jaron Lanier's vigorous critique on edge.org calls it) displays none of these.
That vision of the goal must do something that Wikipedia and Wikipedians steadfastly decline to do today, and that is to consider seriously the user, the reader. What is the user meant to take away from the experience of consulting a Wikipedia article? The most candid defenders of the encyclopedia today confess that it cannot be trusted to impart correct information but can serve as a starting-point for research. By this they seem to mean that it supplies some links and some useful search terms to plug into Google. This is not much. It is a great shame that some excellent work – and there is some – is rendered suspect both by the ideologically required openness of the process and by association with much distinctly not excellent work that is accorded equal standing by that same ideology.
One simple fact that must be accepted as the basis for any intellectual work is that truth – whatever definition of that word you may subscribe to – is not democratically determined. And another is that talent, whether for soccer or for exposition, is not equally distributed across the population, while a robust confidence in one's own views apparently is. If there is a systemic bias in Wikipedia, it is to have ignored so far these inescapable facts.
"powerful and persuasive essays"
In this paperback original, 16 noted scientists, including Steven Pinker and Richard Dawkins, refute the "intelligent design" movement in powerful and persuasive essays.
Jaron Lanier, who more or less invented virtual reality in the 1980s (making me a lifelong Lanier fan), has published a fascinating Edge essay on Digital Maoism: The Hazards of the New Online Collectivism.
The opening gambit is: "The hive mind is for the most part stupid and boring. Why pay attention to it?" What he is pointing to is the collective output exemplified by Wikipedia etc., meta-sources of information such as Google, and meta-meta-meta sources such as (in increasing order of meta-ness) Boing Boing, Digg and Popurls.
It's not hard to see why the fallacy of collectivism has become so popular in big organizations: If the principle is correct, then individuals should not be required to take on risks or responsibilities. We live in times of tremendous uncertainties coupled with infinite liability phobia, and we must function within institutions that are loyal to no executive, much less to any lower level member. Every individual who is afraid to say the wrong thing within his or her organization is safer when hiding behind a wiki or some other Meta aggregation ritual.
I've participated in a number of elite, well-paid wikis and Meta-surveys lately and have had a chance to observe the results. I have even been part of a wiki about wikis. What I've seen is a loss of insight and subtlety, a disregard for the nuances of considered opinions, and an increased tendency to enshrine the official or normative beliefs of an organization. Why isn't everyone screaming about the recent epidemic of inappropriate uses of the collective? It seems to me the reason is that bad old ideas look confusingly fresh when they are packaged as technology.
Why do we do it? As Lanier points out later:
It's safer to be the aggregator of the collective. You get to include all sorts of material without committing to anything. You can be superficially interesting without having to worry about the possibility of being wrong.
Comment: Edge is based on the idea of accumulating the knowledge of a very small number of the world's smartest people -- more or less the opposite of Google or Wikipedia.
The intelligent-design movement suffered a political setback last December when a federal judge ordered a Pennsylvania school district to stop talking about it in high school, but it lives on as an idea, to the bemusement and occasional frustration of most serious scientists. Sixteen of them, including Dennett, contributed essays in defense of evolution to a small anthology called "Intelligent Thought," published last week. It was compiled by John Brockman, better known as the editor of the Web site edge.org, the thinking man's Drudge Report. Evolutionary biologist Richard Dawkins deconstructs the claim by ID proponents that the "designer" could be an intelligent alien rather than God, and psychologist Steven Pinker shows how moral sensibility can arise by way of natural selection. "Evolutionary biology certainly hasn't explained everything that perplexes biologists," Dennett concludes, "but Intelligent Design hasn't yet tried to explain anything at all."
It's just an empty glass box now, but this site will become the world's most powerful nerd magnet tomorrow. Expect to see geeks flying through the air towards it, whoosh! over Manhattan, like steel dust drawn to a neodymium disc. Many thanks to literary uber-agent John Brockman for the photo. Link to full-size (jpeg). Steve Jurvetson has some thoughts about it here.
The worst kind of argument to have is one with someone who Just Doesn't Get It. The debates that find your well-reasoned points countered with the tautological equivalent of "nuh-uh" or "because, that's why" may not make you feel like you lost the argument, but you certainly don't feel like you won, either. Especially when the topic you're disagreeing on isn't even something that should be up for debate.
That's the overriding sense one suspects the writers of the essays in Intelligent Thought were experiencing when they put pen to paper. More than one of them, I'm sure, muttered to himself: "I can't fucking believe I'm having to write this."
By elegantly and eloquently explaining the airtight science behind Darwinism (not a theory anymore, by the way, but a scientifically proven fact) and deftly swatting away the distortions and dogma that define ID, Brockman and the other contributors to Intelligent Thought may not end the "debate" with this book, but they've managed to provide an excellent and readable primer on evolution and the power of the scientific method.
Intellectuals are not just people who know things, but people who shape the thoughts of their generation...
Edge is not so much the "Internet as highbrow cocktail party," as it is the "Internet as Center for Advanced Studies." Here, Brockman and the leading thinkers in a raft of scientific and social disciplines exchange ideas and build theories…and we get to watch.
EVERYONE HAS A fleeting fantasy in which they are reborn as, say, a Hollywood star or a stupendously wealthy author. My occasional fancy is that I am a science reporter of some repute, bringing beard-tuggingly important matters — such as the dialogue between science and religion — to the attention of readers and opinion-formers.
So I flirted with the idea of applying for a Templeton-Cambridge Journalism Fellowship in Science and Religion. The placement at Cambridge University would undoubtedly be fun — I’d spend two months listening to scientists, religious scholars and philosophers. I’d hang out with serious thinkers, meet high-minded hacks, my credentials as an intellectual would soar. With a stipend of about £10,000, plus book allowance and travel expenses, it wouldn’t be a badly paid gig, either.
The only hitch, apart from selling the jolly to my editors, was the origin of the cheque. The John Templeton Foundation is an enormously wealthy charity that awards an annual prize of $1.4 million for Progress Toward Research or Discoveries about Spiritual Realities (Sir John Templeton, a financier, insisted that the prize should be more lucrative than the Nobel Prize).
Over the past decade Templeton prizes have gone to scientists who have explored such concepts as nothingness, infinity, and multiple universes, exactly the kind of “wow” subjects that inspire awed contemplation. Next month the Cambridge University cosmologist John D. Barrow will receive his cheque at Buckingham Palace; he is praised for work that “has illuminated understanding of the Universe and cast the intrinsic limitations of scientific inquiry into sharp relief” .
Ah, yes, the “limitations of scientific enquiry”. This quote hints at the religious agenda of the foundation, which has become a significant donor to such institutions as Oxford University, where it is funding research to discover whether religious belief can reduce pain. The foundation is also paying for studies about the effect of prayer on health. That would be fine, were it not for the aims stated on the section of its website devoted to spirituality and health: “. . . the foundation hopes to contribute to the reintegration of faith into modern life”.
The foundation wisely rejects intelligent design but nevertheless emphasises the metaphysical dimension of any funded research: “What can research tell us about God, about the nature of divine action in the world, about meaning and purpose?” it asks. Which, to my reading, assumes the existence of both God and divine action.
Anyway, at the end of their jaunt, Templeton journalism fellows are “encouraged to write and publish news stories, editorial pieces, or magazine articles ... contributing to a more informed public discussion of the relationship between science and religion”.
Now, consider that one of my more memorable articles about just this topic contended that illusions of the divine may point to mental illness. Another article rubbished a study that claimed that childless couples could double their chances of IVF success by getting strangers to pray for them. Neither study was associated in any way with the foundation, but I wonder whether it would have considered those pieces “more informed”?
My vague misgivings have now been articulated by John Horgan, a science writer and agnostic who became a 2005 Templeton fellow. “I rationalised that taking the foundation’s money did not mean that it had bought me, as long as I remained true to my views,” he wrote last week in The Chronicle of Higher Education, the US equivalent of The Times Higher.
So, what happened when Horgan told a foundation official that he had no wish for religion and science to be reconciled? “She told us that . . . she didn’t think someone with those opinions should have accepted a fellowship.”
I applaud those writers who become Templeton fellows; I commend their desire to learn more and I wish them well in their efforts to keep an open mind. In truth, I envy them their two-month summer sabbatical.
Perhaps I lack backbone, but I worry that accepting the foundation’s largesse might make me a bit soft. And a soft reporter is the last thing needed by infertile couples who wrongly believe that a stranger’s prayer will help to bring them a child.
From left, Elizabeth Spelke, John Brockman, Seth Lloyd, and Daniel C. Dennett engage in a panel discussion of the book “What We Believe But Cannot Prove,” to which they all contributed, in Radcliffe’s Longfellow Hall yesterday.
Last night, three Harvard professors, a Massachusetts Institute of Technology (MIT) professor, and a Tufts professor provided their own answers to this question before a crowded audience in Askwith Lecture Hall at the Graduate School of Education.
The ideas that they debated included individual consciousness, a common human gene pool, and the existence of electrons.
The discussion, sponsored by the Harvard Bookstore and Seed Magazine, marked the recent release of the essay collection, “What We Believe But Cannot Prove: Today’s Leading Thinkers on Science in the Age of Certainty,” which was edited by John Brockman.
The panelists, who all contributed essays to the book, included Harvard psychology professors Daniel Gilbert, Marc D. Hauser, and Elizabeth Spelke, as well as Tufts philosophy professor Daniel C. Dennett and MIT engineering professor Seth Lloyd.
Spelke said she believes human beings are alike, but that she also believes they are predisposed to believe they are fundamentally different.
She said, though, that she remained convinced that people are capable of overcoming their beliefs when these are disproved.
Gilbert claimed that “the only fact that proves itself is our own experience.”
“The fact of your experience is not a fact to me,” he said.
He argued that we can demonstrate “to our own satisfaction” that a creature has a consciousness.
After the introductory remarks, discussion focused largely on human consciousness and language.
The ability to communicate through language may be a key to determining other people’s consciousness and experience, according to Dennett.
But the panelists also pointed out that another unique feature of language is that it allows humans to hide intentions.
“God gave us language so we can conceal our thoughts,” Dennett responded.
Questions from the audience focused on issues such as free will, spirituality, and what constitutes certainty and proof.
“Proof is that which makes everybody shut up,” Gilbert said.
“No,” Hauser responded to break the silence, prompting laughter from the audience.
Every year, the website www.edge.org, which brings together the world's most important and prestigious scientists, opens the calendar by putting a crucial question to its members. This year's was nothing less than: what is the most dangerous idea in the world? Below, the ten most explosive answers, plus a bonus.
...Presented with photos on a screen, the white Israeli infants preferred looking at new faces of their own race; African babies raised in Ethiopia preferred to look at African faces. But the Ethiopian-Israeli infants, who had been exposed since birth to people of both races, showed no preference. The import of this study is ambiguous, Spelke said. The finding could mean that babies aren't born prejudiced after all—that they learn to be wary of others only if they grow up in an isolated environment. Or it could mean that babies are programmed to prefer people who look more like their own parents, and this instinct can be counterbalanced through enlightened education.
If the latter interpretation proved to be the case, Spelke would be optimistic. As she recently posted on Edge [*], a Web publication that airs scientific controversies, just as core intuitions about geometry once led humans to believe that the world was flat—until the science that humans perfected proved otherwise—"core intuitions might lead us to believe that linguistic and racial differences mean something more fundamental than they really do."
"Nobody should ever be troubled by our research, whatever we come to find," Spelke told me. "Everybody should be troubled by the phenomena that motivate it: the pervasive tendency of people all over the world to categorize others into different social groups, despite our common and universal humanity, and to endow these groups with social and emotional significance that fuels ethnic conflict and can even lead to war and genocide." This mirrors her belief that, in time, feminism will embolden more women to take up high-level careers in the physical sciences, and more of us will recognize how alike men's and women's minds really are. For Spelke, who has spent most of her life documenting the core knowledge that we're born with, the most important thing about it is our uniquely human ability to rise above it.
[* ED. NOTE: See "The Science of Gender and Science—Pinker vs. Spelke, A Debate"]
John Brockman: 40 years of "intermedia kinetic environments"
Here's what the New York Times had to say about "cultural impresario," sci/tech literary uber-agent, and EDGE founder John Brockman -- 40 years ago, today. Snip from "So What Happens After Happenings," an article dated Sunday, September 4, 1966. "Hate Happenings. Love Intermedia Kinetic Environments." John Brockman is partly kidding, while conveying the notion that Happenings are Out and Intermedia Kinetic Environments are In in the places where the action is.
John Brockman, the New York Film Festival's 25-year-old coordinator of a special events program on independent cinema in the United States, plugging into the switched-on "expanded cinema" world in which a film is not just a movie, but an Experience, an Event, an Environment. ...
...The above are the opinions of experts on profound issues of love, consciousness, and the existence of God. Laypeople, however, reach similar conclusions with the help of their common sense, which is often vague, prejudiced, and what an expert would term irrational. Paradoxically, the rational and the irrational mind reach a similar conclusion, though from opposite directions. What, then, is the path to truth?
Democracy is not the best way to rule a country. The concept of free will will disappear the more we learn about the brain. The internet undermines the quality of our relationships. Read the world's leading minds list their most dangerous ideas.
You might have wondered who all those people are who write openly mean anonymous comments online. Face to face, most people are pretty well behaved, but a worrying number of them show a whole other face protected by their digital Ku Klux Klan hood. The danger of anonymity is one of the ideas under debate when New York-based literary agent John Brockman asks the world's leading thinkers about their most dangerous ideas. ...
"What do you believe to be true even though you cannot prove it?" Edge editor John Brockman wanted to know last year. Before that, questions such as "What is the most important untold story?", "What is the most significant invention of the past two thousand years?", "What are the most pressing scientific problems?" or simply "What now?" had been thrown into the ring. The question for 2006 has managed to heighten the general atmosphere of urgency still further. "What is your dangerous idea?" Edge wants to know. 172 scientists answered, all of whom regard themselves as part of the Third Culture community and uphold the ideal of an intellectual guided by the natural sciences rather than by literature.
172 scientists responded to the 2006 Edge Question
For nine years now, the Edge foundation has opened each new year with a survey on one big general question. This time 172 scientists responded, revealing what they consider to be their most dangerous idea, one that could come true.
In a front-page article, Il Sole 24 Ore, Italy's largest financial daily, announced the "Edge Question Forum" in "Domenica", the weekend Arts & Culture section. The Forum, an ongoing project designed to bring third culture thinking to Italy, features excerpts from the Edge responses in addition to articles solicited from Italian humanist intellectuals and scientists.
What We Believe But Cannot Prove: Today’s Leading Thinkers on Science in the Age of Certainty
Edited by John Brockman
Harper Perennial, 252 pp., paperback, $13.95
Chasing Spring: An American Journey Through a Changing Season
By Bruce Stutz
Scribner, 239 pp., $24
Pilgrim on the Great Bird Continent: The Importance of Everything and Other Lessons From Darwin’s Lost Notebooks
By Lyanda Lynn Haupt
Little, Brown, 276 pp., illustrated, $24.95
For the past eight years, the website www.edge.org has tried to provoke its distinguished roster of contributors with a big, elegant question. Last year's question was this: What do you believe to be true even though you cannot prove it?
A hundred and nine prominent thinkers, including folks as accomplished as Richard Dawkins, Steven Pinker, Rebecca Goldstein, and Freeman Dyson, responded. Their answers are collected in a new book, ''What We Believe But Cannot Prove," and it makes for some astounding reading.
What do they believe (but can't prove)? Many believe there is an external reality independent of consciousness. Many believe life is pervasive in the universe. Several believe, in the words of neurologist Robert M. Sapolsky, that ''there is no God(s) or such a thing as a soul." Theoretician Judith Rich Harris believes the Neanderthals disappeared because Homo sapiens ate them. Two believe there is a God. One believes in true love. The longtime New Scientist editor Alun Anderson believes cockroaches are conscious. Harvard professor Daniel Gilbert believes you are true -- that is, he believes you have an inner life and a sense of self -- even though he cannot prove it.
Taken as a whole, this little compendium of essays will send you careening from mathematics to economics to the moral progress of the human race, and it is marvelous to watch this muddle of disciplines overlap. Will the human brain eventually be able to discover all there is to discover about the physical world? Or will there always be things that we will not know?
A few months before edge.org proposed its 2005 question, former Natural History editor Bruce Stutz was recovering from heart-valve surgery in the torpid gloom of a Brooklyn winter. Dazed, drained of energy, and feeling suddenly that his ''most verdant years" were behind him, Stutz began yearning for spring.
He yearned not just for the material manifestations of the season, but for its sparkle, the ''transformative energy possessed by growing, blossoming, transmuting things." By the end of June he had traveled almost 10,000 miles in a 1984 Chevy Impala, watching spring creep northward across the United States, seeing everything he could, and writing it all down.
Part travelogue, part environmental assessment, part midlife crisis, ''Chasing Spring" is as much about Stutz himself as it is about the season. He slogs from Louisiana to Arizona to Utah to Alaska, chatting with scientists, watching birds, testing his heart on mountain passes and in the tundra of the Arctic National Wildlife Refuge. By the end of his odyssey, the spring of Arctic Alaska and the spring of Stutz's own soul have become inseparable.
''Chasing Spring" is an eclectic and digressive book. Its author makes the baffling claim that 18,000-foot peaks rise from the deserts of Arizona. He is effusive enough to offer abbreviated ruminations on the Great Salt Lake, the Roman festival of Lupercalia, and the picking of morel mushrooms.
But the charm of ''Chasing Spring" is in its raw enthusiasm, Stutz's personal invigoration braided into the ongoing invigoration of the continent. ''Bring on that juice and joy!" he writes. ''I'm ready now for the spring forest in its mist, spring in its musk, in its spring greens: canopy green, shafts of wild iris green." Reading Stutz is a bit like reading Whitman: You imagine the author stomping gleefully in puddles, peering on his hands and knees into the forget-me-nots and saxifrage.
Spring, of course, means someone is publishing another book about Darwin. Thankfully, this year it's an excellent one. Lyanda Lynn Haupt's ''Pilgrim on the Great Bird Continent" probes Darwin's journals, pocket notebooks, and letters with the goal of understanding how a beetle-obsessed, squeamish, overprivileged 22-year-old could spend five years circumnavigating South America and emerge as a polished naturalist whose vision would change human understanding forever.
Even more than “Chasing Spring,” “Pilgrim” is about what Haupt calls “deep watchfulness.” She argues that Darwin's notebooks “foist upon us his strict but beautiful maxim. Nothing in the natural world is beneath our notice -- he almost whacks us on the head with it. Nothing.”
Here is a young, apprehensive, occasionally self-absorbed Darwin who gradually strips away his vanities to find an intensity of observation that borders on the religious. He swims with iguanas; he waits four hours on his knees in the mud to glimpse a sedge wren. He stands perfectly motionless in a forest until a shy bird will, in his words, finally “approach within a few feet, in the most familiar manner.” He lies on his back for an entire hour simply to watch the slow circling of condors. Ultimately Haupt's portrait is of a devastatingly sensitive man who teaches himself to approach the world with a profound humility.
Watch what's going on around you, forget being hungry or wet, and bring all your intelligence to being present. If any time of the year is about throwing open your windows and letting the energy of the world pour over you, it has to be springtime.
Perhaps Darwin's own son said it best: “I used to like to hear him admire the beauty of a flower; it was a kind of gratitude to the flower itself and a personal love for its delicate form and colour. I seem to remember him gently touching a flower he delighted in; it was the same simple admiration that a child might feel.”
Anthony Doerr is the author of “The Shell Collector” and “About Grace.”
The headlines were intriguing. “A Free-for-All on Science and Religion,” wrote the New York Times. “Losing Our Religion: A gathering of scientists and atheists explores whether faith in science can ever substitute for belief in God,” was Newsweek’s version. New Scientist magazine called its article “Beyond Belief—In Place of God: Can secular science ever oust religious belief—and should it even try?”
The reports summarized the highlights of a conference, held Nov. 5-7 at the Salk Institute in La Jolla, Calif., that attracted a large number of very prominent scientists, mostly from the United States and Britain, for a discussion called “Beyond Belief: Science, Religion, Reason and Survival.”
Richard Dawkins, the British evolutionary biologist whose book “The God Delusion” is currently a best seller, was there.
Sam Harris, a doctoral student in neuroscience, also spoke. He is the author of “Letter to a Christian Nation,” another recent best seller, as well as an earlier book, “The End of Faith: Religion, Terror and the Future of Reason.”
Physicist and Nobel laureate Steven Weinberg also spoke, as did Neil deGrasse Tyson, director of the Hayden Planetarium in New York. Carolyn Porco of the Space Science Institute in Boulder, Colo., seemed to be one of the very few women speakers in a conference dominated by white men.
The published accounts mentioned above emphasize that the overwhelming majority of the conferees identified themselves as atheists or non-believers and the speakers posed the issue as a conflict between reason and dogma. But they sharply debated one another on what scientists’ attitude should be toward religion.
If anyone at the conference took a historical materialist view of this question—that is, a Marxist view—the mass media did not report it.
That alone is worthy of note, because for many years a conference in the U.S. that promoted atheism would have been branded “communist” by much of the commercial media. That certainly was the case during the years of the Reagan administration, when the influence of the religious right in politics was very consciously promoted at the same time that a major assault was being made on social programs benefiting the working class.
It was considered a noteworthy break with these political and ideological forces when Nancy Reagan later disagreed publicly with the religious right over the issue of stem-cell research, after her husband was diagnosed with Alzheimer’s disease.
But since the collapse of the USSR, the debate over science and religion has taken a new turn. The prominent speakers at this conference could not be considered leftists by any stretch of the imagination.
What Marx said about religion
When Karl Marx wrote about religion in the mid-19th century, at a time when much of the new ruling bourgeois class in Europe still identified with the Enlightenment as against medieval dogma, he was able to say about the German intellectual establishment that, “[T]he criticism of religion has been essentially completed.”
But he went on to explain why religion continued to have a strong influence among the masses.
“Religious suffering is, at one and the same time, the expression of real suffering and a protest against real suffering. Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people.
“The abolition of religion as the illusory happiness of the people is the demand for their real happiness. ... The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo.” (Karl Marx, “A Contribution to the Critique of Hegel’s Philosophy of Right,” 1844)
Marx’s term “the opium of the people” is often quoted out of context, as though it were nothing but a slur against religion. But here it is obvious that he was referring quite eloquently to how people turn to religion to dull their pain over unbearable social conditions that need to be abolished.
Marxism goes to the heart of the problem. The new capitalist class needed rationalism as against dogma in order to lay the basis for the tremendous scientific-technological development that vastly expanded its means of production and commerce. But capitalism brought with it new horrors for the masses—the conversion of much of the peasantry into wage laborers working 12 to 14 hours a day in the hellish mines and factories.
Thus this new system, which needed rationalism and science in order to grow, at the same time propagated the social conditions that ensured a continued place for religion among the masses. Even today, after several centuries of scientific discoveries that have transformed the way in which every daily task is done—and have brought immense fortunes to those in the ruling class—a large percentage of the people cling to religion as “the heart of a heartless world,” to use Marx’s phrase.
Did the conference in La Jolla look at religion in this social context? Not if the published accounts correctly represent it.
What, then, spurred scientists to organize such a gathering at this time?
One would certainly expect that much of the energy for it came from the need to respond to the increasing efforts by the religious right and certain corporate interests to impose anti-scientific views on society. The attempts to legislate the teaching of “creationism” as opposed to evolution, the opposition to stem-cell research by churches claiming to defend the “unborn,” the denial of global warming by scientists funded by energy companies—all this cries out for a counter-attack by scientists. Undoubtedly, many of the attendees at the conference came because of this political climate.
But there was another and more disturbing motivation, and it was pushed by some of the most prominent speakers.
The Web site edge.org is devoted to scientific discussion. According to a critique of the conference written for Edge by participant Scott Atran, “We first heard from Steven Weinberg, and then from every other second speaker, about the history of Islam, about why Muslim science went into decline after the 13th or 14th centuries, and about why suicide bombers, the most fanatically religious of all would-be mass murderers, are an outgrowth of Islam. Missing at ‘Beyond Belief’ was erudition and deep understanding of Islamic history other than the usual summaries of names and achievements. ...
“We heard from Sam Harris that Muslims represent less than 10 percent of the population in Western European countries such as France, but over 50 percent of the prison population. The obvious inference expected from the audience is that Islam encourages criminal behavior. ...
“Richard Dawkins tells us that Islam oppresses women.”
The New York Times article of Nov. 21 confirmed that Islam-bashing was a strong component of this conference. “By shying away from questioning people’s deeply felt beliefs, even the skeptics, Mr. [Sam] Harris said, are providing safe harbor for ideas that are at best mistaken and at worst dangerous. ‘I don’t know how many more engineers and architects need to fly planes into our buildings before we realize that this is not merely a matter of lack of education or economic despair,’ he said.”
In Harris’s book “Letter to a Christian Nation,” he tries to ingratiate himself with Christians in the United States by saying, “Nonbelievers like myself stand beside you dumbstruck by the Muslim hordes who chant death to whole nations of the living. But we stand dumbstruck by you as well—by your denial of tangible reality, by the suffering you create in service to your religious myths, and by your attachment to an imaginary God.”
Harris says he started writing the book the day after 9/11.
Clearly, the time has not yet come when scientists in the imperialist countries can be expected to organize a truly scientific discussion on religion. That would require an honest, dispassionate view of the world today as it is: divided between the rich and the poor, the oppressor and the oppressed, the imperialist countries and those fighting against efforts to re-colonize them.
Islamic fundamentalism is flourishing among the oppressed as U.S. and British imperialists inflict unspeakable atrocities on the peoples of the Middle East. It cannot be equated with Christian fundamentalism in Western imperialist countries.
What is needed to counteract dogma is not just atheism but a Marxist-Leninist world view that understands religion and all social phenomena in their real context and can apply this to the current period in human history, which is characterized above all by the capitalist division of society into opposing social classes and a world system in which a few imperialist countries super-exploit the majority of the human race. The triumph of “reason” will come when the masses of people overturn this unjust, antiquated social system.