Human life is lived in a middle position between our genetic determinants on the one hand and culture on the other. It's out of that that human freedom emerges. And artistic works, the plays of Shakespeare, the novels of Jane Austen, the works of Wagner and Beethoven, Rembrandt and Hokusai, are among the freest, most human acts ever accomplished. These creations are the ultimate expressions of freedom.
Denis Dutton is a visionary. He was among the first (together with our own
John Brockman) to realize that a website could be a forum for cutting-edge
ideas, not just a way to sell things or entertain the bored. Today Arts and
Letters Daily is the web site that I try the hardest not to visit, because
it is more addictive than crack cocaine. He started one of the first
print-on-demand services for out-of-print scholarly books. He saw that
philosophy and literature had much to say to each other, and started a deep
and lively scholarly journal to move that dialogue along. He saw that
pompous and empty prose in the humanities had become an impediment to
thinking, and initiated the Bad Academic Writing contest to expose it.
And now he is changing the direction of aesthetics. Many people believe that
this consilience between the arts, humanities, and sciences represents the
future of the humanities, revitalizing them with a progressive research
agenda after the disillusionments of postmodernism. Dutton has written the
first draft of this agenda. He has defended a universal definition of art—something that many theorists assumed was simply impossible. And he has
advanced a theory that aesthetics have a universal basis in human
psychology, ultimately to be illuminated by the processes of evolution. His
ideas in this area are not meant to be definitive, but they lay out
testable hypotheses, and point to many fields that can be brought to bear on
our understanding of art.
I see this as part of a larger movement of consilience, in which (to take a
few examples), ideas from auditory cognition will provide insight into
music, phonology will help illuminate poetics, semantics and pragmatics will
advance our understanding of fiction, and moral psychology will be brought
to bear on jurisprudence and philosophy. And in his various roles, Denis
Dutton will be there when it happens.
—Steven Pinker, Johnstone Family Professor, Department of Psychology, Harvard University; Author, The Stuff of Thought.
DENIS DUTTON, a philosopher, is founder and editor of the highly regarded Web publication Arts & Letters Daily (www.aldaily.com). He teaches the philosophy of art at the University of Canterbury, New Zealand, writes widely on aesthetics, is editor of the journal Philosophy and Literature, and is the author of the recently published The Art Instinct: Beauty, Pleasure and Human Evolution.
In this EdgeVideo, evolutionary biologist Armand Leroi reports on his art/science conversation and collaboration with musician Brian Eno, which began when the two sat next to each other at an Edge dinner in London. The dinner discussion began with evolution and music, proceeded to the evolution of music, and led to the following question: has anybody attempted to reconstruct the history of human song? People around the world sing in different ways. Is it possible to retrieve that history? Can we do for songs what we've done for genes, for language?
ARMAND LEROI is a Reader in Evolutionary Developmental Biology at Imperial College, London. He is the author of Mutants: On Genetic Variety and the Human Body, winner of The Guardian First Book Award, 2004.
Writing in Sueddeutsche Zeitung ("Short Answers To Big Questions"), Feuilleton editor Andrian Kreye noted that:
The experiment is not only a continuation of Brockman's and Obrist's own work; it is also a continuation of a movement that began in the '60s on America's East Coast. John Cage brought together young artists and scientists for symposia and seminars to see what would happen in the interaction of big thinkers from different fields. The resulting dialogue, which at the time seemed abstract and esoteric, can today be regarded as the forerunner of interdisciplinary science and digital culture.
"For those seeking substance over sheen, the occasional videos released at Edge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures.
"Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. The decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter."
Mahzarin Banaji, Samuel Barondes, Paul Bloom, Rodney Brooks, Hubert Burda, George Church, Iain Couzin, Helena Cronin, Paul Davies, Daniel C. Dennett, David Deutsch, Jared Diamond, Freeman Dyson, Drew Endy, Peter Galison, Murray Gell-Mann, David Gelernter, Neil Gershenfeld, Anthony Giddens, Gerd Gigerenzer, Daniel Gilbert, Rebecca Goldstein, John Gottman, Brian Greene, Anthony Greenwald, Alan Guth, David Haig, Marc D. Hauser, Walter Isaacson, Daniel Kahneman, Stuart Kauffman, Ken Kesey, Stephen Kosslyn, Lawrence Krauss, Ray Kurzweil, Jaron Lanier, Armand Leroi, Seth Lloyd, Gary Marcus, Ernst Mayr, Marvin Minsky, Sendhil Mullainathan, Dennis Overbye, Dean Ornish, Elaine Pagels, Steven Pinker, Jordan Pollack, Lisa Randall, Martin Rees, Matt Ridley, Lee Smolin, Elisabeth Spelke, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Stephen Schneider, Martin Seligman, Robert Shapiro, Dan Sperber, Paul Steinhardt, Steven Strogatz, Leonard Susskind, Nassim Nicholas Taleb, Richard Thaler, Robert Trivers, Neil Turok, J. Craig Venter, Edward O. Wilson, Richard Wrangham, Philip Zimbardo
Even as mega-banks topple, Juan Enriquez says the big reboot is yet to come. But don't look for it on your ballot -- or in the stock exchange. It'll come from science labs, and it promises keener bodies and minds. Our kids are going to be ... different.
This year's TED Conference, TED 2009, held in Long Beach and curated by Chris Anderson, offered four intense days of interesting presentations of "ideas worth spreading". The "spreading" of these ideas extends far beyond the confines of the conference hall, as Anderson has extended his vision to multiple viewing locations as well as to TED conferences in venues such as India, Africa, Oxford, and Europe. And most importantly, he has tapped into the viral nature of the Internet age with the "TED Talks", videos of the live conference events, which feature superb production quality coupled with elegant web presentation. The combination of interesting speakers, excellent technology and production, and the Internet makes for a rich experience, free for all.
Byars's art and life reflect a sustained, creative engagement with Asian aesthetics and spiritual philosophy. He was introduced to Japan by the artist Morris Graves, and from 1957 to 1967 he lived in Kyoto, the center of traditional Japanese arts and culture, seeking out the study and practice of Zen meditation, Shinto ritual, and classical No dance theater. Byars drew eclectically from No's slow, stylized movement and medieval dramas of the supernatural realm to forge a contemporary performance art that was highly abstract, poetic, and ceremonial. A self-styled Eastern mystic who dressed in all-black or all-gold costumes, Byars identified with Asia's concept of death as a mental state of eternal perfection and self-transcendence, which influenced the material, spectacular quality, and themes of his performance, sculpture, and installation art.
Byars's work explores the phenomenon of presence. He plays between the immediate living moment and an evocation of death as a realm of the eternal. The Death of James Lee Byars (1982/94) was created as the site for a performance based on earlier works exploring the artist's own "departure" from the real world. The installation presents a gold-leafed room where Byars enacted his symbolic death with a glass sarcophagus and five crystals left as a bodily trace. The performance instructions read: "Quietly lie down and quietly get up." This shimmering space invites contemplation of an otherworldly state of being—not just of transcendent death, but of the East, whose grace it conjures.
Like Byars, all the exhibition artists in The Third Mind were born before 1960. For these artists, foreign travel was part escape, part enlightenment, and grounded in an Orientalist tradition that sought self-betterment through the selective appropriation of ideas, practices, relationships, and material artifacts that represented a superior alternative to Europe and America. After 1990, artists traveled less for personal research and far more as participants in the biennials and other international shows that have proliferated around the globe over the last two decades. This development has paralleled globalization and the consequent shift in the nature of how knowledge is transmitted. While earlier generations idealized knowledge and art, contemporary generations value information, culture, and critique. This shift is key to understanding a specific trajectory of American art thought that this exhibition reveals.
Yves Behar, FuseProject; Jeff Bezos, Amazon; Zack Bogue; Stewart Brand, Long Now Foundation; Max Brockman, Brockman, Inc.; Rod Brooks, Roboticist, Heartland Robotics; Geoffrey Carr, The Economist; Steve Case, Revolution Health; Jean Case, Case Foundation; Larry Cohen, Gates Foundation; Keith Coleman, Google G-Mail; Brian Cox, CERN; Daniel C. Dennett, Tufts; Susan Dennett; Peter Diamandis, X-Prize Foundation; Juan Enriquez, Excel Medical Ventures; Tony Fadell, Apple; Peter Gabriel; Bill Gates, Gates Foundation; Saul Griffith, Makani Power; Pati Hillis; Danny Hillis, Applied Minds; Arianna Huffington, Huffington Post; Joi Ito, Creative Commons, Neoteny; Bill Joy, Kleiner Perkins; Dean Kamen, Deka Research; Jon Kamen, Radical Media; Mickey Kaus, Slate; Kevin Kelly, kk.org; Danielle Lambert; Jaron Lanier; Steven Levy, Wired; Katinka Matson, edge.org, Brockman, Inc.; Marissa Mayer, Google; Nathan Myhrvold, Intellectual Ventures; Shannon O'Leary; Tim O'Reilly, O'Reilly Radar; Anne Ornish; Dean Ornish, Preventive Medicine Research Institute; Pierre Omidyar, Omidyar Network; Pam Omidyar, Omidyar Network; Larry Page, Google; Lori Park, Google; Nick Pritzker; Lisa Randall, Harvard; Jacqui Safra; Linda Stone; Yossi Vardi; Evan Williams, Twitter; Nathan Wolfe, Stanford; Richard Saul Wurman, Founder, TED
Many years ago in the midst of the Web 1.0 boom, when working as a reporter for The Wall Street Journal, BoomTown redubbed an annual dinner that book agent John Brockman threw at the TED conference.
It was jokingly called the "Millionaires' Dinner," but I renamed it the "Billionaires' Dinner."
That was due to the frothy fortunes that had been made at the time by the Internet pioneers, from Amazon to AOL to eBay. Get it?!?
Well, despite the economic meltdown, there were still a lot of billionaires in attendance at Brockman's most recent dinner last Thursday in Long Beach. But he recounted to me that the proceedings were a lot more focused on the serious times we are in, as was the whole digerati-packed conference held last week.
Indeed, Brockman now calls the event the "Edge Dinner," after his lively Edge Web site, where he presides over a variety of eclectic online debates and discussions (in January, for example, the topic was: "DOES THE EMPIRICAL NATURE OF SCIENCE CONTRADICT THE REVELATORY NATURE OF FAITH?").
Since I managed to miss the fete entirely (embarrassing confession: I fell dead asleep at 7 p.m. and did not wake until the next morning) and could not chronicle it, Brockman allowed me to post some photos from the event taken by him and by former Microsoft research guru and current intellectual property mogul Nathan Myhrvold.
[ED. NOTE: Edge contributors will be pleased to read about Sara Lippincott in John McPhee's article in the February 9th edition of The New Yorker (see abstract below, from the magazine's Web site). Sara has served as the line editor of all the Edge Annual Question books, turning our lightly edited Web texts into publishable and well-received books. —JB]
Sara Lippincott retired as an editor at this magazine in the early nineteen-nineties, having worked in The New Yorker's fact-checking department from 1966 until 1982. She had a passion for science. In 1973, a long piece of the writer's called "The Curve of Binding Energy" received her full-time attention for three or four weeks and needed every minute of it. Explaining her work to an audience at a journalism school, Sara once said, "Each word in the piece that has even a shred of fact clinging to it is scrutinized, and, if passed, given the checker's imprimatur, which consists of a tiny pencil tick." The writer describes a paragraph from his sixty-thousand-word piece—which was about weapons-grade nuclear material in private industry and what terrorists might do with it—which presented Sara with a certain degree of difficulty. Physicist John A. Wheeler had told the writer about a Japanese weapon balloon landing on a nuclear reactor at the Hanford Engineer Works, in the winter of 1944 or 45. If Wheeler's story were true, it would make it into print. If unverifiable, it would be deleted. Sara's telephone calls ricocheted all over the U.S. Hanford Engineer Works, of the Manhattan Project, was so secret that the Joint Chiefs of Staff didn't know about it. Sara finally located a site manager who confirmed that the balloon had landed on a high-tension line carrying power to the reactor. The fix was made and the piece ran. Sometimes a mistake is introduced during the checking process. This has happened to the writer only once—and nearly thirty years ago. The piece, called "Basin and Range," was the first in a series of long pieces on geology. Mentions current fact-checker Joshua Hersh. Sara, who checked the "Basin" piece, told the writer that he was wrong about the Adriatic Plate, that it is not moving north but southwest. Eldridge Moores had apparently confirmed it. 
After the piece was published, the writer called Moores, who said that it was in fact the Aegean Plate, not the Adriatic, that was moving southwest. Any error is everlasting. Mentions Time and Atlantic. After an error gets into The New Yorker, heat-seeking missiles rise off the earth and home in on the author, the fact-checker, and the editor. In the comfortable knowledge that the fact-checking department is going to sweep up behind him, the writer likes to guess at certain names and numbers early on. Mentions Willy Bemis and the Illinois River. Describes the process of fact-checking a piece the writer wrote in 2003 about tracing John and Henry Thoreau's upstream journey. Mentions Henry Moore's "Oval with Points." The writer describes checking parts of a book he was writing in 2002. The task took him three months. Mentions William Penn, Cotton Mather, and Joseph Seccombe. ...
Lawrence Krauss, Howard Gardner, Lisa Randall, Patrick Bateson, Daniel Everett, Daniel C. Dennett, Lee Smolin, George Dyson, Emanuel Derman, Karl W. Giberson, Kenneth R. Miller, Sam Harris, Steven Pinker, Michael Shermer
We should be and we can be doing a much better job to predict and prevent pandemics. But the really bold idea is that we could reach a point—and this is a distant point in the future—where we become so good at this that we really reach a point where we have the "final plague," and where we are really capable of catching so many of these things that new pandemics become an oddity. I think that is something that we should certainly have as an ideal.
Nathan Wolfe trained at Harvard under Marc Hauser (where he was Hauser's first doctoral student) and Richard Wrangham. "I started working with Richard and thinking about self-medicating behavior of chimpanzees," he says. "Richard encouraged me to understand what the chimps may be treating, and so I started thinking about what are the viruses, what are the microorganisms of chimps that they may be consuming plants in order to treat. Then I never really came back from that."
Subsequently he lived in Malaysia for three years and then in Africa for close to seven years. He describes himself as "a nice Jewish boy from suburban Detroit", which opens up an interesting line of research for Edge scientists, given that our other pandemics expert, Larry Brilliant, Executive Director of Google.org and the man credited with eliminating smallpox, is also "a nice Jewish boy from suburban Detroit". "I'm sure it was some kind of rebellion," Wolfe said, "but I'm not sure what it was. My grandmother, for years, even when I became an assistant professor at Hopkins, said, 'Will this let you go back and get an MD now, Nathan?' Something like that. I do come from that sort of family background, but they just figure it is working out okay. They certainly wish I would make a lot more money. But I told them you were going to help me with that."
NATHAN WOLFE is the Lorry Lokey Visiting Professor of Human Biology at Stanford University and directs the Global Viral Forecasting Initiative (www.gvfi.org). His research combines methods from molecular virology, ecology, evolutionary biology, and anthropology to study the biology of viral emergence.
View the complete 1-hour HD streaming video of the Edge event that took place at Hubert Burda Media's Digital Life Design Conference (DLD) in Munich on January 27th as the greatest living psychologist and the foremost scholar of extreme events discuss hindsight biases, the illusion of patterns, perception of risk, and denial.
DANIEL KAHNEMAN is Eugene Higgins Professor of Psychology, Princeton University, and Professor of Public Affairs, Woodrow Wilson School of Public and International Affairs. He is winner of the 2002 Nobel Prize in Economic Sciences for his pioneering work integrating insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty. NASSIM NICHOLAS TALEB, essayist and former mathematical trader, is Distinguished Professor of Risk Engineering at New York University's Polytechnic Institute. He is the author of Fooled by Randomness and the international bestseller The Black Swan.
To blame for the financial crisis is the nature of man, say two renowned scientists: Nobel Prize winner Daniel Kahneman and bestselling author Nassim Taleb ("The Black Swan").
By Ansgar Siemens, FOCUS online editor
Two men sit on the stage. On the left: Daniel Kahneman, 74, bright-eyed, Nobel Prize winner. On the right: Nassim Taleb, 49, former Wall Street banker, best-selling author. Both speak at the Digital Life Design Conference (DLD) in Munich about the financial crisis and how it began--mainly, they talk about people. They say it is due to human nature that the crisis broke out. And they choose harsh words in discussing the scale of the disaster.
Kahneman explains why there are bubbles in the financial markets, even though everyone knows that they eventually burst. The researcher uses a comparison with the weather: if there is little rain for three years, people begin to believe that this is the normal situation. If over the years stocks only increase, people can't imagine a break in the trend.
"Those responsible must go--today and not tomorrow"
Taleb speaks out sharply against the bankers, who are spending billions of dollars of taxpayers' money. "I want those responsible for the crisis gone today, today and not tomorrow," he says, leaning forward vigorously. The risk models of the banks are a plague, he says; the bankers are charlatans.
It is nonsense, he argues, to think that we can assess risks and thus protect against a crash. Taleb became famous with the theory of the black swan, described in his eponymous bestseller: black swans are events that cannot be foreseen--not even with the best model. "People will never be able to control coincidence," he says.
The early warning
Taleb had an early warning before the crisis. In 2003 he took note of the balance sheet of the U.S. mortgage-finance giant Fannie Mae, and he saw "dynamite".
In autumn last year, the U.S. government instituted a dramatic bailout. Taleb said in the Sunday Times in 2008: "Bankers are very dangerous." And even now he sees a scandal: he provocatively asks what the banks have done with the government bailout money. "They have paid out more bonuses, and they have increased their risks." And it was not their own money.
Taleb calls for rigorous changes: nationalize the banks--and abolish financial models. Kahneman does not quite agree with him. Certainly, the models are not capable of predicting a collapse. But one should not ignore human nature: people will always require and use models and benefit from them--even if the models are wrong.
...Across the world, people believe that devotion to sacred or core values that incorporate moral beliefs — like the welfare of family and country, or commitment to religion and honor — are, or ought to be, absolute and inviolable. Our studies, carried out with the support of the National Science Foundation and the Defense Department, suggest that people will reject material compensation for dropping their commitment to sacred values and will defend those values regardless of the costs.
In our research, we surveyed nearly 4,000 Palestinians and Israelis from 2004 to 2008, questioning citizens across the political spectrum including refugees, supporters of Hamas and Israeli settlers in the West Bank. We asked them to react to hypothetical but realistic compromises in which their side would be required to give away something it valued in return for a lasting peace.
SCOTT ATRAN, an anthropologist at the National Center for Scientific Research in Paris, John Jay College and the University of Michigan at Ann Arbor, is the author of In Gods We Trust.
JEREMY GINGES is a professor of psychology at the New School for Social Research.
...On Tuesday, Chief Justice John Roberts joined the Flubber Hall of Fame when he administered the presidential oath of office apparently without notes. Instead of having Barack Obama "solemnly swear that I will faithfully execute the office of president of the United States," Chief Justice Roberts had him "solemnly swear that I will execute the office of president to the United States faithfully." When Mr. Obama paused after "execute," the chief justice prompted him to continue with "faithfully the office of president of the United States." (To ensure that the president was properly sworn in, the chief justice re-administered the oath Wednesday evening.)
How could a famous stickler for grammar have bungled that 35-word passage, among the best-known words in the Constitution? Conspiracy theorists and connoisseurs of Freudian slips have surmised that it was unconscious retaliation for Senator Obama's vote against the chief justice's confirmation in 2005. But a simpler explanation is that the wayward adverb in the passage is blowback from Chief Justice Roberts's habit of grammatical niggling. ...
STEVEN PINKER is Johnstone Family Professor, Department of Psychology, Harvard University; author, The Language Instinct and The Stuff of Thought; and chairman of the usage panel of The American Heritage Dictionary.
"The real question," writes biologist Jerry Coyne in his New Republic article "Seeing And Believing", is whether there is a philosophical incompatibility between religion and science. Does the empirical nature of science contradict the revelatory nature of faith? Are the gaps between them so great that the two institutions must be considered essentially antagonistic?
We no longer have President George W. Bush, Senate Majority Leader Bill Frist, and Senator John McCain announcing in August 2006 their support for teaching Intelligent Design in public schools. That was a mobilizing moment for the champions of rational thinking such as Coyne, Richard Dawkins, Daniel C. Dennett, Sam Harris, Christopher Hitchens, and P.Z. Myers to mount an unrelenting campaign against superstition, supernaturalism, and ignorance. The dilemma, as Coyne notes, is that against the backdrop of scientific knowledge available to us today, these three words are applicable not only to the texts that inform literal fundamentalists but also to the rarefied theological mumbo-jumbo of the most refined, liberal theologians.
On inauguration day, President Obama announced the goal of "restoring science to its rightful place" while, in the same speech, acknowledging that nonbelievers are citizens of this nation in the same way as followers of religion. In light of the growing tendency of scientists to speak out about their lack of faith, isn't it now time to ask a few questions? Is "belief in belief" as defined by Dennett a good thing? Is there merit in the late Stephen Jay Gould's assertion that religion and science form "non-overlapping magisteria" (NOMA) which address two independent ways of arriving at truth? Isn't it now time for an honest discussion about whether science and belief are indeed compatible?
But as Coyne points out:
Would that it were that easy! True, there are religious scientists and Darwinian churchgoers. But this does not mean that faith and science are compatible, except in the trivial sense that both attitudes can be simultaneously embraced by a single human mind. (It is like saying that marriage and adultery are compatible because some married people are adulterers.) It is also true that some of the tensions disappear when the literal reading of the Bible is renounced, as it is by all but the most primitive of Judeo-Christian sensibilities. But tension remains. The real question is whether there is a philosophical incompatibility between religion and science. Does the empirical nature of science contradict the revelatory nature of faith? Are the gaps between them so great that the two institutions must be considered essentially antagonistic? The incessant stream of books dealing with this question suggests that the answer is not straightforward.
In the next few days, Edge plans to publish a series of brief responses by selected contributors addressing these issues.
Saving Darwin: How to be a Christian and Believe in Evolution
By Karl W. Giberson
(HarperOne, 248 pp., $24.95)
Only A Theory: Evolution and the Battle for America's Soul
By Kenneth R. Miller
(Viking, 244 pp., $25.95)
...Unfortunately, some theologians with a deistic bent seem to think that they speak for all the faithful. These were the critics who denounced Dawkins and his colleagues for not grappling with every subtle theological argument for the existence of God, for not steeping themselves in the complex history of theology. Dawkins in particular was attacked for writing The God Delusion as a "middlebrow" book. But that misses the point. He did indeed produce a middlebrow book, but precisely because he was discussing religion as it is lived and practiced by real people. The reason that many liberal theologians see religion and evolution as harmonious is that they espouse a theology not only alien but unrecognizable as religion to most Americans.
Statistics support this incompatibility. For example, among thirty-four countries surveyed, we see a statistically strong negative relationship between the degree of faith and the acceptance of evolution. Countries such as Denmark, France, Japan and the United Kingdom have a high acceptance of Darwinism and low belief in God, while the situation is reversed in countries like Bulgaria, Latvia, Turkey, and the United States. And within America, scientists as a group are considerably less religious than non-scientists. This is not to say that such statistics can determine the outcome of a philosophical debate. Nor does it matter whether these statistics mean that accepting science erodes religious faith, or that having faith erodes acceptance of science. (Both processes must surely occur.) What they do show, though, is that people have trouble accepting both at the same time. And given the substance of these respective worldviews, this is no surprise.
This disharmony is a dirty little secret in scientific circles. It is in our personal and professional interest to proclaim that science and religion are perfectly harmonious. After all, we want our grants funded by the government, and our schoolchildren exposed to real science instead of creationism. Liberal religious people have been important allies in our struggle against creationism, and it is not pleasant to alienate them by declaring how we feel. This is why, as a tactical matter, groups such as the National Academy of Sciences claim that religion and science do not conflict. But their main evidence--the existence of religious scientists--is wearing thin as scientists grow ever more vociferous about their lack of faith. Now Darwin Year is upon us, and we can expect more books like those by Kenneth Miller and Karl Giberson. Attempts to reconcile God and evolution keep rolling off the intellectual assembly line. It never stops, because the reconciliation never works.
[Forthcoming, January 9, 2009]
Contributors include: STEVEN PINKER on the future of human evolution • RICHARD DAWKINS on the mysteries of courtship • SAM HARRIS on why Mother Nature is not our friend • NASSIM NICHOLAS TALEB on the irrelevance of probability • ALUN ANDERSON on the reality of global warming • ALAN ALDA considers, reconsiders, and re-reconsiders God • LISA RANDALL on the secrets of the Sun • RAY KURZWEIL on the possibility of extraterrestrial life • BRIAN ENO on what it means to be a "revolutionary" • HELEN FISHER on love, fidelity, and the viability of marriage…and many others.
Praise for the online publication of What Have You Changed Your Mind About?
"…splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The …
"…great event in the Anglo-Saxon culture." El …
"…fascinating and weighty as one would imagine." The …
"…are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian
"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The …
"…ideas put forward today by leading figures." The …
"…world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now." San …
"…in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer
"…jolt of fresh thinking... The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake — bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail
"…ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The …
"…an exceptionally high quotient of interesting ideas to words, this is hard to beat. ... What a feast of egg-head opinionating!" National …
"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times
"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism – its opposite – it serves a function in science that is playful as well as thought-provoking. Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." The Times
"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer
"…unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle — a book to be dog-eared and debated." Seed
"…pipedreams at their very best." The …
"Makes for some astounding reading." Boston Globe
"…stimulating... It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC
"…and creative magnificence" The …
March 2, 2009
THE EVOLUTION OF ART
James Q. Wilson
Art suffuses our lives. Whether it's bluegrass, heavy metal, Frank Sinatra or Mozart, music moves us all. On a trip to a foreign city, visiting an art museum is a mandatory exercise. Imaginative writing affects many of us, though—alas—with decreasing frequency.
Why should art be important? Being seen as an "art lover" may increase our status, but otherwise art is not useful. Yet art has been part of the human experience since Paleolithic man painted on the walls of caves in Lascaux, France, and Altamira, Spain, more than 30,000 years ago. Art preceded cities, agriculture and writing.
Denis Dutton, an art professor in New Zealand, has proposed a bold new explanation. He argues that humankind's universal interest in art is the result of human evolution. We enjoy sex, grasp facial expressions, understand logic and spontaneously acquire language—all of which make it easier for us to survive and produce children. In "The Art Instinct: Beauty, Pleasure, and Human Evolution," Dutton contends that an interest in art belongs on this list of evolutionary adaptations.
In making his case, Dutton has to refute the late Stephen Jay Gould's argument that human culture is a socially formed byproduct of our large brains. Dutton easily overcomes this argument by pointing out how many "byproducts"—such as a spoken language—have given humans a huge evolutionary gain. But he must still explain why an interest in art gives us an edge. This is no easy task. Just because many people have a trait does not mean that it confers an evolutionary advantage. I like the Boston Red Sox, but I doubt that preference was genetically passed on to my children. (Happily, they became Sox fans anyway.)...
...Evolution has, without any doubt, left people with an appreciation for both natural and man-made beauty, but sexual selection explains, I think, only a small part of the reason. But read Dutton's book: his masterful knowledge of art and his compelling prose make it a thing of beauty.
You are what you eat, or so the saying goes. But Richard Wrangham, of Harvard University, believes that this is true in a more profound sense than the one implied by the old proverb. It is not just you who are what you eat, but the entire human species. And with Homo sapiens, what makes the species unique in Dr Wrangham’s opinion is that its food is so often cooked.
Cooking is a human universal. No society is without it. No one other than a few faddists tries to survive on raw food alone. And the consumption of a cooked meal in the evening, usually in the company of family and friends, is normal in every known society. Moreover, without cooking, the human brain (which consumes 20-25% of the body’s energy) could not keep running. Dr Wrangham thus believes that cooking and humanity are coeval.
In fact, as he outlined to the American Association for the Advancement of Science (AAAS), in Chicago, he thinks that cooking and other forms of preparing food are humanity’s “killer app”: the evolutionary change that underpins all of the other—and subsequent—changes that have made people such unusual animals.
Humans became human, as it were, with the emergence 1.8m years ago of a species called Homo erectus. This had a skeleton much like modern man’s—a big, brain-filled skull and a narrow pelvis and rib cage, which imply a small abdomen and thus a small gut. Hitherto, the explanation for this shift from the smaller skulls and wider pelvises of man’s apelike ancestors has been a shift from a vegetable-based diet to a meat-based one. Meat has more calories than plant matter, the theory went. A smaller gut could therefore support a larger brain.
Dr Wrangham disagrees. When you do the sums, he argues, raw meat is still insufficient to bridge the gap. He points out that even modern “raw foodists”, members of a town-dwelling, back-to-nature social movement, struggle to maintain their weight—and they have access to animals and plants that have been bred for the table. Pre-agricultural man confined to raw food would have starved. ...
...Human beings are supremely social animals. We recognise people and judge their feelings and intentions from their expressions and actions. Our thoughts about ourselves, and the words we use to describe those thoughts, are infused with wishes and wants. We feel that we are the helmsmen of our actions, free to choose, even to sin.
But increasingly, those who study the human brain see our experiences, even of our own intentions, as being an illusory commentary on what our brains have already decided to do.
Perhaps we humans come with a false model of ourselves, which works well as a means of predicting the behaviour of other people - a belief that actions are the result of conscious intentions. Then could the pervasive human belief in supernatural forces and spiritual agents, controlling the physical world, and influencing our moral judgments, be an extension of that false logic, a misconception no more significant than a visual illusion?
I'm dubious about those "why" questions: why are we here? Why do we have a sense of right and wrong? Either they make no sense or they can be recast as the kind of "how" questions that science answers so well.
When we understand how our brains generate religious ideas, and what the Darwinian adaptive value of such brain processes is, what will be left for religion?
...What do you make of that? The Origin of Species was published 150 years ago. Why is the debate still ongoing?
Well, it's not happening in many other countries. I say in the book that of 34 industrialized countries in the world that were surveyed, we ranked 33rd in accepting evolution, just above Turkey. In Europe acceptance of evolution is very high. There's no doubt that it's because of the pervasiveness of religion in the United States, and fundamentalist religion. That's the reason why the opposition persists and will keep persisting.
Some creationists seem to feel that it's the scientists who are being dogmatic here—that you're somehow invested in this idea or want it to be true, or that your training has blinded you to other possibilities. How do you respond to that?
I think they're the ones who are dogmatic, because the difference between religion and science, which is the difference between religion and evolution, is that we question things. Nobody worships Darwin as a religion. We don't adhere to a set of dogmas that are unchanging and unquestionable. We all recognize that Darwin was wrong about a lot of stuff. His theories of genetics were wrong, his theories of biogeography were wrong—that's been corrected by plate tectonics—his stuff on sexual selection is very good but not complete. Evolutionary biology is constantly changing and revising its conclusions. But the main conclusions that Darwin made—that evolution occurred, that it occurred through natural selection, that there were common ancestry and splitting and that it happened slowly—those have all been supported. We accept those things because mountains of evidence have shown them to be true. They've been subsumed in what we call neo-Darwinism or modern evolutionary theory. There's a lot of stuff that Darwin said and that other early evolutionists said that is wrong, so we're constantly revising and changing our stuff. It's just that Darwin happened to be right on the main points of the theory. We're not dogmatic about it. I might still be willing to give up my idea that evolution occurred if we got certain evidence from the fossil record, but we haven't gotten it. Whereas there's no observation that will make a religious person give up [his beliefs]. I say in the New Republic article that if the Holocaust didn't do that, then nothing ever will. That's the ultimate argument against belief in at least a certain kind of god. ...
What on earth were you thinking when you produced a garish cover proclaiming that "Darwin was wrong" (24 January)?
First, it's false, and second, it's inflammatory. And, as you surely know, many readers will interpret the cover not as being about Darwin, the historical figure, but about evolution.
Nothing in the article showed that the concept of the tree of life is unsound; only that it is more complicated than was realised before the advent of molecular genetics. It is still true that all of life arose from "a few forms or... one", as Darwin concluded in The Origin of Species. It is still true that it diversified by descent with modification via natural selection and other factors.
Of course there's a tree; it's just more of a banyan than an oak at its single-celled-organism base. The problem of horizontal gene-transfer in most non-bacterial species is not serious enough to obscure the branches we find by sequencing their DNA.
The accompanying editorial makes it clear that you knew perfectly well that your cover was handing the creationists a golden opportunity to mislead school boards, students and the general public about the status of evolutionary biology. Indeed, within hours of publication members of the Texas State Board of Education were citing the article as evidence that teachers needed to teach creationist-inspired "weaknesses of evolution", claiming: "Darwin's tree of life is wrong".
You have made a lot of extra, unpleasant work for the scientists whose work you should be explaining to the general public. We all now have to try to correct all the misapprehensions your cover has engendered.
...The somewhat aging enfant terrible Christopher Hitchens, author of an oddly dyspeptic attack on Mother Teresa ("The Missionary Position") and the recent bestseller "God Is Not Great: How Religion Poisons Everything," is simply the most public face of American atheism. Also on the bestseller list in the past have been Sam Harris's "Letter to a Christian Nation" and Richard Dawkins's "The God Delusion." And now, behind the scenes, groups like American Atheists, the Freedom From Religion Foundation and the Council for Secular Humanism have been busy publishing journals, funding college scholarships and establishing Web sites. ...
...So far, American atheists have no figurehead with the brilliance or literary and scientific prizes of Britain's Mr. Dawkins, the recently retired Simonyi Professor of the Public Understanding of Science at Oxford, where Balliol College named one of its most prestigious awards after him. Even so, these new American atheists are far better advocates for their cause than the dysfunctional O'Hare clan. Now that they have broken the ice, in fact, we should only hope that even more thoughtful atheists will follow them into the pool. ...
THE NEW YORKER
FEBRUARY 12, 2009
ABSTRACT: PROFILE of novelist Ian McEwan. Writer accompanies Ian McEwan on a hike near McEwan's country home in Buckinghamshire and notes that McEwan punctuates his observations about human nature with references to scientific studies and publications. McEwan's interest in science isn't antiseptic; it sets his mind at play. His empirical temperament distinguishes him from his friends Martin Amis, Salman Rushdie, and Julian Barnes. Tells about McEwan's love of walking and hiking. He and his wife recently traveled to the Himalayas and New Zealand. McEwan was spending much of his summer in Buckinghamshire, trying to settle into a new novel. McEwan is a connoisseur of dread. His success has a lot to do with his talent for creating suspense. His plots defy what he calls the "dead hand of modernism." Mentions his novels "Black Dogs," "The Comfort of Strangers," "The Child in Time," and "The Innocent." Some critics have disparaged McEwan as a hack with elegant prose. McEwan is insistent that something stirring should happen in a novel. His novel-in-progress, which is about global warming, was inspired by a hiking trip he took along a fjord in Spitsbergen, a group of Norwegian islands. He started work on the new book, as yet untitled, in December, 2007. He promised that it would not be didactic. Tells about the book's protagonist, Michael Beard, and McEwan's research into climate change and solar technologies. Writer visits McEwan at his London home in Fitzroy Square. Mentions Neil Kitchen, the brain surgeon whom McEwan shadowed while writing his novel "Saturday." ...
...First idea: use all five moral senses. A scientific consensus is emerging that human moral psychology was shaped by multiple evolutionary forces and that our minds therefore detect many—sometimes conflicting—properties of social situations. The two best studied moral senses pertain to harm (including our capacities for sympathy and nurturing) and fairness (including anger at injustice). You can travel the world but you won't find a human culture that doesn't notice and care about harm and fairness.
Political conservatives in the US, Britain and many other nations value three additional sets of moral concerns. Like liberals, they care about harm and fairness, but they care more than liberals about loyalty to the in-group (which political party cares most about flags and borders?), authority (which side demands respect for parents and teachers?) and spiritual purity (which side most wants to restrict homosexuality and drug use?). It's as though conservatives can hear five octaves of music, but liberals respond to just two, within which they have become particularly discerning. (My research colleagues and I have not just plucked these "senses" from the air; they emerged from a review of both evolutionary and anthropological theory, and were tested in internet surveys, face-to-face interviews and even in the decoding of religious sermons.)
This hypothesis doesn't mean that liberals are wrong or defective, but it does mean that they often have more trouble understanding conservatives than vice versa. Liberals tend to relate most moral issues to potential harms and injustices. They therefore can't understand why anyone—including the majority of Americans—would oppose gay marriage, for example, because legalising gay marriage would hurt nobody and end an injustice. Arguments about the sanctity of marriage or the authority of tradition sound like empty words sent out to cover irrational homophobia. But the culture war is not primarily a disagreement about what's harmful or fair; it is better described as a battle between two visions of the ideal society, one that is designed to appeal to two moral senses, the other designed to appeal to five. ...
THE history of science could have been so different. When Charles Darwin applied to be the "energetic young man" that Robert Fitzroy, the Beagle's captain, sought as his gentleman companion, he was almost let down by a woeful shortcoming that was as plain as the nose on his face. Fitzroy believed in physiognomy - the idea that you can tell a person's character from their appearance. As Darwin's daughter Henrietta later recalled, Fitzroy had "made up his mind that no man with such a nose could have energy". Fortunately, the rest of Darwin's visage compensated for his sluggardly proboscis: "His brow saved him."
The idea that a person's character can be glimpsed in their face dates back to the ancient Greeks. It was most famously popularised in the late 18th century by the Swiss poet Johann Lavater, whose ideas became a talking point in intellectual circles. In Darwin's day, they were more or less taken as given. It was only after the subject became associated with phrenology, which fell into disrepute in the late 19th century, that physiognomy was written off as pseudoscience.
Now the field is undergoing something of a revival. Researchers around the world are re-evaluating what we see in a face, investigating whether it can give us a glimpse of someone's personality or even help to shape their destiny. What is emerging is a "new physiognomy" which is more subtle but no less fascinating than its old incarnation. ...
February 11, 2009
THE MASTERLY BLASPHEMER
The John Updike opus is so vast, so varied and rich that we will not have its full measure for years to come, writes Man Booker-prizewinning novelist Ian McEwan
A "BIG-BELLIED Lutheran God" within the young Updike looked on in contempt as he struggled to give up cigarettes.
...This most Lutheran of writers, driven by intellectual curiosity all his life, was troubled by science as others are troubled by God. When it suited him, he could easily absorb and be impressed by physics, biology, astronomy, but he was constitutionally unable to "make the leap of unfaith". The "weight" of personal death did not allow it, and much seriousness and dark humour derives from this tension between intellectual reach and metaphysical dread.
In a short story from 1984, The Wallet, Mr Fulham (who, we are told in the first line, "had assembled a nice life") experiences death terrors when he takes his grandchildren to a local cinema. While "starships did special-effects battle" Fulham's "true situation in time and space" was revealed: "a speck of consciousness now into its seventh decade, a mortal body poised to rejoin the minerals, a member of a lost civilisation that once existed on a sliding continent". This "lonely possession" of his own existence, he concludes, is "sickeningly serious".
God makes no appearance in this story, but it is unlikely that an atheist could have conjured so much from the minor domestic disturbance that follows. First, a large cheque "in the low six figures", a return on canny investments, fails to show up in the post. Fulham makes many phone calls to the company in Houston, the matter begins to loom too large -- "He slept poorly, agitated by the injustice of it." He suspects a thief, a "perpetrator", or there is a flaw in the mindless system. He is tormented by "outrageous cosmic unanswerableness". ...
February 14, 2009 Who Says Stress Is Bad For You?
It can be, but it can be good for you, too—a fact scientists tend to ignore and regular folks don't appreciate.
By Mary Carmichael
...More recently, Robert Sapolsky of Stanford University has studied a similar phenomenon in alpha males. He's seen plenty of "totally insane son of a bitch" types who respond to stress by lashing out, but he's also interested in another type that gets less press: the nice guy who finishes first. These alphas don't often get into fights; when they do, they pick battles they know they can win. They're just as dominant as their angry counterparts, and they're subject to the same stressors—power struggles, unsuccessful sexual overtures, the occasional need to slap down a subordinate—but their hormone levels never get out of whack for long, and they probably don't suffer much stress-induced brain dysfunction. Sapolsky likes to joke that they've all been relaxing in hot tubs in Big Sur, transforming themselves into "minimalist Zen masters." This is a joke because they've clearly come by their attitudes unconsciously: Sapolsky studies wild baboons...
...As Maddi's work makes clear, a lot of the explanation stems from early experiences. This may be true of Sapolsky's baboons as well. Sapolsky suspects that part of what makes an animal a dominant Zen master instead of an angry alpha lies in what sort of childhood he had. If an adult baboon picks up on conflict around him but keeps his cool, "quelling the anxiety and exercising impulse control," that may be behavior his mom modeled for him years earlier. The key? Factors such as how many steps the baby baboon could take away from his mother before she pulled him back—i.e., how much she allowed him to learn for himself, even if that meant a few bumps and bruises along the way. "I think the males who had mothers who were less anxious, who allowed them to be more exploratory in the absence of agitated maternal worry, are more likely to be the Zen ones who are calm enough to resist provocation," he says. A little properly handled stress, then, may be necessary to turn children into well-adjusted adults....
The theory of evolution by natural selection has prospered in its first 150 years and provides a consistent account of species as highly adapted and rare survivors in the struggle for existence. It now faces the challenge of finding order in the evolution of complex systems, including human society.
...The question of whose interests are served is sharpened once natural selection is allowed to venture into the realms of cultural and societal evolution. The big complex adaptive system that is human society is leaky: there are many different independent replicators — both biological individuals and cultural elements — each potentially with its own strategies for survival and reproduction. Should human society be viewed as a vehicle for the combined, cumulative effects of these replicators, rather than as a replicating system in its own right? If so, what rules govern which vehicles are successful, and do they bear any relationship to those for biological phenotypes?
There is a growing sense, for example, that human languages have adapted to human minds. Humans have domesticated languages: languages show features related to how they are used and to society, and this probably enhances their survival. Language might also to some degree have domesticated humans. It might have a regulatory role in human society not unlike that of gene regulation, and this may have enhanced human survival. Much the same could be said about the interactions between humans and the varieties of religion, art and music, topics that interested Darwin. The ability of natural selection to keep up with the times as more and more questions are asked shows that, far from being old at 150, Darwin's theory still has a spring in its step. ...
February 12, 2009
Darwin 200: Should scientists study race and IQ?
In the first of two opposing commentaries, Steven Rose argues that studies investigating possible links between race, gender and intelligence do no good. In the second, Stephen Ceci and Wendy M. Williams argue that such research is both morally defensible and important for the pursuit of truth.
Steven Rose: In a society in which racism and sexism were absent, the questions of whether whites or men are more or less intelligent than blacks or women would not merely be meaningless — they would not even be asked. The problem is not that knowledge of such group intelligence differences is too dangerous, but rather that there is no valid knowledge to be found in this area at all. It's just ideology masquerading as science.
Stephen Ceci & Wendy M. Williams: When scientists are silenced by colleagues, administrators, editors and funders who think that simply asking certain questions is inappropriate, the process begins to resemble religion rather than science. Under such a regime, we risk losing a generation of desperately needed research.
People's mindsets are neither fixed by evolution nor infinitely malleable by culture. Dan Jones looks for the similarities that underlie the diversity of human nature.
...Ideas with a distinct Chomskyan flavour have been a stimulus to recent thinking about morality. What counts as a moral transgression, and how one should react to the transgressor, vary from culture to culture. But deeper patterns seem to lurk beneath this surface diversity.
Following Chomsky's lead, a number of researchers are working on the idea that an innate and universal moral grammar might underlie human ethical judgements. A series of web-based studies led by Marc Hauser of Harvard University have suggested that moral judgements can be explained in terms of such universal and fundamental moral principles. Harm caused by direct physical contact, for instance, is generally deemed to be morally worse than harm arising as a side effect, as are harms caused by specific actions rather than omissions.
But these are early days in fleshing out the tool kit of putative moral principles and parameters. "By the time Chomsky started his work in the 1950s he already had a massive amount of descriptive linguistics from all over the world to play with," says Hauser. "In the case of morality, we don't have anything like what the linguists had 50 years ago. We don't know whether the distinctions we're making are at the right level of abstraction, or whether they are principles or parameters."...
...A bird's-eye perspective on moral diversity and uniformity comes from psychologists Jonathan Haidt, of the University of Virginia in Charlottesville and Craig Joseph, of Northwestern University in Evanston, Illinois. Surveying anthropology and evolutionary psychology, they argue that evolution has built into the human mind a preparedness to care about five sets of social issues: fairness and justice; avoiding harm to and caring for others; in-group loyalty; social hierarchy and respect for authority; and the domain of divinity and purity, both bodily and spiritual.
"Morality is a social construction, but each society constructs it on top of these five innate moral foundations, relying on them to varying degrees," says Haidt. "Some moralities, such as those of secular Europe, rest primarily on the first two, prizing concerns about harm and fairness above all else; other cultures, such as those of traditional India, emphasize fairness less, and the virtues of respect and spiritual purity more." ...
...The Guardian reported that, in February 2006, "Muslim medical students in London distributed leaflets that dismissed Darwin's theories as false". The Muslim leaflets were produced by the Al-Nasr Trust, a registered charity with tax-free status. The British taxpayer, that is to say, is subsidizing the systematic distribution of scientific falsehood to educational institutions. Science teachers across Britain will confirm that they are coming under slight, but growing, pressure from creationist lobbies, usually inspired by American or Islamic sources.
So, let nobody have the gall to deny that Coyne's book is necessary. Not just his book, and here I must declare an interest. February 12, 2009, was Charles Darwin's 200th birthday, and the 150th anniversary of The Origin of Species falls this autumn. Publishers being as anniversary-minded as they are, Darwin-related books were obviously to be expected this year. Nevertheless, it is true to say that neither Jerry Coyne nor I was aware of the other's book on the evidence for evolution when we began our own – his published now, mine in the autumn. And our two books may not be the only ones. Bring them on, I say. The more the merrier. The evidence is massive, the modern version of the story would surprise and inspire even Darwin, and it cannot be told too often. Evolution is, after all, the true story of why we all exist, and an exhilaratingly powerful and satisfying explanation. It supersedes – and devastates – all predecessors, no matter how devoutly and sincerely believed.
Why Evolution Is True is outstandingly good. Coyne's knowledge of evolutionary biology is prodigious, his deployment of it as masterful as his touch is light. His coverage is enviably comprehensive, yet he simultaneously manages to keep the book compact and readable. His nine chapters include "Written in the Rocks", laced with examples that make short work of the most popular of all creationist lies, the one about unbridgeable "gaps" in the fossil record: "Show me your intermediates!", say the creationists. Jerry Coyne shows them, and very numerous and convincing they are. Not just fossils of large charismatic animals like whales and birds, and the coelacanth-cousins that made the transition from water to land, but also microfossils. These have the advantage of sheer numbers: some kinds of sedimentary rock are almost entirely made of the tiny fossilized skeletons of foraminiferans, radiolarians and other calcareous or siliceous protozoa. This means you can plot a sensitive graph of some chosen measurement, as a continuous function of geological time, while you systematically work your way through a core of sediments. One of Coyne's graphs shows a genus of radiolarians (beautiful protozoans with minute, lantern-like shells) caught in the act, two million years ago, of "speciating" – splitting into two species. ...
...The most compelling rebuttal of the rational model, paradoxically, was delivered by the ultimate rationalist, Alan Greenspan. "I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders," the former Fed chairman told Congress last October.
That's why Greenspan didn't see it coming, argues Daniel Kahneman, a Princeton professor who is often described as the father of behavioral economics. His rational-actor model wouldn't let him.
Let me put in a plug here for the godfather of behavioral economics, John Maynard Keynes. His 1936 "General Theory" is often interpreted simplistically as a call for fixing recessions by boosting demand with government spending. But at a deeper level, Keynes was analyzing the role of psychological factors, such as greed and fear, in economic decisions. He understood that markets freeze when people panic and start hoarding cash. ("Extreme liquidity preference," he called it.) Conversely, economies start to roar when investors feel a surge of what Keynes called "animal spirits."
One of the most powerful ideas I heard at Davos was the idea of "pre-mortem" analysis, which was first proposed by psychologist Gary Klein and has been taken up by Kahneman.
A pre-mortem analysis can provide a real "stress test" to conventional thinking. Let's say that a company or government agency has decided on a plan of action. But before implementing it, the boss asks people to assume that five years from now, the plan has failed -- and then to write a brief explanation of why it didn't work. This approach stands a chance of bringing to the surface problems that the decision makers had overlooked -- the "black swans," to use former trader Nassim Nicholas Taleb's phrase, that people assumed wouldn't happen in the near future because they hadn't occurred in the recent past. ...
THE flies in the men's-room urinals of the Amsterdam airport have been enshrined in the academic literature on economics and psychology. The flies — images of flies, actually — were etched in the porcelain near the urinal drains in an experiment in human behavior.
After the flies were added, "spillage" on the men's-room floor fell by 80 percent. "Men evidently like to aim at targets," said Richard Thaler of the University of Chicago, an irreverent pioneer in the increasingly influential field of behavioral economics.
Mr. Thaler says the flies are his favorite example of a "nudge" — a harmless bit of engineering that manages to "attract people's attention and alter their behavior in a positive way, without actually requiring anyone to do anything at all." What's more, he said, "The flies are fun."
...Nudging derives from research by Daniel Kahneman, a Nobel laureate in economics; by Mr. Kahneman's late colleague, Amos Tversky; and by Mr. Thaler and others over several decades. Mr. Kahneman, a psychologist, gives Mr. Thaler considerable credit for the birth of behavioral economics.
Mr. Thaler has found that people often don't act rationally and in their own best interests, as is assumed by traditional economic models. He calls such idealized people "Econs," as distinguished from "Humans." Econs are walking computers, and behave according to the laws of classical economics; Humans are quirky, like the people you meet on the street. Humans may know that they should eat less and exercise more, but they often miss the mark. They may know that they should save more, but often don't. And so, Mr. Thaler says, most of us would benefit from a nudge.
...James Heckman, a Nobel Prize-winning economist at the University of Chicago, has estimated that for every dollar spent on a prekindergarten like Perry, $8 has been gained in higher incomes for participants and in savings on the costs of extra schooling, crime and welfare.
Similarly, a program called KIPP (for Knowledge Is Power Program) is having remarkable success with poor minority children in middle schools. KIPP students attend school from 7:30 a.m. to 5 p.m., their term is three weeks longer than normal, and every other Saturday they have classes for half a day. The curriculum includes sports, visits to museums and instruction in dance, art, music, theater and photography. During one academic year, the percentage of fifth-graders at KIPP schools in the San Francisco Bay Area who scored at or above the national average on the reading portion of the Stanford Achievement Test rose to 44 percent from 25 percent. And while only 37 percent started the year at or above the national average in math, 65 percent reached that level by spring.
Such creative programs must be tested to ensure that they work as they are meant to. The United States Department of Education's What Works Clearinghouse, which was established by the Bush administration, has the job of making public all significant evaluations of educational interventions. The Obama administration should heed the Clearinghouse's reports. Stimulus money should be spent only on programs that work well — and on creating new programs, which in turn should be properly tested for effectiveness.
President Obama is in a position to not only inspire black youngsters by his example, but also make an enormous difference in their schooling — as long as he supports successful educational interventions, from the smallest to the most ambitious. ...
Bank nationalizations are "absolutely necessary" to stop them damaging the financial system further with more losses, said Nassim Nicholas Taleb, author of the best-selling finance book "The Black Swan."
"You cannot trust the banks in taking risks," Taleb said in an interview with Bloomberg Television in Davos. "We have a very strange situation in which it's the worst of capitalism and socialism, a situation in which profits were privatized and losses were socialized. We taxpayers have the worst."
The global economy will slow close to a halt this year as more than $2 trillion of bad assets in the U.S. help sink economies from there to the U.K. and Japan, the International Monetary Fund said yesterday. Taleb echoed comments from New York University Professor Nouriel Roubini, who says the majority of U.S. banks are insolvent. ...
I've been taking part in the Technology, Entertainment and Design conference (aka TED) this week, being held for the first time in Long Beach, California, after many years in Monterey.
I've been struck by how different the mood is here than it was last week in Davos. Much more upbeat. Maybe it's because TED is brimming with innovators, people less interested in figuring out how to prop up the collapsed economy of the last century than in creating an economy for the 21st century.
You also run into more quirky and interesting people per square inch than anywhere I've ever been. For instance, last night I found myself chatting with a stranger (this happens, of course, all the time at TED). When I asked him what he did, he told me that he owned The Kitchen in Boulder, Colorado, "America's greenest restaurant"... and is CEO of an Internet software company... and sits on the boards of Tesla Motors, SpaceX Corp., and ProgressNow.org. He turned out to be Kimbal Musk, the younger polymath brother of Elon Musk, the co-founder of PayPal and CEO of Tesla Motors.
I also spent some time with perhaps the world's most maniacal polymath: Bill Gates. Gates played a big role at Davos, as he has here -- but here the conference and the crowd fit him to a T.
And he's been a real presence here, starting with delivering a much talked about keynote address, during which he drove home the importance of investing in malaria prevention by releasing a swarm of mosquitoes on the crowd, saying, "There is no reason only poor people should be infected." The stunt caused quite a bit of buzz (sorry!) around the blogosphere. "The mosquitoes had been irradiated," he reassured me. Okay, so all they could do was suck a little blood.
...He has clearly been leading by example in changing both the business world and the world of philanthropy. But when it comes to sleep, all I can say is that when I left a dinner given by EDGE's John Brockman after midnight last night, Gates was still there talking away with X Prize's Peter Diamandis about providing big rewards for scientific breakthroughs.
WHILE many institutions collapsed during the Great Depression that began in 1929, one kind did rather well. During this leanest of times, the strictest, most authoritarian churches saw a surge in attendance.
This anomaly was documented in the early 1970s, but only now is science beginning to tell us why. It turns out that human beings have a natural inclination for religious belief, especially during hard times. Our brains effortlessly conjure up an imaginary world of spirits, gods and monsters, and the more insecure we feel, the harder it is to resist the pull of this supernatural world. It seems that our minds are finely tuned to believe in gods.
Religious ideas are common to all cultures: like language and music, they seem to be part of what it is to be human. Until recently, science has largely shied away from asking why. "It's not that religion is not important," says Paul Bloom, a psychologist at Yale University, "it's that the taboo nature of the topic has meant there has been little progress."
The origin of religious belief is something of a mystery, but in recent years scientists have started to make suggestions. One leading idea is that religion is an evolutionary adaptation that makes people more likely to survive and pass their genes on to the next generation. In this view, shared religious belief helped our ancestors form tightly knit groups that cooperated in hunting, foraging and childcare, enabling these groups to outcompete others. In this way, the theory goes, religion was selected for by evolution, and eventually permeated every human society (New Scientist, 28 January 2006, p 30).
The religion-as-an-adaptation theory doesn't wash with everybody, however. As anthropologist Scott Atran of the University of Michigan in Ann Arbor points out, the benefits of holding such unfounded beliefs are questionable, in terms of evolutionary fitness. "I don't think the idea makes much sense, given the kinds of things you find in religion," he says. A belief in life after death, for example, is hardly compatible with surviving in the here-and-now and propagating your genes. ...
God of the gullible
In The God Delusion, Richard Dawkins argues that religion is propagated through indoctrination, especially of children. Evolution predisposes children to swallow whatever their parents and tribal elders tell them, he argues, as trusting obedience is valuable for survival. This also leads to what Dawkins calls "slavish gullibility" in the face of religious claims.
If children have an innate belief in god, however, where does that leave the indoctrination hypothesis? "I am thoroughly happy with believing that children are predisposed to believe in invisible gods - I always was," says Dawkins. "But I also find the indoctrination hypothesis plausible. The two influences could, and I suspect do, reinforce one another." He suggests that evolved gullibility converts a child's general predisposition to believe in god into a specific belief in the god (or gods) their parents worship.
READERS are going to start thinking I'm obsessed, but I think the final proof that Barack Obama plans once and for all to elevate respect for Americans who don't practice a religion came at this morning's National Prayer Breakfast:
"There is no doubt that the very nature of faith means that some of our beliefs will never be the same. We read from different texts. We follow different edicts. We subscribe to different accounts of how we came to be here and where we're going next – and some subscribe to no faith at all.
"We know too that whatever our differences, there is one law that binds all great religions together. Jesus told us to "love thy neighbor as thyself." The Torah commands, "That which is hateful to you, do not do to your fellow." In Islam, there is a hadith that reads "None of you truly believes until he wishes for his brother what he wishes for himself." And the same is true for Buddhists and Hindus; for followers of Confucius and for humanists. It is, of course, the Golden Rule - the call to love one another; to understand one another; to treat with dignity and respect those with whom we share a brief moment on this Earth. (Emphasis added.)"
A notable repetition—not just once, rote, but twice, to let you know he means it.
"Politicians don't think they even have to pay us lip service, and leaders who wouldn't be caught dead making religious or ethnic slurs don't hesitate to disparage the "godless" among us. From the White House down, bright-bashing is seen as a low-risk vote-getter."
In a string of hot articles, two social scientists report that obesity, smoking, and other facets of health "spread" in networks. As the two friends expand their theory, doubters sharpen their questions
BOSTON—On the first snowy day in December, Nicholas Christakis and James Fowler are ensconced in Christakis's rambling home in Concord, Massachusetts, plotting their next conquest. Christakis, at his desk, is nearly hidden behind two enormous Apple computer screens that beam dizzying network patterns of lines and circles representing community ties. Fowler sits cross-legged and barefoot on
the couch, a laptop balanced on his knees. The pair are deep at work on their upcoming book, Connected: The Surprising Power of Social Networks and How They Shape Our Lives. On a mock cover taped to the wall, an orange goldfish leaps from one bowl of fish into another. The two men haven't left the house in 48 hours, and Christakis's watch stopped some time ago.
Christakis, a social scientist and hospice physician—cheerful, given his line of work—and Fowler, an easygoing political scientist, hatched a plan about 6 years ago to study how social relations influence health. Their initial scheme required a massive number of volunteers and $25 million. It didn't take off, as funders balked at the price tag. But soon after, they stumbled upon something even better that would catapult their careers: a collection of loose-leaf papers locked in a record room in Framingham, Massachusetts, home to patient files of the nearly 15,000 participants in the Framingham Heart Study, begun in 1948. ...
In a battered economy, free goods and services online are more attractive than ever. So how can the suppliers make a business model out of nothing?
By Chris Anderson
Over the past decade, we have built a country-sized economy online where the default price is zero -- nothing, nada, zip. Digital goods -- from music and video to Wikipedia -- can be produced and distributed at virtually no marginal cost, and so, by the laws of economics, price has gone the same way, to $0.00. For the Google Generation, the Internet is the land of the free.
Which is not to say companies can't make money from nothing. Gratis can be a good business. How? Pretty simple: The minority of customers who pay subsidize the majority who do not. Sometimes that's two different sets of customers, as in the traditional media model: A few advertisers pay for content so lots of consumers can get it cheap or free. The concept isn't new, but now that same model is powering everything from photo sharing to online bingo. The last decade has seen the extension of this "two-sided market" model far beyond media, and today it is the revenue engine for all of the biggest Web companies, from Facebook and MySpace to Google itself. ...
"Brilliant, essential and addictive ... It interprets, it interrogates ..."
"Not just wonderful but plausible." — WALL STREET JOURNAL
"Where the dawning of the age of biology was officially announced."
"Thoughts to make jaws drop."
"The world's finest minds ... an intellectual treasure trove ... intense, enlightening reading."
"A jolt of fresh thinking ... bracing, possibly shriek-inducing, and bound to wake you up." — GLOBE AND MAIL
"... pleas for critical thought in a world threatened by blind convictions."
"Hard to beat."
"The world's best brains."
"The coolest online forum."
"A remarkable feast of the intellect ... an amazing group of the most interesting people on the planet." — O'REILLY RADAR
"A great event in the Anglo-Saxon culture."
"As fascinating and weighty as one would imagine."
"The intellectual elite."
"... by everyone ... they are the frontier." — O, THE OPRAH MAGAZINE
"Much to astonish ... vitally engaging." — MAIL ON SUNDAY
"Brilliant ... a eureka at the edge of knowledge."
"A major force on the intellectual scene in the US ... required reading."
"Unprecedented roster of brilliant minds ... nothing short of visionary."
"... stimulating ... It's like the crack cocaine of the thinking world." — BBC RADIO 4
"Brilliant minds at work ... exhilarating, hilarious, and outright entertaining."
"The most explosive ideas of our age."
"The ultimate scientific pipedreams at their very best."
"The greatest virtual research university in the world." — ARTS & LETTERS DAILY
"Audacious and stimulating."
"A running fire of provocative and fascinating theses."
"One of the most interesting stopping places on the Web." — NEW YORK TIMES
"A stellar cast of thinkers."
"Fascinating and thought-provoking." — THE GUARDIAN
"Today's visions of science tomorrow." — NEW YORK TIMES
"Awesome ... brilliant."
"Websites of the year ... Inspired Arena ... the world's foremost scientific thinkers."
"Deliciously creative ... the variety astonishes ... intellectual skyrockets of stunning brilliance. Nobody in the world is doing what Edge is doing." — ARTS & LETTERS DAILY
"Marvellous ... highly recommended."
"... brightest scientists and thinkers ... heady ... deep." — Kevin Kelly, WIRED
"Fascinating survey of intellectual and creative wonders of the world ... Thoughtful and often surprising ... reminds me of how wondrous our world is." — Bill Gates, THE NEW YORK TIMES
"An enjoyable read."
"A-list: Dorothy Parker's Vicious Circle without the food and alcohol ... a brilliant format."
"Big, deep and ambitious questions ... breathtaking in scope."
"Free-ranging, intellectually playful ... an unadorned pleasure in curiosity, a collective expression of wonder at the living and inanimate world ... an ongoing and thrilling colloquium." — Ian McEwan, THE TELEGRAPH