Intellectuals are not just people who know things, but people who shape the thoughts of their generation...
Edge is not so much the "Internet as highbrow cocktail party," as it is the "Internet as Center for Advanced Studies." Here, Brockman and the leading thinkers in a raft of scientific and social disciplines exchange ideas and build theories…and we get to watch.
EVERYONE HAS A fleeting fantasy in which they are reborn as, say, a Hollywood star or a stupendously wealthy author. My occasional fancy is that I am a science reporter of some repute, bringing beard-tuggingly important matters — such as the dialogue between science and religion — to the attention of readers and opinion-formers.
So I flirted with the idea of applying for a Templeton-Cambridge Journalism Fellowship in Science and Religion. The placement at Cambridge University would undoubtedly be fun — I’d spend two months listening to scientists, religious scholars and philosophers. I’d hang out with serious thinkers and meet high-minded hacks; my credentials as an intellectual would soar. With a stipend of about £10,000, plus book allowance and travel expenses, it wouldn’t be a badly paid gig, either.
The only hitch, apart from selling the jolly to my editors, was the origin of the cheque. The John Templeton Foundation is an enormously wealthy charity that awards an annual prize of $1.4 million for Progress Toward Research or Discoveries about Spiritual Realities (Sir John Templeton, a financier, insisted that the prize should be more lucrative than the Nobel Prize).
Over the past decade Templeton prizes have gone to scientists who have explored such concepts as nothingness, infinity, and multiple universes, exactly the kind of “wow” subjects that inspire awed contemplation. Next month the Cambridge University cosmologist John D. Barrow will receive his cheque at Buckingham Palace; he is praised for work that “has illuminated understanding of the Universe and cast the intrinsic limitations of scientific inquiry into sharp relief”.
Ah, yes, the “limitations of scientific enquiry”. This quote hints at the religious agenda of the foundation, which has become a significant donor to such institutions as Oxford University, where it is funding research to discover whether religious belief can reduce pain. The foundation is also paying for studies about the effect of prayer on health. That would be fine, were it not for the aims stated on the section of its website devoted to spirituality and health: “. . . the foundation hopes to contribute to the reintegration of faith into modern life”.
The foundation wisely rejects intelligent design but nevertheless emphasises the metaphysical dimension of any funded research: “What can research tell us about God, about the nature of divine action in the world, about meaning and purpose?” it asks. Which, to my reading, assumes the existence of both God and divine action.
Anyway, at the end of their jaunt, Templeton journalism fellows are “encouraged to write and publish news stories, editorial pieces, or magazine articles ... contributing to a more informed public discussion of the relationship between science and religion”.
Now, consider that one of my more memorable articles about just this topic contended that illusions of the divine may point to mental illness. Another article rubbished a study that claimed that childless couples could double their chances of IVF success by getting strangers to pray for them. Neither study was associated in any way with the foundation, but I wonder whether it would have considered those pieces “more informed”?
My vague misgivings have now been articulated by John Horgan, a science writer and agnostic who became a 2005 Templeton fellow. “I rationalised that taking the foundation’s money did not mean that it had bought me, as long as I remained true to my views,” he wrote last week in The Chronicle of Higher Education, the US equivalent of The Times Higher.
So, what happened when Horgan told a foundation official that he had no wish for religion and science to be reconciled? “She told us that . . . she didn’t think someone with those opinions should have accepted a fellowship.”
I applaud those writers who become Templeton fellows; I commend their desire to learn more and I wish them well in their efforts to keep an open mind. In truth, I envy them their two-month summer sabbatical.
Perhaps I lack backbone, but I worry that accepting the foundation’s largesse might make me a bit soft. And a soft reporter is the last thing needed by infertile couples who wrongly believe that a stranger’s prayer will help to bring them a child.
From left, Elizabeth Spelke, John Brockman, Seth Lloyd, and Daniel C. Dennett engage in a panel discussion of the book “What We Believe But Cannot Prove,” to which they all contributed, in Radcliffe’s Longfellow Hall yesterday.
Last night, three Harvard professors, a Massachusetts Institute of Technology (MIT) professor, and a Tufts professor provided their own answers to this question before a crowded audience in Askwith Lecture Hall at the Graduate School of Education.
The ideas that they debated included individual consciousness, a common human gene pool, and the existence of electrons.
The discussion, sponsored by the Harvard Bookstore and Seed Magazine, marked the recent release of the essay collection, “What We Believe But Cannot Prove: Today’s Leading Thinkers on Science in the Age of Certainty,” which was edited by John Brockman.
The panelists, all of whom contributed essays to the book, included Harvard psychology professors Daniel Gilbert, Mark D. Hauser, and Elizabeth Spelke, as well as Tufts philosophy professor Daniel C. Dennett and MIT engineering professor Seth Lloyd.
Spelke said she believes human beings are alike, but that she also believes they are predisposed to believe they are fundamentally different.
She said, though, that she remained convinced that people are capable of overcoming their beliefs when these are disproved.
Gilbert claimed that “the only fact that proves itself is our own experience.”
“The fact of your experience is not a fact to me,” he said.
He argued that we can demonstrate “to our own satisfaction” that a creature has a consciousness.
After the introductory remarks, discussion focused largely on human consciousness and language.
The ability to communicate through language may be a key to determining other people’s consciousness and experience, according to Dennett.
But the panelists also pointed out that another unique feature of language is that it allows humans to hide intentions.
“God gave us language so we can conceal our thoughts,” Dennett responded.
Questions from the audience focused on issues such as free will, spirituality, and what constitutes certainty and proof.
“Proof is that which makes everybody shut up,” Gilbert said.
“No,” Hauser responded to break the silence, prompting laughter from the audience.
Every year, the website www.edge.org, which brings together the world’s most important and prestigious scientists, opens the calendar by putting a crucial question to its members. This year’s was nothing less than: what is the most dangerous idea in the world? Below, the ten most explosive answers, plus a bonus.
...Presented with photos on a screen, the white Israeli infants preferred looking at new faces of their own race; African babies raised in Ethiopia preferred to look at African faces. But the Ethiopian-Israeli infants, who had been exposed since birth to people of both races, showed no preference. The import of this study is ambiguous, Spelke said. The finding could mean that babies aren't born prejudiced after all—that they learn to be wary of others only if they grow up in an isolated environment. Or it could mean that babies are programmed to prefer people who look more like their own parents, and that this instinct can be counterbalanced through enlightened education.
If the latter interpretation proved to be the case, Spelke would be optimistic. As she recently posted on Edge [*], a Web publication that airs scientific controversies, "Just as our core intuitions about geometry once led humans to believe that the world was flat—until the science that humans perfected proved otherwise—core intuitions might lead us to believe that linguistic and racial differences mean something more fundamental than they really do."
"Nobody should ever be troubled by our research, whatever we come to find," Spelke told me. "Everybody should be troubled by the phenomena that motivate it: the pervasive tendency of people all over the world to categorize others into different social groups, despite our common and universal humanity, and to endow these groups with social and emotional significance that fuels ethnic conflict and can even lead to war and genocide." This mirrors her belief that, in time, feminism will embolden more women to take up high-level careers in the physical sciences, and more of us will recognize how alike men's and women's minds really are. For Spelke, who has spent most of her life documenting the core knowledge that we're born with, the most important thing about it is our uniquely human ability to rise above it.
[* ED. NOTE: See "The Science of Gender and Science—Pinker vs. Spelke, A Debate"]
John Brockman: 40 years of "intermedia kinetic environments"
Here's what the New York Times had to say about "cultural impresario," sci/tech literary uber-agent, and EDGE founder John Brockman -- 40 years ago today. Snip from "So What Happens After Happenings," an article dated Sunday, September 4, 1966. "Hate Happenings. Love Intermedia Kinetic Environments." John Brockman is partly kidding, while conveying the notion that Happenings are Out and Intermedia Kinetic Environments are In in the places where the action is.
John Brockman, the New York Film Festival's 25-year-old coordinator of a special events program on independent cinema in the United States, plugging into the switched-on "expanded cinema" world in which a film is not just a movie, but an Experience, an Event, an Environment. ...
posted by Xeni Jardin at 09:26:03 PM
...The above are the opinions of experts on profound issues of love, consciousness, and the existence of God. Laypeople, too, reach similar conclusions with the help of their common sense, which is often vague, prejudiced, and what an expert would term irrational. Paradoxically, the rational and the irrational mind arrive at a similar conclusion, though from opposite directions. What, then, is the path to truth?
Democracy is not the best way to rule a country. The concept of free will will disappear the more we learn about the brain. The Internet undermines the quality of our relationships. Read the world's leading brains list their most dangerous ideas.
You might have wondered who all those people are who write viciously mean anonymous comments online. Face to face, most people are pretty well behaved, but a worrying number of them show a whole other face when protected by their digital Ku Klux Klan hood. The danger of anonymity is one of the thoughts being debated as New York-based literary agent John Brockman asks the world's leading thinkers about their most dangerous ideas. ...
"What do you believe even though you cannot prove it?" Edge editor John Brockman wanted to know last year. Before that, questions such as "What is the most important untold story?", "What is the most important invention of the past two thousand years?", "What are the most pressing scientific problems?" or simply "What now?" had been thrown into the circle. The question for 2006 has managed to heat up the general atmosphere of urgency even further. "What is your dangerous idea?" Edge asks. Answers came from 172 scientists who consider themselves part of the Third Culture community and uphold the ideal of an intellectual who takes the natural sciences, rather than literature, as the leading discipline.
172 scientists respond to the 2006 Edge Question
For nine years now, the Edge foundation has begun the new year with a survey on a big general theme. This time, 172 scientists responded. They reveal what they consider their most dangerous idea, one that could come true.
In a front-page article, Il Sole 24 Ore, Italy's largest financial daily, announced the "Edge Question Forum" in "Domenica", the weekend Arts & Culture section. The Forum, an ongoing project designed to bring third culture thinking to Italy, features excerpts from the Edge responses in addition to articles solicited from Italian humanist intellectuals and scientists.
What We Believe But Cannot Prove: Today’s Leading Thinkers on Science in the Age of Certainty
Edited by John Brockman
Harper Perennial, 252 pp., paperback, $13.95
Chasing Spring: An American Journey Through a Changing Season
By Bruce Stutz
Scribner, 239 pp., $24
Pilgrim on the Great Bird Continent: The Importance of Everything and Other Lessons From Darwin’s Lost Notebooks
By Lyanda Lynn Haupt
Little, Brown, 276 pp., illustrated, $24.95
For the past eight years, the website www.edge.org has tried to provoke its distinguished roster of contributors with a big, elegant question. Last year's question was this: What do you believe to be true even though you cannot prove it?
A hundred and nine prominent thinkers, including folks as accomplished as Richard Dawkins, Steven Pinker, Rebecca Goldstein, and Freeman Dyson, responded. Their answers are collected in a new book, ''What We Believe But Cannot Prove," and it makes for some astounding reading.
What do they believe (but can't prove)? Many believe there is an external reality independent of consciousness. Many believe life is pervasive in the universe. Several believe, in the words of neurologist Robert M. Sapolsky, that ''there is no God(s) or such a thing as a soul." Theoretician Judith Rich Harris believes the Neanderthals disappeared because Homo sapiens ate them. Two believe there is a God. One believes in true love. The longtime New Scientist editor Alun Anderson believes cockroaches are conscious. Harvard professor Daniel Gilbert believes you are true -- that is, he believes you have an inner life and a sense of self -- even though he cannot prove it.
Taken as a whole, this little compendium of essays will send you careening from mathematics to economics to the moral progress of the human race, and it is marvelous to watch this muddle of disciplines overlap. Will the human brain eventually be able to discover all there is to discover about the physical world? Or will there always be things that we will not know?
A few months before edge.org proposed its 2005 question, former Natural History editor Bruce Stutz was recovering from heart-valve surgery in the torpid gloom of a Brooklyn winter. Dazed, drained of energy, and feeling suddenly that his ''most verdant years" were behind him, Stutz began yearning for spring.
He yearned not just for the material manifestations of the season, but for its sparkle, the ''transformative energy possessed by growing, blossoming, transmuting things." By the end of June he had traveled almost 10,000 miles in a 1984 Chevy Impala, watching spring creep northward across the United States, seeing everything he could, and writing it all down.
Part travelogue, part environmental assessment, part midlife crisis, ''Chasing Spring" is as much about Stutz himself as it is about the season. He slogs from Louisiana to Arizona to Utah to Alaska, chatting with scientists, watching birds, testing his heart on mountain passes and in the tundra of the Arctic National Wildlife Refuge. By the end of his odyssey, the spring of Arctic Alaska and the spring of Stutz's own soul have become inseparable.
''Chasing Spring" is an eclectic and digressive book. Its author makes the baffling claim that 18,000-foot peaks rise from the deserts of Arizona. He is effusive enough to offer abbreviated ruminations on the Great Salt Lake, the Roman festival of Lupercalia, and the picking of morel mushrooms.
But the charm of ''Chasing Spring" is in its raw enthusiasm, Stutz's personal invigoration braided into the ongoing invigoration of the continent. ''Bring on that juice and joy!" he writes. ''I'm ready now for the spring forest in its mist, spring in its musk, in its spring greens: canopy green, shafts of wild iris green." Reading Stutz is a bit like reading Whitman: You imagine the author stomping gleefully in puddles, peering on his hands and knees into the forget-me-nots and saxifrage.
Spring, of course, means someone is publishing another book about Darwin. Thankfully, this year it's an excellent one. Lyanda Lynn Haupt's ''Pilgrim on the Great Bird Continent" probes Darwin's journals, pocket notebooks, and letters with the goal of understanding how a beetle-obsessed, squeamish, overprivileged 22-year-old could spend five years circumnavigating South America and emerge as a polished naturalist whose vision would change human understanding forever.
Even more than ''Chasing Spring," ''Pilgrim" is about what Haupt calls ''deep watchfulness." She argues that Darwin's notebooks ''foist upon us his strict but beautiful maxim. Nothing in the natural world is beneath our notice -- he almost whacks us on the head with it. Nothing."
Here is a young, apprehensive, occasionally self-absorbed Darwin who gradually strips away his vanities to find an intensity of observation that borders on the religious. He swims with iguanas; he waits four hours on his knees in the mud to glimpse a sedge wren. He stands perfectly motionless in a forest until a shy bird will, in his words, finally ''approach within a few feet, in the most familiar manner." He lies on his back for an entire hour simply to watch the slow circling of condors. Ultimately Haupt's portrait is of a devastatingly sensitive man who teaches himself to approach the world with a profound humility.
Watch what's going on around you, forget being hungry or wet, and bring all your intelligence to being present. If any time of the year is about throwing open your windows and letting the energy of the world pour over you, it has to be springtime.
Perhaps Darwin's own son said it best: ''I used to like to hear him admire the beauty of a flower; it was a kind of gratitude to the flower itself and a personal love for its delicate form and colour. I seem to remember him gently touching a flower he delighted in; it was the same simple admiration that a child might feel."
Anthony Doerr is the author of ''The Shell Collector" and ''About Grace."
The headlines were intriguing. “A Free-for-All on Science and Religion,” wrote the New York Times. “Losing Our Religion: A gathering of scientists and atheists explores whether faith in science can ever substitute for belief in God,” was Newsweek’s version. New Scientist magazine called its article “Beyond Belief—In Place of God: Can secular science ever oust religious belief—and should it even try?”
The reports summarized the highlights of a conference, held Nov. 5-7 at the Salk Institute in La Jolla, Calif., that attracted a large number of very prominent scientists, mostly from the United States and Britain, for a discussion called “Beyond Belief: Science, Religion, Reason and Survival.”
Richard Dawkins was there, an evolutionary biologist from Britain who wrote “The God Delusion,” currently a best seller.
Sam Harris, a doctoral student in neuroscience, also spoke. He is author of “Letter to a Christian Nation,” another recent best seller, as well as an earlier book, “The End of Faith: Religion, Terror and the Future of Reason.”
Physicist and Nobel laureate Steven Weinberg also spoke, as did Neil deGrasse Tyson, director of the Hayden Planetarium in New York. Carolyn Porco of the Space Science Institute in Boulder, Colo., seemed to be one of the very few women speakers in a conference dominated by white men.
The published accounts mentioned above emphasize that the overwhelming majority of the conferees identified themselves as atheists or non-believers and the speakers posed the issue as a conflict between reason and dogma. But they sharply debated one another on what scientists’ attitude should be toward religion.
If anyone at the conference took a historical materialist view of this question—that is, a Marxist view—the mass media did not report it.
That alone is worthy of note, because for many years a conference in the U.S. that promoted atheism would have been branded “communist” by much of the commercial media. That certainly was the case during the years of the Reagan administration, when the influence of the religious right in politics was very consciously promoted at the same time that a major assault was being made on social programs benefiting the working class.
It was considered a noteworthy break with these political and ideological forces when Nancy Reagan later disagreed publicly with the religious right over the issue of stem-cell research, after her husband was diagnosed with Alzheimer’s disease.
But since the collapse of the USSR, the debate over science and religion has taken a new turn. The prominent speakers at this conference could not be considered leftists by any stretch of the imagination.
What Marx said about religion
When Karl Marx wrote about religion in the mid-19th century, at a time when much of the new ruling bourgeois class in Europe still identified with the Enlightenment as against medieval dogma, he was able to say about the German intellectual establishment that, “[T]he criticism of religion has been essentially completed.”
But he went on to explain why religion continued to have a strong influence among the masses.
“Religious suffering is, at one and the same time, the expression of real suffering and a protest against real suffering. Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people.
“The abolition of religion as the illusory happiness of the people is the demand for their real happiness. ... The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo.” (Karl Marx, “A Contribution to the Critique of Hegel’s Philosophy of Right,” 1844)
Marx’s term “the opium of the people” is often quoted out of context, as though it were nothing but a slur against religion. But here it is obvious that he was referring quite eloquently to how people turn to religion to dull their pain over unbearable social conditions that need to be abolished.
Marxism goes to the heart of the problem. The new capitalist class needed rationalism as against dogma in order to lay the basis for the tremendous scientific-technological development that vastly expanded its means of production and commerce. But capitalism brought with it new horrors for the masses—the conversion of much of the peasantry into wage laborers working 12 to 14 hours a day in the hellish mines and factories.
Thus this new system, which needed rationalism and science in order to grow, at the same time propagated the social conditions that ensured a continued place for religion among the masses. Even today, after several centuries of scientific discoveries that have transformed the way in which every daily task is done—and have brought immense fortunes to those in the ruling class—a large percentage of the people cling to religion as “the heart of a heartless world,” to use Marx’s phrase.
Did the conference in La Jolla look at religion in this social context? Not if the published accounts correctly represent it.
What, then, spurred on scientists to organize such a gathering at this time?
One would certainly expect that much of the energy for it came from the need to respond to the increasing efforts by the religious right and certain corporate interests to impose anti-scientific views on society. The attempts to legislate the teaching of “creationism” as opposed to evolution, the opposition to stem-cell research by churches claiming to defend the “unborn,” the denial of global warming by scientists funded by energy companies—all this cries out for a counter-attack by scientists. Undoubtedly, many of the attendees at the conference came because of this political climate.
But there was another and more disturbing motivation, and it was pushed by some of the most prominent speakers.
The Web site edge.org is devoted to scientific discussion. According to a critique of the conference written for Edge by participant Scott Atran, “We first heard from Steven Weinberg, and then from every other second speaker, about the history of Islam, about why Muslim science went into decline after the 13th or 14th centuries, and about why suicide bombers, the most fanatically religious of all would-be mass murderers, are an outgrowth of Islam. Missing at ‘Beyond Belief’ was erudition and deep understanding of Islamic history other than the usual summaries of names and achievements. ...
“We heard from Sam Harris that Muslims represent less than 10 percent of the population in Western European countries such as France, but over 50 percent of the prison population. The obvious inference expected from the audience is that Islam encourages criminal behavior. ...
“Richard Dawkins tells us that Islam oppresses women.”
The New York Times article of Nov. 21 confirmed that Islam-bashing was a strong component of this conference. “By shying away from questioning people’s deeply felt beliefs, even the skeptics, Mr. [Sam] Harris said, are providing safe harbor for ideas that are at best mistaken and at worst dangerous. ‘I don’t know how many more engineers and architects need to fly planes into our buildings before we realize that this is not merely a matter of lack of education or economic despair,’ he said.”
In Harris’s book “Letter to a Christian Nation,” he tries to ingratiate himself with Christians in the United States by saying, “Nonbelievers like myself stand beside you dumbstruck by the Muslim hordes who chant death to whole nations of the living. But we stand dumbstruck by you as well—by your denial of tangible reality, by the suffering you create in service to your religious myths, and by your attachment to an imaginary God.”
Harris says he started writing the book the day after 9/11.
Clearly, the time has not yet come when scientists in the imperialist countries can be expected to organize a truly scientific discussion on religion. That would require an honest, dispassionate view of the world today as it is: divided between the rich and the poor, the oppressor and the oppressed, the imperialist countries and those fighting against efforts to re-colonize them.
Islamic fundamentalism is flourishing among the oppressed as U.S. and British imperialists inflict unspeakable atrocities on the peoples of the Middle East. It cannot be equated with Christian fundamentalism in Western imperialist countries.
What is needed to counteract dogma is not just atheism but a Marxist-Leninist world view that understands religion and all social phenomena in their real context and can apply this to the current period in human history, which is characterized above all by the capitalist division of society into opposing social classes and a world system in which a few imperialist countries super-exploit the majority of the human race. The triumph of “reason” will come when the masses of people overturn this unjust, antiquated social system.
Thinkers Lay Out the Beliefs They Can't Prove
Our day-to-day beliefs often come from established theories, but what about beliefs based on theories in progress? A new book asks literary and scientific thinkers about what they believe but cannot prove.
John Brockman, editor, What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Certainty; author and literary agent; publisher and editor of Edge.org
Richard Dawkins, evolutionary biologist; professor of the public understanding of science at Oxford University; author of many books about science and evolution, including The Selfish Gene and most recently, The Ancestor's Tale: A Pilgrimage to the Dawn of Evolution
Alison Gopnik, professor of psychology at the University of California, Berkeley; her books include The Scientist in the Crib
Paul Steinhardt, theoretical physicist; Albert Einstein professor of science at Princeton University
Cultural impresario John Brockman publishes answers to dangerous questions at his website Edge.org. Mankind does not come off well.
...You get absorbed in reading the answers to the question, published as the Edge Annual Question 2006. Though the intellectuals and scientists form no coherent group, there is a general tenor: philosophizing about free will has no purpose if you haven't been buried in neurobiology for a year or two.
A wide cross-section of the intelligentsia responded to this fundamental paradox of life. The cynic and the optimist, the agnostic and the believer, the rationalist and the obscurantist, the scientist and the speculative philosopher, the realist and the idealist — all converge on a critical point in their thought process where reasoning loses its power. Love, the existence of God, and the primacy of the entity called consciousness or life were the issues that came within the purview of the deliberation.
Edge.org has an article titled "Who Really Won the Super Bowl?" by Marco Iacoboni, a neuroscientist at the U.C.L.A. Ahmanson-Lovelace Brain Mapping Center. Dr. Iacoboni and his colleagues used fast magnetic resonance imaging technology to observe brain responses to commercials shown during the Super Bowl.
The overwhelming winner among the Super Bowl ads is the Disney-NFL "I am going to Disney" ad. The Disney ad elicited strong responses in orbito-frontal cortex and ventral striatum, two brain regions associated with processing of rewards. Also, the Disney ad induced robust responses in mirror neuron areas, indicating identification and empathy. Further, the circuit for cognitive control, encompassing anterior cingulate cortex and dorsolateral prefrontal cortex, was highly active while watching the Disney ad....
The three biggest flops seem to be the Burger King ad, the FedEx ad, and the GoDaddy ad. Three quite interesting features that come out of this instant study are the following: first, people — when interviewed — tend to say what they are expected to say, but their brain seems to say the opposite. For instance, female subjects may give verbally very low "grades" to ads using actresses in sexy roles, but their mirror neuron areas seem to fire up quite a bit, suggesting some form of identification and empathy. Second ... we saw strong habituation effects, such that the second time around the commercial induces much weaker responses. Third — and this is probably interesting to neuroscientists — among brain regions associated with complex social behavior, we observed a mix of activation and de-activation.
Can a person be considered cultured today with only slight knowledge of fields such as molecular biology, artificial intelligence, chaos theory, fractals, biodiversity, nanotechnology or the human genome? Can we construct a proposal of universal knowledge without such knowledge? The integration of "literary culture" and "scientific culture" is the basis for what some call the "third culture": a source of metaphors that renews not only the language, but also the conceptual toolkit of classic humanism.
The New Humanists
A multifaceted figure
Brockman and the New Intellectuals
“Science won the battle”
“What remains of Marxism? What remains of Freud? Neuroscience has left him looking like an eighteenth-century superstition, a set of irrelevant ideas.”
FIRST CAME the Beethoven concert, the boat trip on a lake and the fine dinner; then the tearful goodbyes and the barbiturates. On the eve of her 67th birthday, surrounded by her adoring children, Dr Anne Turner finally ended a life that would have been cruelly curtailed by progressive supranuclear palsy, an incurable degenerative disease. “I don’t think death has ever held any fear for me,” she once said.
Suicide is a horribly arresting phenomenon. Remember the photograph of the lawyer teetering on a window ledge in West London before jumping to her death? Remember the footage of the young Indian woman who threw herself and her two young children under the Heathrow Express?
Taking one’s own life goes against one of our strongest urges — the instinct of self-preservation. The deterioration of this instinct, says Thomas Joiner, Bright-Burton Professor of Psychology at Florida State University, should be regarded as a symptom of disease. “There’s an idea that suicide is a mode of death that stands apart from others, but there are clear reasons why people die by suicide,” he says. “Just like heart disease, if you understand it, you can prevent it.”
His theory, outlined in Why People Die By Suicide (Harvard University Press), published this month, is that it happens when severely depressed people acquire fearlessness. How do people become fearless? Through practice and learning, he says. This explains the bouts of self-harm or failed suicide attempts that are not cries for help so much as rehearsals for a deadly finale.
He also points out that certain groups who are exposed repeatedly to pain and suffering — anorexics, doctors, athletes, prostitutes — have higher rates of suicide than other groups. Their acquired immunity to fear and pain is the crucial extra ingredient that, combined with a perception of being a burden and a feeling of not belonging, can have a fatal outcome.
Joiner, whose father killed himself, adds that anti-suicide campaigns may be counterproductive because they serve as a reminder of the act. He says that the most effective way of preventing suicide is to improve a person’s sense of belonging and contribution to society. Since killing oneself requires fearlessness, shouldn’t we revise the portrayal of suicide as the ultimate act of cowardice?
ON TO cheerier matters. When people turn up to a dinner before the appointed 7pm start, you know it’s going to be fun. And so it was on Tuesday when the literary agent John Brockman hosted a gathering in Soho. I showed up at 7.10pm, depriving myself of ten minutes of serious schmoozing.
Brian Eno was there, as were Richard Dawkins and Simon Baron-Cohen, the autism researcher. Colin Blakemore, the head of the Medical Research Council, came along, joining the authors Olivia Judson, Matt Ridley, Armand Leroi and David Bodanis (the fastest talker I’ve ever met). Ian McEwan dropped by. The editors of Nature, New Scientist and Prospect mingled amiably. I ended up sharing a pudding plate with Craig Venter, the Celera Genomics entrepreneur who helped to unravel the human genome and in whose honour the dinner was held. Venter feels aggrieved at his portrayal in the British press as a ruthless, money-grabbing maverick (I’d be a bit miffed, too, if my enemies compared me with Hitler, as happened to Venter in a book extract in The Guardian). He points out that he owns fewer patents than Francis Collins, the publicly funded American scientist who was another leading figure in the Human Genome Project.
But when you’re clever enough to start your own research institute, rich enough to drive an Aston Martin and famous enough to have inspired several unauthorised biographies, surely you can rise above it? “Underneath, it still hurts,” says Venter, with endearing honesty. I await his autobiography, due out next year, with eagerness.
- TRUST Ken Livingstone to come up with another original political idea. At a debate in City Hall to mark the paperback publication of Collapse: How Societies Choose to Fail or Succeed, by Jared Diamond, the Mayor of London suggested that anyone believing in the afterlife should be barred from public office. If politicians thought they had only this life, he argued, they would make a better fist of it. The audience clapped with delight. I don’t think Ken was joking.
The most politically incorrect claim, one whose author risks being branded a racist if not a Nazi, is that there are human groups whose genetic characteristics make them more intelligent than others.
The trouble is that some scientists assert exactly this in answering the question posed each year by The Edge (www.edge.org), the organ of a club of thinkers from around the planet who take on apparently simple problems that are in fact immensely complex. The 2006 question, which thousands of researchers will continue answering into 2007, was presented by Steven Pinker, psycholinguist and professor of psychology at Harvard. Pinker recalls that the history of science is full of discoveries that were considered socially, morally and emotionally dangerous; the most obvious examples are the Copernican and Darwinian revolutions.