Edge in the News

http://latimesblogs.latimes.com/jacketcopy/2009/06/does-language-shape-our-think... [7.17.09]


An essay on how language influences thought from the pop-science anthology "What's Next: Dispatches on the Future of Science" has been posted on The Edge. Author Lera Boroditsky, an assistant professor of psychology, neuroscience and symbolic systems at Stanford, writes:

Most questions of whether and how language shapes thought start with the simple observation that languages differ from one another. And a lot! Let's take a (very) hypothetical example. Suppose you want to say, "Bush read Chomsky's latest book." Let's focus on just the verb, "read." To say this sentence in English, we have to mark the verb for tense; in this case, we have to pronounce it like "red" and not like "reed." In Indonesian you need not (in fact, you can't) alter the verb to mark tense. In Russian you would have to alter the verb to indicate tense and gender. So if it was Laura Bush who did the reading, you'd use a different form of the verb than if it was George. In Russian you'd also have to include in the verb information about completion. If George read only part of the book, you'd use a different form of the verb than if he'd diligently plowed through the whole thing. In Turkish you'd have to include in the verb how you acquired this information: if you had witnessed this unlikely event with your own two eyes, you'd use one verb form, but if you had simply read or heard about it, or inferred it from something Bush said, you'd use a different verb form.

She brings up experiments and other examples involving use of language and direction, time, color and gender, all of which seem to demonstrate that yes, language shapes how we think.

But my favorite is this example above. Only a linguist -- or perhaps a social scientist -- would put Chomsky in a hypothetical. 

-- Carolyn Kellogg

http://timestranscript.canadaeast.com/opinion/article/727019 [7.12.09]

By Norbert Cunningham

Hello everyone! I've a little science today, but first note that language is a communication tool; it's what allows us to relate our experiences and thoughts, all, of course, as processed by our brains.

And while we don't yet understand how our brains work very well, scientists have lately been making remarkable progress, in part thanks to new technologies such as MRI scanners that allow them to observe healthy (as well as damaged) brains as they work.

Time flies

Now to connect this to language: we've all heard or used the cliché that "time flies."

Anybody over 40 has also likely remarked on how much faster time flies as you get older. It's a common observation many find puzzling: why does summer as a youngster seem "endless," yet pass in the blink of an eye for adults who swear it was only a couple of weeks ago that they put the snow shovel away?

Does time speed up in some magical, bizarre way as we age?

Do people in traumatic events like car crashes actually witness time slowing down as they so often report?

The intuitive answer is that these are matters of perception rather than reality: time itself flows like a river, sure and steady. (Only if you travel at incredibly high speeds, beyond anything most of us will ever experience, does time actually slow significantly, as Einstein showed.)

That intuition, that it is a matter of perception, has been shown to be correct. So why do we perceive time to speed up as we age? Our everyday language, and the millions of people who have remarked on the fact, are not wrong: the perception is real.


I take the following from an essay titled "Brain Time" by Dr. David M. Eagleman which appears in a book called "What's Next? Dispatches on the Future of Science," edited by Max Brockman.

Dr. Eagleman is a bright young scientist with undergraduate degrees in literature from Rice University and Oxford University, but he obtained a doctorate in neuroscience from the Baylor College of Medicine 11 years ago. Today he is director of the Baylor College of Medicine's Laboratory for Perception and Action. The lab's long-term goal is to "understand the neural mechanisms of time perception," which in plain English is to figure out how our brains make us think time has slowed or sped up when it hasn't. It was his and his colleagues' work that allowed me to say that our intuitions are correct: people in traumatic situations do perceive time to slow, but a hair-raising experiment shows they have no extra time to react or do anything beyond what would normally be possible.

An explanation

We perceive the slow motion because time and memory are "tightly linked," says Dr. Eagleman. In such critical situations, a part of our brain called the amygdala kicks into high gear and takes over most of the brain's resources. This forces a secondary memory system to do the processing, a system that can later produce flashbacks of the sort soldiers with post-traumatic stress experience. This backup memory is "stickier" than what our brains usually use to store memories, producing more vivid and clearer images in our minds, with more detail. When we remember the event, all those extra images, like extra frames inserted into a movie reel, make it appear to last longer and slow the motion down. That much is fairly certain.

Less certain, but strongly suspected by Dr. Eagleman, is that the same or a similar process is what makes time seem to speed up as we age. As we experience ever more in life, familiar patterns recur and the memories our brains store get ever more compressed. Our brain can skip or compress a lot of things we know or have already experienced because we've got the general template from the first time and it need add only new details. As a result, when we draw on our memory, it is much less vivid and detailed, having the effect of cutting some frames out of a film, which seemingly speeds time up. Children, on the other hand, are frequently having first-time experiences, encountering novel things. Their brains store that information in all its detail and richness since it is the first time. Recalling it, even decades later, we remember those "endless summers" and wonder whatever happened to them. This is not "proven" yet, but it sounds logical and fits with what is known about the sensation of time slowing when in a critical situation. It's a good tentative explanation.

Full circle

So it's back to language, which does describe what we perceive well. But time to us is not really a steady flowing river after all. It's relative, according to how our brains stored our memories. And time flies as we age . . . it seems.

The last word

Here is author and wit Douglas Adams:

Time is an illusion. Lunchtime doubly so.

* Lex Talk! is researched and written by Times & Transcript editorial page editor Norbert Cunningham. It appears in this space every Monday.

NEWSWEEK [7.12.09]

About 10 years ago, biology entered betting season. An upstart scientist named J. Craig Venter jolted the genetics establishment by launching his own gene-sequencing outfit, funded by commercial investment, and setting off toward biology's holy grail—the human genome—on his own. It was Venter versus the old guard—old because of where they got their money (governments and trusts) and the sequencing technique they wanted to hold onto. Venter won that race, and not because he got there first. By combining the freedom of academic inquiry and commercial capital, he came up with a new way of doing science so effective that it forced the old institutions to either ramp up or play second fiddle.

With Venter's momentum, biology has continued to surge into new territory, but now he's not alone in pushing the pace. In fact, with his staff of hundreds at the J. Craig Venter Institute, he is looking dangerously like the establishment he raced past almost a decade ago. Another maverick in the stable, Harvard biologist George Church, is a titan in the academic world, tackling the major challenges of genomic-age biology with an ingenuity distinct from Venter's. Both are building on the foundation of DNA sequencing, trying to drive down the cost of decoding individual genomes and—the more radical enterprise—using their digital control of cells and DNA to design new organisms. Between them, Venter and Church direct or influence a major portion of work in both sequencing and synthetic biology, including three different commercial efforts to develop bacteria that could produce the next generation of biofuels.

There's reason to believe that Church has a decent chance of unseating Venter as biology's next wunderkind. The field of genomics is only at the beginning of its growth spurt—sequencing, it turns out, was just phase one. Far from producing answers, the sequenced genome has instead led scientists into a thicket of questions: What exactly do combinations of genetic code produce in an organism over a lifetime? If we can read the script, can we also write it? Leading science out of the genomic wilderness arguably calls for a vision more deeply imaginative than the task of the Human Genome Project, which was clearly framed and, at heart, a code-reading slog. Radical invention—the kind of out-of-left-field inspiration that makes a thinker either brilliant or totally unrealistic—is the strength of Church, as opposed to Venter, who is more of an aggregator, a connector of existing ideas and methods. The script of this new biology is largely unwritten, and just because Venter turned the first page doesn't mean that in the end his vision will prevail. "Sometimes," Church says, "it's best to be second."

The quest for ideas farther afield may be one reason Venter joined the Harvard faculty this spring—his first academic post since 1982. (Venter declined to be interviewed for this article.) He and Church are even members of the same research initiative, called Origins of Life, where they're investigating life in its most basic genetic and molecular forms. Venter's participation is a sign of just how widely applicable the high-concept work of the university could be. More than ever, over the uncarved terrain of the new biology, Venter and Church are blurring the distinction between the academic and the commercial. Steven Shapin, a Harvard historian of science, says that at this point we must "stop categorizing—and just look at what these people are doing." On top of all the daring science, Venter and Church are also conducting a "sociology experiment": "They're making up their own social roles," Shapin says, "making up themselves." All the while, Church insists that he and Venter are "not right on top of each other" but are "part of the same ecosystem," fulfilling different roles. Then again, Shapin points out, "the lion and the wildebeest are in the same ecosystem." The question is, who's the lion?


The humanities are in the same state financial markets were in before they crashed. Assessing the growing mountain of toxic intellectual debt, Philip Gerrans considers going short on some overvalued research. ...

...The academic market is also like the financial market in another way. Stocks trade above their value, which leads to bubbles and crashes. Brain-imaging studies, for example, are a current bubble, not because they don't tell us anything about the brain, but because the claims made for them so vastly exceed the information they actually provide. As with a leveraged investment in mortgage bonds hedged by a foreign-exchange credit swap, most customers have no idea how a brain-imaging result is produced and what it is really worth. Those who do - the ones in labs using complicated statistical algorithms to map impossibly messy signals to artificial 3D models of brains - are usually very circumspect about the results. But every week we read in the science pages that brain-imaging studies prove X, where X is what the readers or columnists already believe. Women can't read maps! Men like sex! Childhood trauma affects brain development! There is an Angelina Jolie neuron! The bosses of big labs that employ hundreds of people use these studies, along with artfully placed articles about them, to get funding for future research. In a similar way, directors of mining companies raise funds on the basis of prospecting reports "leaked" to the financial press.

Consider, as an unrivalled piece of hyperbole, this statement from the website Edge.org, which aims "to arrive at the edge of the world's knowledge" by seeking out "the most complex and sophisticated minds". It is by Vilayanur S. Ramachandran, a brilliant experimental neuroscientist as well as a master publicist: "The discovery of mirror neurons in the frontal lobes of monkeys, and their potential relevance to human brain evolution ... is the single most important 'unreported' (or at least, unpublicised) story of the decade. I predict that mirror neurons will do for psychology what DNA did for biology: they will provide a unifying framework and help explain a host of mental abilities that have hitherto remained mysterious and inaccessible to experiments."
That's not very likely. Mirror neurons are neurons in the monkey premotor cortex that are active both when a monkey produces an action such as grasping, and when it observes the action. No one yet knows quite why there is an overlap in patterns of neural activity. Ramachandran would like to find out, so he has made his pitch to investors. They know he has done some beautiful experiments and he is a charismatic public performer and Edge.org regular, so we can expect the mirror neuron boom to continue for a while. ...

[ED. NOTE: Philip Gerrans writes: "So we can expect the mirror neuron boom to continue for a while". Is nine years enough time to make this point? See Ramachandran's Edge essay "Mirror Neurons and imitation learning as the driving force behind 'the great leap forward' in human evolution" published on June 1, 2000. —JB]

Read the full article →

http://www.washingtontimes.com/news/2009/jul/19/books-whats-next-dispatches-futu... [7.8.09]

Edited by Max Brockman
Vintage, $15, 256 pages

People's exposure to the world of science is too often limited to watching the Discovery Channel or "reading" National Geographic. But the essence of science is not only what is happening today, but what could happen tomorrow. "What's Next? Dispatches on the Future of Science" is a book of science essays collected and edited by Max Brockman. It boasts that the authors of the 18 original essays that make up this book come from a "new generation of scientists" and are the future of science.

The essays cover a range of topics. In "Will We Decamp for the Northern Rim?," Laurence C. Smith writes that the world can't escape global warming, regardless of policy changes. Stephon H.S. Alexander discusses dark matter and vacuum energy in "Just What Is Dark Energy." Vanessa Woods and Brian Hare's "Out of Our Minds: How Did Homo sapiens Come Down From the Trees, and Why Did No One Follow?" notes that the theory of evolution and its relation to humans is still a work in progress.

In his essay, "Watching Minds Interact," Jason P. Mitchell argues that humans are superior because "natural selection has equipped us with an adaptation more fearsome than teeth or claws: the human brain." He reports how neuroscience has begun to show "how exquisitely sensitive our minds are to the goings-on of the minds around us by suggesting that our brains spontaneously mirror the pattern of activity of other brains in our vicinity." This is important because it means we're social beings; "our brains prefer to be in register with the brains around us."

In tandem, Matthew D. Lieberman's "What Makes Big Ideas Sticky?" explores how minds relate to one another. Mr. Lieberman references great thinkers like Descartes, Thomas Aquinas and Plato and compares Eastern and Western religions, saying that while we would "like to think of our beliefs as stemming from some combination of logical analysis and peer influence," they more likely come from genetic roots. This has been seen recently in multiple studies and Lieberman points to "Baldwin Way, a postdoctoral fellow in my lab at UCLA, [who] has recently come across a key genetic difference between individuals of Eastern and Western descent that differentially affects their brains."

Religion and science are usually subjects that get along as well as oil and water, but that does not stop these scientists from tackling them. Evolution and the big bang theory are both discussed at length from differing perspectives in Sean Carroll's "Our Place in an Unnatural Universe" and in Nick Bostrom's "How to Enhance Human Beings."

"Medical science is difficult," writes Mr. Bostrom. "We know this because, despite our best efforts, it often fails. Yet medicine typically aims merely to fix something that's broken. Human enhancement, by contrast, aims to take a system that's not broken and make it better — in many ways a more ambitious goal." He discusses enhancement to give people more mental energy, to increase DNA repair activity in cells, and improve concentration.

Whether scientists should even be making these types of changes is also called into question; the need to make ethical decisions in science is not uncommon, but Sam Cooke asks in his essay "Memory Enhancement, Memory Erasure: The Future of Our Past" whether scientists should be the ones making them. "Some may argue that it is not the role of scientists to make ethical judgments about the potential impact of their work — that such decisions are the job of the government, or the electorate, who should decide which scientific research is funded by public money and which is not."

Nonetheless, Joshua D. Greene believes there is a science to making moral and ethical decisions. In his essay "Fruit Flies of the Moral Mind," he discusses the "complex interplay between intuitive emotional responses and more effortful cognitive processes" involved with making moral judgments.

"People sometimes ask me why I bother with these bizarre hypothetical dilemmas," says Mr. Greene. "Shouldn't we be studying real moral decision making instead? To me, these dilemmas are like a geneticist's fruit flies. They're manageable enough to play around with in the lab but complex enough to capture something interesting about the wider and wilder world outside." An interesting way to view moral dilemmas; it therefore should not be a surprise that Mr. Greene ends the essay wondering if we can ever "transcend the limitations of our moral instincts." This is especially intriguing after reading Christian Keysers' "Mirror Neurons: Are We Ethical By Nature?" and his remark that the "brain is ethical by design."

Story Continues →

http://edge.org/3rd_culture/bios/brockmanm.html [6.30.09]

Written by Sarah Boslaugh 

Each essay is self-contained, making it possible to choose those most relevant to your own interests


If your favorite day of the week is Tuesday, because that's when the Science section of The New York Times is published, and your favorite NPR show is Ira Flatow's Science Friday, then you'll love What's Next? Dispatches on the Future of Science, a collection of essays written by young scientists about what they do and how they see the future of their fields. Even if you're not quite that much of a science geek, if you have an interest in the world around you and the process by which scientific research can both explain and mold that world, you'll enjoy this collection edited by Max Brockman. No expertise in any field is required to understand these essays; if you can follow Malcolm Gladwell, you'll have no trouble with What's Next?

Brockman's essayists represent a variety of fields, from physics to paleoanthropology, with a heavy leaning toward the human sciences. This is a good choice from the marketing point of view, since non-scientists tend to be more interested in topics relating to human psychology than, say, the role played by dark energy in accelerating the expansion of the universe, but fans of hard science may feel slighted. That objection aside, this is the perfect collection for people who like to stay up on recent scientific research but haven't the time or expertise to go to the original sources (which, in the case of modern science, usually means articles published in professional journals, which are not generally available to those without access to an academic library).

Each essay is self-contained, making it possible to choose those most relevant to your own interests. And it's a great airplane or beach book because you can read the essays in any order; each is brief enough to be read between the interruptions of gate announcements or children demanding attention. My personal favorite is "What Makes Big Ideas Sticky?" by UCLA psychologist Matthew Lieberman, which argues that ideas which mirror the structure and function of the human brain may seem so obviously true to us that they resist being discarded, even in the face of overwhelming amounts of scientific research demonstrating their lack of merit.

The collection closes with an essay by NASA climatologist Gavin Schmidt entitled "Why hasn't specialization led to the Balkanization of science?" He argues that, in contradiction to the stereotype of the scientist as someone who knows more and more about less and less, interdisciplinary research is central to modern science, and describes both the factors which lead to greater isolation among fields of research and those which encourage cooperation and sharing of ideas. Communication of major ideas in nontechnical language is one of the factors which encourages cooperation, and What's Next? represents an important contribution to that effort.

Sarah Boslaugh

256 pages. $14.95 (paperback)


http://www.veryshortlist.com/vsl/daily.cfm/review/1272/Current_cinema/whats-next... [6.16.09]




What's Next? Dispatches
on the Future of Science

Sure, we often hear from the prominent, popular scientists of today: Steven Pinker, Richard Dawkins, E. O. Wilson. But what about the next generation? Who are they, and what are they thinking about? The answers can be found in the engrossing essay collection What’s Next? Dispatches on the Future of Science, which offers a youthful spin on some of the most pressing scientific issues of today—and tomorrow.

Take, for example, Laurence C. Smith's essay on global warming. Instead of rehashing the debate, Smith wonders about the possibility of people migrating to the Northern Rim as temperatures rise and inhospitable environments become more livable. Associate professor of physics Stephon H. S. Alexander tackles dark energy; other pieces address memory, morality, why viruses matter and the inevitability of human extinction. Kinda scary? Yes! Super smart and interesting? Definitely.


When your head can no longer keep up with the internet: Frank Schirrmacher's book "Payback" brings the digital debate up to date, but no further.

Nowhere in the industrialized world is the debate over the internet's influence on society conducted with as many dogmatic encrustations and ideological escalations as in Germany. The digital divide running through our country mostly follows the generational line between "digital natives" and "digital immigrants": between those who grew up with the internet and those who first encountered digital technologies as adults.

Schirrmacher's strength is to combine an intellectual thirst for knowledge with the hunting instincts of a tabloid journalist. (© Foto: dpa)

Yet the topic has long outgrown the petty quarrel over old and new media habits and copyright questions, or the political scaremongering about shooting games and child pornography, to which digital debates in Germany usually come down. The new book by FAZ publisher and feuilletonist Frank Schirrmacher, "Payback" (Blessing Verlag, Munich, 2009, 240 pages, 17.95 euros), finally enriches the debate with some intelligent ideas, even if its subtitle, "Why in the information age we are forced to do what we do not want to do, and how we can regain control over our thinking," at first sounds like the usual mixture of cultural pessimism and self-help.

The subtitle should not be underestimated, though. Schirrmacher's journalistic strength is to combine the intellectual curiosity of a polymath with the hunting instincts of a tabloid journalist. That makes competing with him so sporting, and it makes his books and debate-openers precision landings in the zeitgeist. That he often plays on fears in doing so, such as the fear of an aging society in his bestseller "Das Methusalem-Komplott" or the fear of social uprooting in "Minimum," is owed to that tabloid instinct, which detects such anxieties early and puts them in context.

The pressure of social obligations

"Payback," too, sells itself as a companion to current anxieties. Schirrmacher takes up the feeling of digital overload that is spreading not only in Germany but in every digitized country, for over the past two years the triumphal advance of three digital technologies has pushed the limits of our digital receptiveness.

First there was the iPhone with its now roughly 20,000 "apps," programs that turn the Apple phone into a supercomputer. Then the social network Facebook raised the pressure of social obligations on the net beyond all measure. And finally the short-message service Twitter opened the floodgates to an information deluge that can be managed only with a battery of helper programs. Europe and America have long since produced countless articles and books on this sense of overload.

"My head can't keep up anymore" is, fittingly, the title of the first chapter of "Payback." There Schirrmacher describes, standing in for many, the very personal cognitive crisis into which the digital flood of data has plunged him. He feels like an air-traffic controller, always trying to avert a collision, always worried that he has overlooked the crucial thing. This opening is no more than a lasso throw, for ultimately it uses that moment of identification only to introduce the first of the book's two parts. And there, more is at stake.

Read on page 2 how the second part of "Payback" continues.

http://edge.org/3rd_culture/brockman_next09/NS_WhatNext.pdf [6.8.09]

Fallout from the amazing advance in neuroscience dominates this fascinating foray into the future

By Amanda Gefter

FOR PROPHETIC visions of the future, some people turn to horoscopes or fortune tellers. But if you really want to know what the future holds, ask a scientist.

Not just a renowned, seasoned scientist, but a fresh mind, someone who is asking themselves the questions that will define the next generation of scientific thought.

That's precisely what Max Brockman has done in this captivating collection of essays, written by "rising stars in their respective disciplines: those who, in their research, are tackling some of science's toughest questions and raising new ones".

The result is a medley of big ideas on topics ranging from cosmology and climate change, to morality and cognitive enhancement.

The collection is diverse, but one theme resounds: when it comes to the human race, the whole is greater than the sum of its parts. We owe our evolutionary success to our unique modes of social behaviour.

Social species

In their essay "Out of our minds", journalist Vanessa Woods and anthropologist Brian Hare suggest that it wasn't intelligence that led to social behaviour, but rather social behaviour that paved the way for the evolution of human intelligence. "Humans got their smarts only because we got friendlier first," they write.

We are a social species, and we have our brains to thank. As Harvard University neuroscientist Jason Mitchell writes: "The most dramatic innovation introduced with the rollout of our species is not the prowess of individual minds, but the ability to harness that power across many individuals."

Language allows us to do this in an unprecedented way — it serves as a vehicle for transferring one's own mental states into another's mind. Lera Boroditsky — a professor of psychology, neuroscience and symbolic systems at Stanford University — has an interesting piece about the ways in which our native language shapes the way we think about such basic categories as space, time and colour. ...

KOREA TIMES [4.9.09]

Translated from English to Korean by Jang Seok-bong and Kim Dae-yeon; Galleon; 563pp., 19,800 won

From global warming to economic crises, things seem to be turning for the worse. At this time of pessimism prevailing over optimism, the world needs some antidotes to this epidemic of negative views. But what's out there to be positive about?

This is the question that the author asked 160 scholars and scientific thinkers. John Brockman, the founder of Edge, the influential online salon, compiled their answers in this book.

Nobel laureates, Pulitzer Prize winners, Harvard professors and other world-class thinkers laid bare their minds about what they're positive about. They are neither blindly nor naively optimistic. Their optimism is based on logical, professional views and insight.

Topics are wide-ranging, from physics and medicine to education and religion or the end of the world. They illustrate diverse sides of the world's future and why they're optimistic about it.

These great thinkers also present tasks that we should tackle to make a better world and this book may help change readers' perceptions of the future of mankind in a more positive way.


THE MAUI NEWS [4.9.09]

While Christians this week observe their holiest week, it might be a good time to contemplate whether their religion is a matter of faith.

There is a faction in the spectrum of Christian believers that persists in seeking to have the government mandate that their beliefs are a science.

Labeled at one time as creationism, it's reincarnated as intelligent design, with proponents insisting that it should be included in public school curriculum alongside Darwin's thesis in "On the Origin of Species."

Numerous court decisions have ruled that efforts to introduce Bible-based curricula on a God-created universe amount to an unconstitutional introduction of state-sponsored religion. Still, advocates continue to pursue mandates to add their "theory" of creation to public school curriculum.

Over the past year, legislative proposals have been offered in Alabama, Florida, Michigan, Missouri, South Carolina and Louisiana seeking to require curricula that challenge the theory of evolution in the interest of critical analysis and academic freedom.

The persistent effort to cast personal faith in God as a science suggests that proponents of intelligent design either don't understand what faith is or lack it. Biblical exhortations to faith occurred when Jesus Christ faced disciples who appealed to him to intercede in a storm at sea (Matthew 8:26): "And he saith to them, Why are ye fearful, O ye of little faith? Then he arose and rebuked the winds and the sea; and there was a great calm."

Over the past century, since Tennessee's Butler Act ban on teaching of evolution as scientific theory was ruled unconstitutional, the effort to inject some other form of Christian belief into science instruction has continued.

It has been a contest of separation of church and state, almost unique to the United States and its Constitution.

Those involved in the church should see it as a separation of faith and scientific theory. To force Christian belief into a science curriculum is to reduce Christianity to a scientific theory that has not been proved.

Intelligent designers claim the theory has been proved, although their hypothesis and proof are a self-fulfilling circular argument. It assumes that complex systems must be designed. The universe is complex, ergo, there must be a designer of the universe.

A counter hypothesis would be that a complex system is a series of anomalies that evolve into patterns occurring as a matter of chance. The universe is a complex system made up of anomalies that have evolved into patterns. Therefore the universe is a matter of chance.

Physicists and cosmologists conduct observations and experiments to test the validity of assumptions about the formation of the universe. To the extent that the evidence of quantum mechanics doesn't align with predictions, even Einstein's general theory of relativity remains theoretical. But a flaw in one theory doesn't prove the validity of another, assuming the Christian God (who also happens to be the Jewish and Islamic God of Abraham) is only theoretical.

Religious belief and science evolved from the same element in the human psyche that needs to explain what we are and what is happening in the world we see. Long before Abraham, tribal shamans were creating versions of gods to explain the behavior of plants, animals, Earth's atmosphere, sun, moon and the stars. Forecasts of natural phenomena were based on observations, and those who were more observant of natural cycles were more successful in guiding their tribes.

That is still how science works, even as the technology for observing and analyzing natural phenomena has grown to a high level of sophistication.

It is not how religion works. Faith is a sense of human spirituality that does not rely wholly on empirical observations. It relies on a cognitive element not evident in other animals, but one that is biologically based, according to Marc Hauser, Harvard professor of psychology and biological anthropology ("Moral Minds: How nature designed our universal sense of right and wrong," HarperCollins, 2006).

Hauser says a human's moral sense results from a human's ability "to foresee future rewards" in making decisions about how to behave toward another human being. Religious beliefs are not a deciding factor in moral behavior, Hauser said. Rather, he said, moral decisions are based on the ability of the person to forecast an outcome.

Religion and science also forecast outcomes, but one relies on faith, the other on testable concepts.

University of Chicago ecology professor Jerry Coyne notes that the elements of scientific inquiry include having testable ideas and relying on evidence in testing a theory (www.edge.org, "Must we always cater to the faithful when teaching science?").

The presence of God is not a testable idea, unless the faithful accept that God is only a theory.

Proponents of intelligent design appear to be fearful that individuals cannot exercise faith while they engage in scientific study. Matthew 8:26 offers: "Why are ye fearful, Oh ye of little faith?"

* Edwin Tanji is a former city editor of The Maui News. He can be reached at [email protected]. "Haku Mo'olelo," "writing stories," is about stories that are being written or have been written. It appears every Friday.


Few literary phrases have had as enduring an after­life as “the two cultures,” coined by C. P. Snow to describe what he saw as a dangerous schism between science and literary life. Yet few people actually seem to read Snow’s book bearing that title. Why bother when its main point appears so evident?

Jack Manning/The New York Times

C. P. Snow in 1969.

It was 50 years ago this May that Snow, an English physicist, civil servant and novelist, delivered a lecture at Cambridge called “The Two Cultures and the Scientific Revolution,” which was later published in book form. Snow’s famous lament was that “the intellectual life of the whole of Western society is increasingly being split into two polar groups,” consisting of scientists on the one hand and literary scholars on the other. Snow largely blamed literary types for this “gulf of mutual incomprehension.” These intellectuals, Snow asserted, were shamefully unembarrassed about not grasping, say, the second law of thermodynamics — even though asking if someone knows it, he writes, “is about the scientific equivalent of: Have you read a work of Shakespeare’s?”

In the half-century since, "the two cultures" has become a "bumper-sticker phrase," as NASA's administrator, Michael Griffin, said in a 2007 speech. (Naturally, as a scientist, Griffin also declared that Snow had hit on an "essential truth.") And Snow has certainly been enlisted in some unlikely causes. Writing in Newsweek in 1998, Robert Samuelson warned that our inability to take the Y2K computer bug more seriously "may be the ultimate vindication" of Snow's thesis. (It wasn't.) Some prominent voices in academia have also refashioned his complaint. "We live in a society, and dare I say a university, where few would admit — and none would admit proudly — to not having read any plays by Shakespeare," Lawrence Summers proclaimed in his 2001 inaugural address as president of Harvard, adding that "it is all too common and all too acceptable not to know a gene from a chromosome." This is Snow for the DNA age, complete with a frosty reception from the faculty.

There is nothing wrong with referring to Snow’s idea, of course. His view that education should not be too specialized remains broadly persuasive. But it is misleading to imagine Snow as the eagle-eyed anthropologist of a fractured intelligentsia, rather than an evangelist of our technological future. The deeper point of “The Two Cultures” is not that we have two cultures. It is that science, above all, will keep us prosperous and secure. Snow’s expression of this optimism is dated, yet his thoughts about progress are more relevant today than his cultural typologies.

After all, Snow’s descriptions of the two cultures are not exactly subtle. Scientists, he asserts, have “the future in their bones,” while “the traditional culture responds by wishing the future did not exist.” Scientists, he adds, are morally “the soundest group of intellectuals we have,” while literary ethics are more suspect. Literary culture has “temporary periods” of moral failure, he argues, quoting a scientist friend who mentions the fascist proclivities of Ezra PoundWilliam Butler Yeats and Wyndham Lewis, and asks, “Didn’t the influence of all they represent bring Auschwitz that much nearer?” While Snow says those examples are “not to be taken as representative of all writers,” the implication of his partial defense is clear.

Snow’s essay provoked a roaring, ad hominem response from the Cambridge critic F. R. Leavis — who called Snow “intellectually as undistinguished as it is possible to be” — and a more measured one from Lionel Trilling, who nonetheless thought Snow had produced “a book which is mistaken in a very large way indeed.” Snow’s cultural tribalism, Trilling argued, impaired the “possibility of rational discourse.”

Today, others believe science now addresses the human condition in ways Snow did not anticipate. For the past two decades, the editor and agent John Brockman has promoted the notion of a “third culture” to describe scientists — notably evolutionary biologists, psychologists and neuroscientists — who are “rendering visible the deeper meanings in our lives” and superseding literary artists in their ability to “shape the thoughts of their generation.” Snow himself suggested in the 1960s that social scientists could form a “third culture.”

So why did Snow think the supposed gulf between the two cultures was such a problem? Because, he argues in the latter half of his essay, it leads many capable minds to ignore science as a vocation, which prevents us from solving the world’s “main issue,” the wealth gap caused by industrialization, which threatens global stability. “This disparity between the rich and the poor has been noticed . . . most acutely and not unnaturally, by the poor,” Snow explains, adding: “It won’t last for long. Whatever else in the world we know survives to the year 2000, that won’t.” (For some reason, Y2K predictions and Snow did not mix well.) Thus Snow, whose service in World War II involved giving scientists overseas assignments, recommends dispatching a corps of technologists to industrialize the third world.

This brings “The Two Cultures” to its ultimate concern, which has less to do with intellectual life than with geopolitics. If the democracies don’t modernize undeveloped countries, Snow argues, “the Communist countries will,” leaving the West “an enclave in a different world.” Only by erasing the gap between the two cultures can we ensure wealth and self-government, he writes, adding, “We have very little time.”

Some of this sounds familiar; for decades we have regarded science as crucial to global competitiveness, an idea invoked as recently as in Barack Obama’s campaign. But in other ways “The Two Cultures” remains irretrievably a cold war document. The path to industrialization that Snow envisions follows W. W. Rostow’s “take-off into sustained growth,” part of 1950s modernization theory holding that all countries could follow the same trajectory of development. The invocation of popular revolution is similarly date-stamped in the era of decolonization, as is the untroubled embrace of ­government-dictated growth. “The scale of the operation is such that it would have to be a national one,” Snow writes. “Private industry, even the biggest private industry, can’t touch it, and in no sense is it a fair business risk.”

This is, I think, why Snow’s diagnosis remains popular while his remedy is ignored. We have spent recent decades convincing ourselves that technological progress occurs in unpredictable entrepreneurial floods, allowing us to surf the waves of creative destruction. In this light, a fussy British technocrat touting a massive government aid project appears distinctly uncool.

Yet “The Two Cultures” actually embodies one of the deepest tensions in our ideas about progress. Snow, too, wants to believe the sheer force of science cannot be restrained, that it will change the world — for the better — without a heavy guiding hand. The Industrial Revolution, he writes, occurred “without anyone,” including intellectuals, “noticing what was happening.” But at the same time, he argues that 20th-century progress was being stymied by the indifference of poets and novelists. That’s why he wrote “The Two Cultures.” So which is it? Is science an irrepressible agent of change, or does it need top-down direction?

This question is the aspect of “The Two Cultures” that speaks most directly to us today. Your answer — and many different ones are possible — probably determines how widely and deeply you think we need to spread scientific knowledge. Do we need to produce more scientists and engineers to fight climate change? How should they be deployed? Do we need broader public understanding of the issue to support governmental action? Or do we need something else?

Snow’s own version of this call for action, I believe, finally undercuts his claims. “The Two Cultures” initially asserts the moral distinctiveness of scientists, but ends with a plea for enlisting science to halt the spread of Communism — a concern that was hardly limited to those with a scientific habit of mind. The separateness of his two cultures is a very slippery thing. For all the book’s continuing interest, we should spend less time merely citing “The Two Cultures,” and more time genuinely reconsidering it.

Peter Dizikes is a science journalist based in Boston.

Short takes on three books

WHAT HAVE YOU CHANGED YOUR MIND ABOUT? Today's Leading Minds Rethink Everything. Edited by John Brockman. Harper Perennial, $14.95, paper.

...Last year's question, "What have you changed your mind about?," brought a typically brilliant array of brief essays, by turns provocative, playful and profound. Brockman has collected them into a volume with the question as its title.

In one of the essays, MIT quantum-mechanical engineer Seth Lloyd describes how his students have given him a new appreciation of technology. In another, mathematician Keith Devlin explains his growing conviction that human mathematics is peculiar to the human mind. Nature news editor Oliver Morton has abandoned his support for human spaceflight. And journalist Charles Seife, who once assumed that democracy and science shared the same ideals, now believes that the egalitarian and the skeptic are natural opponents.

These contributions are typically only two or three pages long, which makes them compulsively readable. The only disappointment is that there's no discussion among the participants—but that's what the Web site is for.—Greg Ross


H/PD — Germany [2.19.09]


Are science and religion compatible with each other? No, said evolutionary biologist Jerry Coyne, who argued the position at length at Edge.org. A debate among American intellectuals flared up around the question. The "New Atheist" Sam Harris answers it in the following essay, taking satirical aim at his fellow discussants.


Some things are beyond reason

It is a shame that people like Jerry Coyne and Daniel Dennett fail to see how easily religion and science can be reconciled. I understand how their fundamentalist rationality has blinded them and kept them from deeper truths. I have long wanted to tell these two men: "Some things are beyond reason. Far beyond!" Fortunately, George Dyson has done it for me in a brilliant essay on this website. He demolishes the intellectual pretensions of militant atheists like Coyne and Dennett in the most elegant way imaginable: by simply citing the title of a 17th-century work by the great Robert Boyle. When I was a militant neo-rationalist, I had the nagging sense that my colleagues and I had not engaged sufficiently with Boyle on the design argument and therefore risked public humiliation. Now it has happened...


The immortality of magic

On one small point I disagree with Dyson: so far he has been far too modest in spelling out the consequences of his argument. He is right, of course, to observe that "science and religion are here to stay." But magic is here to stay too, George: Africa is full of it. Is there a conflict between scientific reason and magic spells? Is there, more precisely, a conflict between the belief that epilepsy is the result of abnormal brain activity and the belief that it is a sign of demonic possession? Dogmatists like Coyne and Dennett clearly think so. Unlike Dyson, they fail to realize that the better one understands neurology, the better one understands, and appreciates, demonology. Have Coyne and Dennett read the works of accomplished magicians like Aleister Crowley or Eliphas Levi? I wouldn't bet on it. Are you wondering how mind and matter could possibly be in conflict? Answer: they can't. Excuse me, but I find it embarrassing to have to explain these things to people who are supposedly highly educated.


Scientists have no idea

Emanuel Derman admonishes neo-secular militants like Coyne and Dennett to "stop wasting their time trying to beat up the idea of God in the name of science." In the face of so comprehensive a deconstruction of their work, I assume Coyne and Dennett will be changed forever. With extraordinary patience, Derman reminds us that scientists possess no authority outside the narrow focus of the scientific worldview. Can a biologist entertain any reasoned doubts about the virgin birth of Jesus? No, because human parthenogenesis has nothing whatsoever to do with biology. Can a physicist form an informed opinion about the likelihood of the Ascension? How could he? Bodily translocation to heaven requires no interaction with the forces of nature. Can either a biologist or a physicist realistically doubt the coming resurrection of the dead? Many have tried; none has succeeded. (Please bear in mind that any mention of "entropy" in this context is mere showing off.) As Derman recognizes, it is the purest arrogance that has led atheist scientists to overreach so badly.


Attractive men prove creation

This exchange at Edge has been a feast for the mind! Consider Lisa Randall's moving account of her plane trip in the company of a "delightful young actor" who simply knew in his heart that our species did not descend from ape-like ancestors but from the biblical Adam. I implore readers to dwell on these points, for Randall's prose is compressed nearly to the Planck scale. Just imagine what it must have been like to find oneself at 30,000 feet beside a man who has studied molecular biology at the university level. Now consider that this prodigy is both an actor by profession and an enthusiastic supporter of Barack Obama. And finally, realize that this stranger at your side regards evolution as nothing more than a vicious piece of secular propaganda. I can vaguely imagine how Coyne and Dennett must have felt on reading Randall's story up to this point.


Logic is overrated

But Randall digs deeper:

"Science, based on experience and logical inference, and faith are two entirely different methods of approaching the truth. You can identify a contradiction only if your rules are logical. If you believe in revealed truth, you have left the rules behind. There is, try as you might, no contradiction."

I am confident that Randall's airborne adventure marks a turning point in our intellectual discourse. Not only has she dissolved all contradictions between science and religion (and magic, UFO cults, astrology, tarot, palmistry, etc.), she has also reconciled seemingly contradictory religions with one another. Hindus worship a multitude of gods; Muslims acknowledge the existence of only one, and believe polytheism to be a capital offense. Are Hinduism and Islam in conflict with each other? Only "if your rules are logical." Just as paths leading up a mountainside may look discrepant at the base, once we stand on the summit we find that all routes have led to the same destination; so it will be with every application of the human mind! The summit of truth awaits you, my friends. Simply choose your path...


Protect religious feelings

And yet more remains to be said against people like Coyne, Dennett and Dawkins (he's the worst!). Patrick Bateson informs us that it is "staggeringly insensitive" to undermine the religious convictions of people who find those convictions comforting. I agree completely. Just one example: in Afghanistan and Pakistan it is now common practice to blind and disfigure little girls with acid for the crime of having gone to school. When I was a neo-fundamentalist rational neo-atheist, I was in the habit of criticizing such behavior as an especially shameful sign of religious stupidity. I now see, belatedly and to my great embarrassment, that I knew nothing of the pain a devout Muslim man might feel at the sight of young women learning to read. Who am I to criticize the public expression of his faith? Bateson is right. Belief in the infallibility of the holy Koran is clearly indispensable to these struggling people.


Best not to talk about it at all

Why can't a militant secularist-atheist neo-dogmatist like Coyne see the naked truth? There simply IS no conflict between religion and science. And even if there were, it would be an incredible waste of time to say anything about it. Lawrence Krauss has established this second point beyond all doubt. Go back and read his essay. It will cost you only five seconds. I have read it through 70 times, and every pass yields fresh insights.

Finally: enter Kenneth Miller, in the role of the true believer, defending his work against Coyne's inexpert reading:


God's will is inscrutable

"I have by no means argued that this happy confluence of natural events and physical constants somehow proves the existence of God, only that it could be understood or interpreted by a person of faith as consistent with the divine."

That is exactly how one must approach a neo-militant rationalist like Coyne. These people are simply obsessed with finding the best explanation for the patterns we observe in the natural world. But faith teaches us that the best, alas, is often the enemy of the good. For instance, people like Coyne ask whether the facts that viruses are ten times as abundant as animals, and that a single virus like influenza killed 500 million human beings in the 20th century (many of them children), are best explained by an omniscient, omnipotent, all-good God who considers humanity his most beloved creation. Wrong question, Coyne! You see, the wise have learned to ask, as Miller does, whether it is merely possible, given the facts, that a mysterious God with an inscrutable will could have created the world. Of course it is! And the heart rejoices...


God likes processed cheese

Of course, one must not take this exalted inquiry too far. Some have raised the question of whether it is possible that a mysterious God with an inscrutable will works only on Tuesdays, or whether He is especially fond of processed cheese. There is no doubt that such revelations are also possible, and may be imminent. But they contribute nothing to joy, chastity, homophobia or other earthly values, and that, after all, is the point. Men like Coyne and Dennett overlook these theological nuances. Indeed, one may fear they were born to overlook precisely these nuances.


One can, and indeed must, believe anything

Miller, on the other hand, recognizes that every scientist is free to see the world however he or she wishes: if Francis Collins, for example, wants to believe that the historical Jesus actually rose from the dead and still exists in ethereal form, rendering him keen-sighted and mildly averse to masturbation, then these convictions do not detract one bit from his stature as a scientist. A man like Dawkins, long ago exposed as a strict adherent of biological naturalism, may choose not to believe such things. That is his decision. But given his resolute denial of the risen Christ, and, indeed, of the very existence of a loving and caring Creator, Dawkins is in no position to criticize Collins's approach, because he simply has no inner insight into how fragile the scientific imagination can become once it is challenged by Christian faith.


The evil eye

Miller is especially good at separating scientific reason from every other kind of human understanding. It is crucial for the reader to grasp that science is a trade: what a scientist believes is irrelevant as long as he does his scientific work cleanly. This has long been a stumbling block for countless would-be intellectuals who imagine that science might have something to do with a comprehensive understanding of the universe, or that attention to the quantity and quality of evidence might know no boundaries. Perhaps an analogy will help: suppose a heart surgeon believes that car accidents are caused not by human carelessness, failing brakes and the like, but by the evil eye. Would that diminish his stature as a physician? Of course not, because heart surgery has nothing to do with the indiscretions between car and driver. As Miller says: "The real question is whether a scientist's view of God's existence is incompatible with his scientific work. It clearly is not." Yes, that is as plain as the rising sun. I would only add that belief in the evil eye is perfectly compatible with modern medicine, with the possible exception of ophthalmology. Some have called this the "Balkanization of epistemology." I think terms like "epistemology" are overrated. And so do most Americans.


The most profound question

At last Miller arrives at the most profound question of all:

"One can indeed embrace science in every respect and still ask a deeper question, one in which Coyne seems not to be interested: Why does science work? Why is the world around us organized in a way that makes it accessible to our powers of logic and intellect?"

I have often wondered why walking works. Why is the world constituted in such a way that we can walk around on it? And why should there be limits to our ability to move about so freely, like those imposed on us at the highest altitudes? In fact, I considered this a fitting topic for my doctoral dissertation, but was cruelly talked out of it by an unimaginative adviser. And yet I believe Miller's question goes even deeper. Men like Coyne and Dennett have clearly averted their eyes from the answer, an answer that over 90 percent of their least educated neighbors have arrived at without difficulty: the universe is intelligible to reason because the God of Abraham made it so. This God, who once displayed a taste for human sacrifice, and whose only direct communication with humanity (through the Bible, mediated by the Holy Spirit) betrays not the slightest scientific understanding, has nonetheless instilled in us the mental capacity to grasp his wonderful and terrifying cosmos in scientific terms. Why science has come to be regarded as the greatest agent of the erosion of religious faith in the world, and why science has been seen as a threat by religious people in nearly every context, is one of the last mysteries inaccessible to human analysis. If God had expected us to be able to distinguish good reasons for believing something from bad ones, I have often thought, he would have made the difference intelligible to everyone.


If it doesn't fit, ...

The universe is perfect and free of contradiction. What may appear at one level of physics or biology to be a contradiction is always reconciled by higher vibrational energies or, as Miller points out, by "miracles." Miracles, it hardly needs saying, are precisely the kinds of events that elude rational understanding and that would make anyone who seeks a comprehensive understanding of the world doubt them. That is to say: if Jesus was born of a virgin, raised the dead, was himself raised from the dead after a brief interlude, then ascended bodily into heaven and from on high maintained a steady mistrust of Jews and homosexuals for two millennia, these would be exactly the kinds of low-probability events that people like Coyne, Dennett and Dawkins would assume had never occurred. Which means that the doubts of fundamentalist-atheist, rationalist-neo-humanist secular militants actually make the miracles of Jesus' ministry more plausible than they would otherwise be. Jerry, Dan, Richard, please give this some thought.

OHMY NEWS — Korea [2.17.09]

"What game-changing scientific ideas and developments do you expect to live to see?"


Last December, following an annual tradition, the Edge Foundation asked a select group of intellectuals, researchers, artists and visionaries to reply to this brief question. Their answers, totalling 151 contributions and an estimated 107,000 words, are posted online at the Web site of the World Question Center under this year's heading: "What will change everything?"

Since 1998, the Edge Annual Question has been bringing together some of the world's finest minds to reflect on a specific matter chosen for its relevance or thought-provoking potential. It has sometimes implied a bold challenge, like 2000's question, "What is today's most important unreported story?," and at other times the exercise has been more theoretical, like last year's "What have you changed your mind about?" This year, the topic proposed for consideration is about the ultimate breakthrough that shall have a radical and permanent effect on life as we know it.

This collection of answers, which, like its most recent predecessors, will surely find its way into print in a few months, not only serves as a precise sketch of the current state of the art in future studies; above all, its separate viewpoints and differing emphases converge to weave a consistent panorama of what the near future will very probably look like.

Technological utopia

Everything will change if we work our way up to Kardashev Type I. The most optimistic respondents agreed in echoing the prophets of the Singularity, the qualitative leap in technological development that is expected to take us beyond all currently imaginable standards of innovation, productivity, efficiency and affluence.

Bacteria modified to synthesize fuel will boost our energy sustainability while giving the atmosphere much-needed relief. The mastery of fusion under controlled conditions will supply us with endless clean energy. And with affordable nanoreplication devices in each home, the manufacture of any commodity --or even food-- will become a mere pastime. The tenets of economic theory will collapse under this post-scarcity scenario.

Alas, since the Singularity lies by definition beyond the conceivable horizon, it's easier to imagine what dreams we'll fulfill once there than how we're going to reach it. For example, space exploration and colonization are inspiring prospects, and may someday prove indispensable to our survival, but the degree of progress needed for such endeavors still seems unattainable.

Some of the proposed paths toward that goal involve upgrading our computing power to the point where digital intelligence is capable not only of self-awareness and meaningful communication, but also of studying and improving itself in an accelerating feedback loop that will make it truly superhuman.

There is already a distributed community of millions of individual computers that can provide the physical infrastructure for the nascent AI. The current trend in the semantic Web is toward enabling machines to transcend the mere storage and processing of information and advance toward recognizing and making sense of it. A synthetic brain encompassing the whole world, aided by our progress in mathematics and thinking at the quantum level, won't be too far from meeting our definition of omnisapience.

New life

Everything will change if life as we know it loses its traditional connotation. A sizable portion of the academics consulted by the Edge Foundation speculated on the possibilities open to synthetic genomics, astrobiology and neuroscience. Each of these holds the key to bypassing the barriers imposed upon us by the contingencies of evolution.

Genomic medicine will offer each patient a tailor-made treatment fitted to his or her genetic profile. Artificial lifeforms will help us understand the mechanisms that brought life into existence, as well as the secrets of aging and degenerative diseases. By thus extending our lifespan, we will be freer to explore the potential of human creativity, curiosity and self-realization. Being human --indeed, being alive-- may soon need redefinition.

The simplest sample of alien life would settle at once a host of burning questions. Even an independent lineage of organisms on Earth itself would support a vision of a universe where life is welcome to arise. For us, the most immediate cultural consequence of such a discovery would be a deepened sense of brotherhood with all lifeforms, including those made in our laboratories or emerged from our electronic minds.

The boundaries between us and not-us shall gradually shift. We could engineer ourselves to be smarter, healthier, or just prettier. If we manage to overcome the first wave of prejudices, the use of embryonic cells (some even with hybrid DNA) and robotic body parts could put an end to most inherited diseases and nearly all disabilities. When brain-machine interfaces and neural modelling reach the point where the whole content of a mind can be run in a digital medium, uploads will be the ultimate release from death.

Revolution, sans the blood

For some respondents, new times will demand new manners and conventions. The trend toward global decision-making is sure to defy the presuppositions we are used to living under. Institutions and laws, traditions and standards cast in the shape of times gone, will prove irrelevant for our coming preoccupations. This change will be most evident as it influences creative solutions to ethnic conflicts, economic inequality and other social concerns. If we learn to cooperatively address our real problems instead of engaging in endless arguments over who's to blame, who must pay, or who owns which piece of the cake, everything will change.

The proposals among the Edge Foundation respondents, though varied, speak of a common sentiment: humankind as a whole entity, with an essentially good nature that survives the cruelest enmities, and whose heterogeneous elements are not a potential for chaos to be feared, but a source of power to be embraced. The risk of wishful thinking remains, but the consequences of inaction are much worse; and only coordinated effort can succeed in addressing critical issues such as nuclear proliferation, climate change, and financial instability.

Even without reaching such extremes, there are areas where a cooperative approach cannot but be beneficial. One interesting idea within this collection involves using wireless Internet to bring the best educational resources in e-book format to every remote village under the guidance of connected tutors. Simple schemes like this can have profound long-term effects.

The nature of change

Not everything needs to turn out so well. Catastrophe was another common theme in this series of essays. It may be a hurt nature taking its revenge, or a critical increase in our already unsustainable population, or an accidental nuclear detonation that sparks the next great war. The potential collapse of our industrial civilization is a real possibility we have to live with, and the authors who decided to treat this subject would prefer us not to forget it in the midst of our optimism.

Everything is changing. Or has already changed. Or won't. Or it doesn't matter. Change, as another group of authors pointed out, is in the eye of the beholder, and what "changing everything" means depends as much on our concept of "change" as on our concept of "everything." The next radical change to come may imply a redressing of the same old trends and values, or a complete reengineering of our way of life; and "everything" can mean the cultural climate of our time as well as the very fabric of existence. Change is natural, and is always occurring. And the selection made by the Edge Foundation for this year is an excellent and absorbing anthology of the best informed judgments on what is to come.

PÁGINA 12 [2.14.09]

Every year, the site Edge.org poses a question to its members and friends, the best minds at the forefront of science today. This year it was the following: "What Will Change Everything: What game-changing scientific ideas and developments do you expect to live to see?" And, as every year, Radar ran a selection of those responses — enthusiastic, hopeful, murky, skeptical, encouraging, original — from more than 150 physicists, neuroscientists, philosophers, biologists, chemists and mathematicians, among others. Read on and find out what awaits us.

By Carlos Silber 

Observe, quantify, predict, compare. Science develops and sustains itself on these four pillars, held in balance by the deductions of the scientific method. It was Galileo, 400 years ago, who finally, after so much blind faith in Aristotle and in the validity of the argument from authority, one day left his house with these four keys to enter fully into nature and follow its tracks.

If Darwin wore out the act of observation (filling his journals with hyper-detailed notes), Einstein won his fame in 1919 when his predictions (encapsulated in the theory of general relativity) coincided with the facts: observations made during a total solar eclipse had shown that light is deflected when passing near a massive body.

Prediction is often seen as the most valued scientific tool, able to quell uncertainty and allow us to act with foresight. Many use it with restraint; others abuse it. ...

Scientists dislike the role of prophet but greatly admire far-reaching vision. So when John Brockman, editor and head of Edge.org, the U.S. online agora at the forefront of science, hit upon the question with which, every year since 1998, he takes the temperature of contemporary thought, his mailbox was flooded by biologists, physicists, chemists and all kinds of intellectuals of the "third culture" with a resounding "yes, here is my answer."

"What Will Change Everything: What game-changing scientific ideas and developments do you expect to live to see?", asked Brockman this time, who received 151 bright, optimistic, pessimistic, short, long, cryptic, theoretical as well as surprisingresponses — which in this custom-Radar, are condensed below:


[ED. NOTE: The feature article includes the following contributors: Kevin Kelly, Steven Pinker, Freeman Dyson, Ian McEwan, George Dyson, Karl Sabbagh, Richard Dawkins, Zeilinger, Douglas Rushkoff, David Eagleman, Steve Nadis, Brian Eno, Craig Venter, Sherry Turkle, Marcel Kinsbourne] ...

[Spanish language original]

The 50 best blogs and sites; Web / VN Favorites
VRIJ NEDERLAND — Netherlands [2.13.09]


A fantastic site, the biggest online breeding ground where the best minds of the U.S. discuss anything and everything. Each year, editor, publisher and intellectual impresario John Brockman puts a question to a variety of scientists and thinkers (the Edge Annual Question), and their answers are also published in book form, selling like hot cakes.

Read the full article →