Edge in the News

SPIEGEL ONLINE [8.12.09]

Andrian Kreye reports on the latest developments from the brave new world of genome sequencing: "On the last weekend of July, Craig Venter and George Church met in Los Angeles to lead a seminar on synthetic genomics for John Brockman's science forum Edge.org. Genetic engineering, Church said, has long since left computer science behind and is now improving by a factor of ten per year. And indeed: the price of sequencing a genome has fallen from three billion dollars in 2000 to around 50,000 dollars, as Stanford University engineer Dr. Stephen Quake announced this week. Seventeen commercial firms are already offering their services."

NEW SCIENTIST [8.6.09]

The Genome Wager

In the spirit of the famous scientific wagers made by the likes of Stephen Hawking and Richard Feynman, two leading biologists, Professor Lewis Wolpert and Dr Rupert Sheldrake, have set up a wager on the predictive value of the genome.

The wager will be decided on May 1, 2029, and if the outcome is not obvious, the Royal Society, the world’s most venerable scientific organization, will be asked to adjudicate. The winner will receive a case of fine port, Quinta do Vesuvio, 2005, which should have reached perfect maturity by 2029 and is being stored in the cellars of The Wine Society.

Prof Wolpert bets that the following will happen. Dr Sheldrake bets it will not:

By May 1, 2029, given the genome of a fertilized egg of an animal or plant, we will be able to predict in at least one case all the details of the organism that develops from it, including any abnormalities.

Prof Wolpert and Dr Sheldrake agree that at present, given the genome of an egg, no one can predict the way an embryo will develop. The wager arose from a debate on the nature of life between Wolpert and Sheldrake at the 2009 Cambridge University Science Festival.

[ED. NOTE: This wager began with the replies by Wolpert and Sheldrake to the Edge Question Center 2009.]

THE NEW YORK TIMES — TIERNEY LAB [8.2.09]

There is a growing consensus (at least in Silicon Valley) that the information age is about to give way to the era of synthetic genetics. That was underscored recently when Harvard geneticist George Church and J. Craig Venter — famed for the race to decode the human genome — gave lectures before a small group of scientists, technologists, entrepreneurs, and writers in West Hollywood.

The event, billed as “A Short Course on Synthetic Genomics,” was organized by John Brockman, the literary impresario (and book agent for several New York Times reporters, including this one) who publishes the cybersalon-style website www.edge.org, a forum dedicated to scientists (many of whom are his clients) and their ideas.

In roughly six hours of lectures, both scientists tried to convey how the world will be changed by the ability to routinely read genetic sequences into computing systems and then store, replicate, alter and insert them back into living cells.

The rate at which this technology is now improving puts silicon to shame. Dr. Church noted that between 1970 and 2005 gene sequencing had improved at a Moore's Law pace, about 1.5 times per year. Since then it has improved at the rate of an order of magnitude, or ten times, annually.

In the process the cost of sequencing the human genome has plunged from $3 billion to $5,000 and continues to fall. Dr. Church identified 17 companies and one "open source" project all pursuing different technologies to further push down cost and speed up the pace of sequencing.
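Those two rates compound very differently, and a quick back-of-the-envelope sketch shows how. The figures are the ones quoted above; the arithmetic, and the choice of 2001 as a starting point, are ours, purely for illustration:

cost = 3e9  # roughly $3 billion, the cost of sequencing the first human genome

for year in range(2001, 2010):
    rate = 1.5 if year < 2005 else 10.0  # Moore's Law pace, then 10x per year
    cost /= rate
    print(f"after {year}: ~${cost:,.0f}")

# At 1.5x per year the cost falls by about a third annually; at 10x per year
# it sheds a full digit annually, landing near $5,900 by the end of the decade,
# consistent with the plunge from $3 billion toward $5,000 described above.

In other words, at an order of magnitude per year, the last five years do almost all of the work.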

As a consequence, the structure of the emerging synthetic genetics industry is beginning to mirror that of the semiconductor and computer industries, which are based on modular components and design tools.

The key to the computer industry's vast growth was laid during the 1970s, when physicist Carver Mead helped give the industry a standard design approach based on modular components. Now that appears to be happening in the synthetic biology world as well.

For someone who has spent the past three decades writing about computing, hearing Dr. Venter's talk was an eye-opening experience.

“I view DNA as an analog information system,” he said, “and I hope to convince you in fact that it is absolutely the software of life.”

http://www.washingtontimes.com/news/2009/jul/19/books-whats-next-dispatches-futu... [7.18.09]

WHAT'S NEXT: DISPATCHES ON THE FUTURE OF SCIENCE
Edited by Max Brockman
Vintage, $15, 256 pages
REVIEWED BY JULIE ROBISON

People's exposure to the world of science is too often limited to watching the Discovery Channel or "reading" National Geographic. But the essence of science is not only what is happening today, but what could happen tomorrow. "What's Next? Dispatches on the Future of Science" is a book of science essays collected and edited by Max Brockman. It boasts that the authors of the 18 original essays that make up this book come from a "new generation of scientists" and are the future of science.

The essays cover a range of topics. In "Will We Decamp for the Northern Rim?," Laurence C. Smith writes that the world can't escape global warming, regardless of policy changes. Stephon H.S. Alexander discusses dark matter and vacuum energy in "Just What Is Dark Energy?" Vanessa Woods and Brian Hare's "Out of Our Minds: How Did Homo sapiens Come Down From the Trees, and Why Did No One Follow?" notes that the theory of evolution and its relation to humans is still a work in progress.

In his essay, "Watching Minds Interact," Jason P. Mitchell argues that humans are superior because "natural selection has equipped us with an adaptation more fearsome than teeth or claws: the human brain." He reports how neuroscience has begun to show "how exquisitely sensitive our minds are to the goings-on of the minds around us by suggesting that our brains spontaneously mirror the pattern of activity of other brains in our vicinity." This is important because it means we're social beings; "our brains prefer to be in register with the brains around us."

In tandem, Matthew D. Lieberman's "What Makes Big Ideas Sticky?" explores how minds relate to one another. Mr. Lieberman references great thinkers like Descartes, Thomas Aquinas and Plato and compares Eastern and Western religions, saying that while we would "like to think of our beliefs as stemming from some combination of logical analysis and peer influence," they more likely come from genetic roots. This has been seen recently in multiple studies and Lieberman points to "Baldwin Way, a postdoctoral fellow in my lab at UCLA, [who] has recently come across a key genetic difference between individuals of Eastern and Western descent that differentially affects their brains."

Religion and science are usually subjects that get along as well as water and oil, but that does not stop these scientists from tackling them. Evolution and the big bang theory are both discussed at length, from differing perspectives, in Sean Carroll's "Our Place in an Unnatural Universe" and in Nick Bostrom's "How to Enhance Human Beings."

"Medical science is difficult," writes Mr. Bostrom. "We know this because, despite our best efforts, it often fails. Yet medicine typically aims merely to fix something that's broken. Human enhancement, by contrast, aims to take a system that's not broken and make it better — in many ways a more ambitious goal." He discusses enhancement to give people more mental energy, to increase DNA repair activity in cells, and improve concentration.

Whether scientists should even be making these types of changes is also called into question; the need to make ethical decisions in science is not uncommon, but Sam Cooke asks in his essay "Memory Enhancement, Memory Erasure: The Future of Our Past" whether scientists should make them. "Some may argue that it is not the role of scientists to make ethical judgments about the potential impact of their work — that such decisions are the job of the government, or the electorate, who should decide which scientific research is funded by public money and which is not."

Nonetheless, Joshua D. Greene believes there is a science to making moral and ethical decisions. In his essay "Fruit Flies of the Moral Mind," he discusses the "complex interplay between intuitive emotional responses and more effortful cognitive processes" involved with making moral judgments.

"People sometimes ask me why I bother with these bizarre hypothetical dilemmas," says Mr. Greene. "Shouldn't we be studying real moral decision making instead? To me, these dilemmas are like a geneticist's fruit flies. They're manageable enough to play around with in the lab but complex enough to capture something interesting about the wider and wilder world outside." An interesting way to view moral dilemmas; it therefore should not be a surprise that Mr. Greene ends the essay wondering if we can ever "transcend the limitations of our moral instincts." This is especially intriguing after reading Christian Keysers' "Mirror Neurons: Are We Ethical By Nature?" and his remark that the "brain is ethical by design."

Story Continues →

http://latimesblogs.latimes.com/jacketcopy/2009/06/does-language-shape-our-think... [7.17.09]

An essay on how language influences thought from the pop-science anthology "What's Next: Dispatches on the Future of Science" has been posted on The Edge. Author Lera Boroditsky, an assistant professor of psychology, neuroscience and symbolic systems at Stanford, writes:

Most questions of whether and how language shapes thought start with the simple observation that languages differ from one another. And a lot! Let's take a (very) hypothetical example. Suppose you want to say, "Bush read Chomsky's latest book." Let's focus on just the verb, "read." To say this sentence in English, we have to mark the verb for tense; in this case, we have to pronounce it like "red" and not like "reed." In Indonesian you need not (in fact, you can't) alter the verb to mark tense. In Russian you would have to alter the verb to indicate tense and gender. So if it was Laura Bush who did the reading, you'd use a different form of the verb than if it was George. In Russian you'd also have to include in the verb information about completion. If George read only part of the book, you'd use a different form of the verb than if he'd diligently plowed through the whole thing. In Turkish you'd have to include in the verb how you acquired this information: if you had witnessed this unlikely event with your own two eyes, you'd use one verb form, but if you had simply read or heard about it, or inferred it from something Bush said, you'd use a different verb form.
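To make the contrast concrete, the obligatory verb markings Boroditsky lists can be laid out as data. This is our toy tabulation of her example, not anything from the essay itself:

# Which distinctions each language forces a speaker to mark on the verb
# in "Bush read Chomsky's latest book", per the passage above.
obligatory_marking = {
    "English":    {"tense"},
    "Indonesian": set(),  # the verb cannot be altered to mark tense
    "Russian":    {"tense", "gender", "completion"},
    "Turkish":    {"evidentiality"},  # witnessed vs. heard or inferred
}

for language, marks in obligatory_marking.items():
    print(f"{language:<10} -> {', '.join(sorted(marks)) or 'none'}")

The same sentence, in other words, obliges each speaker to attend to a different checklist of facts before it can be uttered at all.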

She brings up experiments and other examples involving use of language and direction, time, color and gender, all of which seem to demonstrate that yes, language shapes how we think.

But my favorite is this example above. Only a linguist -- or perhaps a social scientist -- would put Chomsky in a hypothetical. 

-- Carolyn Kellogg

http://timestranscript.canadaeast.com/opinion/article/727019 [7.12.09]

By Norbert Cunningham

Hello everyone! I've a little science today, but first note that language is a communication tool; it's what allows us to relate our experiences and thoughts, all, of course, as processed by our brains.

And while we still don't understand very well how our brains work, scientists have lately been making remarkable progress, in part thanks to new technologies such as MRI scanners that allow them to observe healthy (as well as damaged) brains as they work.

Time flies

Now to connect this to language: we've all heard or used the cliché that "time flies."

Anybody over 40 has also likely remarked on how much faster time flies as you get older. It's a common observation many find puzzling: why does summer seem "endless" to a youngster, yet pass in the blink of an eye for adults who swear it was only a couple of weeks ago that they put the snow shovel away?

Does time speed up in some magical, bizarre way as we age?

Do people in traumatic events like car crashes actually witness time slowing down as they so often report?

The intuitive answer is that these are matters of perception more than reality: time itself flows like a river, sure and steady. (Only at incredibly high speeds, beyond anything most of us will ever experience, does time actually slow significantly, as Einstein showed.)

That intuition, that it is a matter of our perception, has been shown to be correct. So why do we perceive time to speed up as we age? Our everyday language, and the millions of people who comment on the fact, are not wrong: the perception is real.

Perception

I take the following from an essay titled "Brain Time" by Dr. David M. Eagleman which appears in a book called "What's Next? Dispatches on the Future of Science," edited by Max Brockman.

Dr. Eagleman is a bright young scientist who holds undergraduate degrees in literature from Rice University and Oxford University, but who obtained a doctorate in neuroscience from the Baylor College of Medicine 11 years ago. Today he is director of the Baylor College of Medicine's Laboratory for Perception and Action. The lab's long-term goal is to "understand the neural mechanisms of time perception," which in plain English means figuring out how our brains make us think time has slowed or sped up when it hasn't. It was his and his colleagues' work that allowed me to say that our intuitions are correct: people in traumatic situations do perceive time to slow, but a hair-raising experiment shows they have no extra time to react or do anything beyond what would normally be possible.

An explanation

We perceive the slow motion because time and memory are "tightly linked," says Dr. Eagleman. In such critical situations a part of our brain called the amygdala kicks into high gear and takes over most of the brain's resources. This forces a secondary memory system to do the processing, a system that can later produce flashbacks of the sort soldiers with post-traumatic stress experience. This backup memory is "stickier" than what our brains usually use to store memories, producing more vivid and clear images in our minds, with more detail. And because there are many more images, remembering the event makes it appear to last longer, just as inserting extra frames in a movie reel slows the motion down. That much is fairly certain.

Less certain, but strongly suspected by Dr. Eagleman, is that the same or a similar process is what makes time seem to speed up as we age. As we experience ever more in life, familiar patterns recur and the memories our brains store get ever more compressed. Our brain can skip or compress a lot of things we know or have already experienced because we've got the general template from the first time and it need add only new details. As a result, when we draw on our memory, it is much less vivid and detailed, having the effect of cutting some frames out of a film, which seemingly speeds time up. Children, on the other hand, are frequently having first-time experiences, encountering novel things. Their brains store that information in all its detail and richness since it is the first time. Recalling it, even decades later, we remember those "endless summers" and wonder whatever happened to them. This is not "proven" yet, but it sounds logical and fits with what is known about the sensation of time slowing when in a critical situation. It's a good tentative explanation.
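The "frames" analogy can be made concrete with a toy model. This is purely our illustration of the hypothesis as Dr. Eagleman describes it, not code from his lab: if remembered duration tracks how much novel detail the brain stores, the same clock time feels shorter once experiences become familiar.

# Toy model of the compression hypothesis: novel events are stored in
# rich detail (1.0 "frames"), familiar ones in compressed form (0.2),
# and remembered duration tracks the total detail stored.

def remembered_duration(events, familiar):
    return sum(0.2 if event in familiar else 1.0 for event in events)

childhood_summer = ["beach", "fireflies", "thunderstorm", "county fair"]
adult_summer = ["commute", "beach", "commute", "commute"]

print(remembered_duration(childhood_summer, familiar=set()))   # 4.0: all novel
print(remembered_duration(adult_summer, set(adult_summer)))    # 0.8: all familiar

On this toy accounting the adult summer, stored almost entirely from templates, replays at a fifth the length of the childhood one: the "endless summer" effect in miniature.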

Full circle

So it's back to language, which does describe what we perceive well. But time to us is not really a steady flowing river after all. It's relative, according to how our brains stored our memories. And time flies as we age . . . it seems.

The last word

Here is author and wit Douglas Adams:

"Time is an illusion. Lunchtime doubly so."

* Lex Talk! is researched and written by Times & Transcript editorial page editor Norbert Cunningham. It appears in this space every Monday.

NEWSWEEK [7.12.09]

About 10 years ago, biology entered betting season. An upstart scientist named J. Craig Venter jolted the genetics establishment by launching his own gene-sequencing outfit, funded by commercial investment, and setting off toward biology's holy grail—the human genome—on his own. It was Venter versus the old guard—old because of where they got their money (governments and trusts) and the sequencing technique they wanted to hold onto. Venter won that race, and not because he got there first. By combining the freedom of academic inquiry and commercial capital, he came up with a new way of doing science so effective that it forced the old institutions to either ramp up or play second fiddle.

With Venter's momentum, biology has continued to surge into new territory, but now he's not alone in pushing the pace. In fact, with his staff of hundreds at the J. Craig Venter Institute, he is looking dangerously like the establishment he raced past almost a decade ago. Another maverick in the stable, Harvard biologist George Church, is a titan in the academic world, tackling the major challenges of genomic-age biology with an ingenuity distinct from Venter's. Both are building on the foundation of DNA sequencing, trying to drive down the cost of decoding individual genomes and—the more radical enterprise—using their digital control of cells and DNA to design new organisms. Between them, Venter and Church direct or influence a major portion of work in both sequencing and synthetic biology, including three different commercial efforts to develop bacteria that could produce the next generation of biofuels.

There's reason to believe that Church has a decent chance of unseating Venter as biology's next wunderkind. The field of genomics is only at the beginning of its growth spurt—sequencing, it turns out, was just phase one. Far from producing answers, the sequenced genome has instead led scientists into a thicket of questions: What exactly do combinations of genetic code produce in an organism over a lifetime? If we can read the script, can we also write it? Leading science out of the genomic wilderness arguably calls for a vision more deeply imaginative than the task of the Human Genome Project, which was clearly framed and, at heart, a code-reading slog. Radical invention—the kind of out-of-left-field inspiration that makes a thinker either brilliant or totally unrealistic—is the strength of Church, as opposed to Venter, who is more of an aggregator, a connector of existing ideas and methods. The script of this new biology is largely unwritten, and just because Venter turned the first page doesn't mean that in the end his vision will prevail. "Sometimes," Church says, "it's best to be second."

The quest for ideas farther afield may be one reason Venter joined the Harvard faculty this spring—his first academic post since 1982. (Venter declined to be interviewed for this article.) He and Church are even members of the same research initiative, called Origins of Life, where they're investigating life in its most basic genetic and molecular forms. Venter's participation is a sign of just how widely applicable the high-concept work of the university could be. More than ever, over the uncarved terrain of the new biology, Venter and Church are blurring the distinction between the academic and the commercial. Steven Shapin, a Harvard historian of science, says that at this point we must "stop categorizing—and just look at what these people are doing." On top of all the daring science, Venter and Church are also conducting a "sociology experiment": "They're making up their own social roles," Shapin says, "making up themselves." All the while, Church insists that he and Venter are "not right on top of each other" but are "part of the same ecosystem," fulfilling different roles. Then again, Shapin points out, "the lion and the wildebeest are in the same ecosystem." The question is, who's the lion?

BUBBLE TROUBLE
TIMES HIGHER EDUCATION SUPPLEMENT [7.8.09]

The humanities are in the same state financial markets were in before they crashed. Assessing the growing mountain of toxic intellectual debt, Philip Gerrans considers going short on some overvalued research. ...

...The academic market is also like the financial market in another way. Stocks trade above their value, which leads to bubbles and crashes. Brain-imaging studies, for example, are a current bubble, not because they don't tell us anything about the brain, but because the claims made for them so vastly exceed the information they actually provide. As with a leveraged investment in mortgage bonds hedged by a foreign-exchange credit swap, most customers have no idea how a brain-imaging result is produced and what it is really worth. Those who do - the ones in labs using complicated statistical algorithms to map impossibly messy signals to artificial 3D models of brains - are usually very circumspect about the results. But every week we read in the science pages that brain-imaging studies prove X, where X is what the readers or columnists already believe. Women can't read maps! Men like sex! Childhood trauma affects brain development! There is an Angelina Jolie neuron! The bosses of big labs that employ hundreds of people use these studies, along with artfully placed articles about them, to get funding for future research. In a similar way, directors of mining companies raise funds on the basis of prospecting reports "leaked" to the financial press.

Consider, as an unrivalled piece of hyperbole, this statement from the website Edge.org, which aims "to arrive at the edge of the world's knowledge" by seeking out "the most complex and sophisticated minds". It is by Vilayanur S. Ramachandran, a brilliant experimental neuroscientist as well as a master publicist: "The discovery of mirror neurons in the frontal lobes of monkeys, and their potential relevance to human brain evolution ... is the single most important 'unreported' (or at least, unpublicised) story of the decade. I predict that mirror neurons will do for psychology what DNA did for biology: they will provide a unifying framework and help explain a host of mental abilities that have hitherto remained mysterious and inaccessible to experiments."

That's not very likely. Mirror neurons are neurons in the monkey premotor cortex that are active both when a monkey produces an action such as grasping, and when it observes the action. No one yet knows quite why there is an overlap in patterns of neural activity. Ramachandran would like to find out, so he has made his pitch to investors. They know he has done some beautiful experiments and he is a charismatic public performer and Edge.org regular, so we can expect the mirror neuron boom to continue for a while. ...

[ED. NOTE: Philip Gerrans writes: "So we can expect the mirror neuron boom to continue for a while". Is nine years enough time to make this point? See Ramachandran's Edge essay "Mirror Neurons and imitation learning as the driving force behind 'the great leap forward' in human evolution" published on June 1, 2000. —JB]

Read the full article →

http://edge.org/3rd_culture/bios/brockmanm.html [6.30.09]

Written by Sarah Boslaugh 

If your favorite day of the week is Tuesday, because that's when the Science section of The New York Times is published, and your favorite NPR show is Ira Flatow's Science Friday, then you'll love What's Next? Dispatches on the Future of Science, a collection of essays written by young scientists about what they do and how they see the future of their fields. Even if you're not quite that much of a science geek, if you have an interest in the world around you and the process by which scientific research can both explain and mold that world, you'll enjoy this collection edited by Max Brockman. No expertise in any field is required to understand these essays; if you can follow Malcolm Gladwell, you'll have no trouble with What's Next?

Brockman's essayists represent a variety of fields, from physics to paleoanthropology, with a heavy leaning toward the human sciences. This is a good choice from the marketing point of view, since non-scientists tend to be more interested in topics relating to human psychology than, say, the role played by dark energy in accelerating the expansion of the universe, but fans of hard science may feel slighted. That objection aside, this is the perfect collection for people who like to stay up on recent scientific research but haven't the time or expertise to go to the original sources (which, in the case of modern science, usually means articles published in professional journals, which are not generally available to those without access to an academic library).

Each essay is self-contained, making it possible to choose those most relevant to your own interests. And it's a great airplane or beach book because you can read the essays in any order; each is brief enough to be read between the interruptions of gate announcements or children demanding attention. My personal favorite is "What Makes Big Ideas Sticky?" by UCLA psychologist Matthew Lieberman, which argues that ideas which mirror the structure and function of the human brain may seem so obviously true to us that they resist being discarded, even in the face of overwhelming amounts of scientific research demonstrating their lack of merit.

The collection closes with an essay by NASA climatologist Gavin Schmidt entitled "Why hasn't specialization led to the Balkanization of science?" He argues that, contrary to the stereotype of the scientist as someone who knows more and more about less and less, interdisciplinary research is central to modern science, and he describes both the factors that lead to greater isolation among fields of research and those that encourage cooperation and the sharing of ideas. Communication of major ideas in nontechnical language is one of the factors that encourage cooperation, and What's Next? represents an important contribution to that effort.

Sarah Boslaugh

256 pages. $14.95 (paperback)

http://www.veryshortlist.com/vsl/daily.cfm/review/1272/Current_cinema/whats-next... [6.16.09]


NONFICTION
What's Next? Dispatches on the Future of Science


Sure, we often hear from the prominent, popular scientists of today: Steven Pinker, Richard Dawkins, E. O. Wilson. But what about the next generation? Who are they, and what are they thinking about? The answers can be found in the engrossing essay collection What’s Next? Dispatches on the Future of Science, which offers a youthful spin on some of the most pressing scientific issues of today—and tomorrow.

Take, for example, Laurence C. Smith's essay on global warming. Instead of rehashing the debate, Smith wonders about the possibility of people migrating to the Northern Rim as temperatures rise and inhospitable environments become more livable. Associate professor of physics Stephon H. S. Alexander tackles dark energy; other pieces address memory, morality, why viruses matter and the inevitability of human extinction. Kinda scary? Yes! Super smart and interesting? Definitely.

SUEDDEUTSCHE ZEITUNG [6.10.09]

When your head can no longer keep up with the Internet: Frank Schirrmacher's book "Payback" brings the digital debate up to date, but no further.

There is no country in the industrialized world in which the debate about the Internet's influence on society is conducted with as many dogmatic encrustations and ideological escalations as in Germany. The digital divide that runs through our country mostly follows the generational boundary between "digital natives" and "digital immigrants", that is, between those who grew up with the Internet and those who first encountered digital technologies as adults.

[Photo caption: Schirrmacher's strength is to combine an intellectual's thirst for knowledge with the hunting instincts of a tabloid journalist. Photo: dpa]

Yet the subject has long since outgrown the petty quarrel over old and new media habits and copyright questions, or the political scaremongering about killer games and child pornography, to which digital debates in Germany usually come down. The new book by FAZ publisher and feuilletonist Frank Schirrmacher, "Payback" (Blessing Verlag, Munich, 2009, 240 pages, 17.95 euros), finally enriches the debate with intelligent ideas, even if its subtitle, "Why in the information age we are forced to do what we do not want to do, and how we can regain control over our thinking", at first sounds like the usual mixture of cultural pessimism and self-help.

The subtitle should not be underestimated. Schirrmacher's journalistic strength is to combine the intellectual thirst for knowledge of a polymath with the hunting instincts of a tabloid journalist. That makes competing with him so sporting, and it makes his books and the debates he launches land squarely in the zeitgeist. That he often plays on fears in doing so, such as the fear of an aging society in his bestseller "Das Methusalem-Komplott" or the fear of social uprooting in "Minimum", is owed to his tabloid instinct, which detects such anxieties early and can set them in context.

The pressure of social obligations

"Payback", too, sells itself as a companion volume to current anxieties. Schirrmacher takes up that feeling of digital overload which is spreading not only in Germany but in every digitized country. Over the past two years, the triumphant advance of three digital technologies has pushed the limits of our capacity to absorb digital information.

First there was the iPhone with its by now roughly 20,000 "apps", programs that turn the Apple phone into a supercomputer. Then the social networking site Facebook raised the pressure of social obligations on the net beyond all measure. And finally the short-message service Twitter opened the floodgates to an information deluge that can only be managed with a palette of helper programs. In Europe and America there are by now countless articles and books addressing this sense of overload.

"My head can no longer keep up" is, fittingly, the title of the first chapter of "Payback". There Schirrmacher describes, standing in for many, the very personal cognitive crisis into which the digital flood of data has plunged him. He feels like an air-traffic controller, always straining to avoid a collision, always worried that he has overlooked the decisive thing. This opening is no more than a lasso throw, for ultimately it uses that moment of identification only to introduce the first of the book's two parts. And there, more is at stake.

Read on page 2 for how the second part of "Payback" continues.

http://edge.org/3rd_culture/brockman_next09/NS_WhatNext.pdf [6.8.09]

Fallout from the amazing advance in neuroscience dominates this fascinating foray into the future

By Amanda Gefter

FOR PROPHETIC visions of the future, some people turn to horoscopes or fortune tellers. But if you really want to know what the future holds, ask a scientist.

Not just a renowned, seasoned scientist, but a fresh mind, someone who is asking themselves the questions that will define the next generation of scientific thought.

That's precisely what Max Brockman has done in this captivating collection of essays, written by "rising stars in their respective disciplines: those who, in their research, are tackling some of science's toughest questions and raising new ones".

The result is a medley of big ideas on topics ranging from cosmology and climate change to morality and cognitive enhancement.

The collection is diverse, but one theme resounds: when it comes to the human race, the whole is greater than the sum of its parts. We owe our evolutionary success to our unique modes of social behaviour.

Social species


In their essay "Out of our minds", journalist Vanessa Woods and anthropologist Brian Hare suggest that it wasn't intelligence that led to social behaviour, but rather social behaviour that paved the way for the evolution of human intelligence. "Humans got their smarts only because we got friendlier first," they write.

We are a social species, and we have our brains to thank. As Harvard University neuroscientist Jason Mitchell writes: "The most dramatic innovation introduced with the rollout of our species is not the prowess of individual minds, but the ability to harness that power across many individuals."

Language allows us to do this in an unprecedented way — it serves as a vehicle for transferring one's own mental states into another's mind. Lera Boroditsky — a professor of psychology, neuroscience and symbolic systems at Stanford University — has an interesting piece about the ways in which our native language shapes the way we think about such basic categories as space, time and colour. ...

KOREA TIMES [4.9.09]


Translated from English to Korean by Jang Seok-bong and Kim Dae-yeon; Galleon; 563pp., 19,800 won

From global warming to economic crises, things seem to be taking a turn for the worse. At a time when pessimism prevails over optimism, the world needs some antidotes to this epidemic of negative views. But what's out there to be positive about?

This is the question that the author asked 160 scholars and scientific thinkers. John Brockman, the founder of Edge, the influential online salon, compiled their answers in this book.

Nobel Laureates, Pulitzer Prize winners, Harvard professors and other world-class thinkers laid bare their minds about what they're positive about. They are neither blindly nor naively optimistic. Their optimism is based on logical, professional views and insight.

Topics are wide-ranging, from physics and medicine to education, religion and the end of the world. The essays illustrate diverse sides of the world's future and why their authors are optimistic about it.

These great thinkers also present tasks that we should tackle to make a better world, and this book may help change readers' perceptions of the future of mankind in a more positive way.

-CHO JAE-HYON

THE MAUI NEWS [4.9.09]

While Christians this week observe their holiest week, it might be a good time to contemplate whether their religion is a matter of faith.

There is a faction in the spectrum of Christian believers that persists in seeking to have the government mandate that their beliefs are a science.

Labeled at one time as creationism, it's reincarnated as intelligent design, with proponents insisting that it should be included in public school curriculum alongside Darwin's thesis in "On the Origin of Species."

Numerous court decisions have ruled that efforts to introduce Bible-based curricula on a God-created universe amount to an unconstitutional introduction of state-sponsored religion. Still, advocates continue to pursue mandates to add their "theory" of creation to public school curriculum.

Over the past year, legislative proposals have been offered in Alabama, Florida, Michigan, Missouri, South Carolina and Louisiana seeking to require curricula that challenge the theory of evolution in the interest of critical analysis and academic freedom.

The persistent effort to cast personal faith in God as a science suggests that proponents of intelligent design either don't understand what faith is or lack it. Biblical exhortations to faith occurred when Jesus Christ faced disciples who appealed to him to intercede in a storm at sea (Matthew 8:26): "And he saith to them, Why are ye fearful, O ye of little faith? Then he arose and rebuked the winds and the sea; and there was a great calm."

Over the past century, since Tennessee's Butler Act ban on the teaching of evolution as scientific theory was ruled unconstitutional, the effort to inject some other form of Christian belief into science instruction has continued.

It has been a contest of separation of church and state, almost unique to the United States and its Constitution.

Those involved in the church should see it as a separation of faith and scientific theory. To force Christian belief into a science curriculum is to reduce Christianity to a scientific theory that has not been proved.

Intelligent designers claim the theory has been proved, although their hypothesis and proof are a self-fulfilling circular argument. It assumes that complex systems must be designed. The universe is complex, ergo, there must be a designer of the universe.

A counter hypothesis would be that a complex system is a series of anomalies that evolve into patterns occurring as a matter of chance. The universe is a complex system made up of anomalies that have evolved into patterns. Therefore the universe is a matter of chance.

Physicists and cosmologists conduct observations and experiments to test the validity of assumptions about the formation of the universe. To the extent that the evidence of quantum mechanics doesn't align with predictions, even Einstein's general theory of relativity remains theoretical. But a flaw in one theory doesn't prove the validity of another, assuming the Christian God (who also happens to be the Jewish and Islamic God of Abraham) is only theoretical.

Religious belief and science evolved from the same element in the human psyche that needs to explain what we are and what is happening in the world we see. Long before Abraham, tribal shamans were creating versions of gods to explain the behavior of plants, animals, Earth's atmosphere, the sun, the moon and the stars. Forecasts of natural phenomena were based on observations, and those who were more observant of natural cycles were more successful in guiding their tribes.

That is still how science works, even as the technologies for observing and analyzing natural phenomena have grown to a high level of sophistication.

It is not how religion works. Faith is a sense of human spirituality that does not rely wholly on empirical observations. It relies on a cognitive element not evident in other animals, but one that is biologically based, according to Marc Hauser, Harvard professor of psychology and biological anthropology ("Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong," HarperCollins, 2006).

Hauser says a human's moral sense results from the ability "to foresee future rewards" in making decisions about how to behave toward another human being. Religious beliefs are not a deciding factor in moral behavior, Hauser said. Rather, he said, moral decisions are based on the ability of the person to forecast an outcome.

Religion and science also forecast outcomes, but one relies on faith, the other on testable concepts.

University of Chicago ecology professor Jerry Coyne notes that elements of scientific inquiry include having testable ideas and relying on evidence when testing a theory (www.edge.org, "Must we always cater to the faithful when teaching science?").

The presence of God is not a testable idea, unless the faithful accept that God is only a theory.

Proponents of intelligent design appear to be fearful that individuals cannot exercise faith while they engage in scientific study. Matthew 8:26 offers: "Why are ye fearful, O ye of little faith?"

* Edwin Tanji is a former city editor of The Maui News. He can be reached at moolelo@earthlink.net. "Haku Mo'olelo," "writing stories," is about stories that are being written or have been written. It appears every Friday.

NEW YORK TIMES SUNDAY BOOK REVIEW [3.21.09]

Few literary phrases have had as enduring an afterlife as “the two cultures,” coined by C. P. Snow to describe what he saw as a dangerous schism between science and literary life. Yet few people actually seem to read Snow’s book bearing that title. Why bother when its main point appears so evident?

[Photo: C. P. Snow in 1969. Jack Manning/The New York Times]

It was 50 years ago this May that Snow, an English physicist, civil servant and novelist, delivered a lecture at Cambridge called “The Two Cultures and the Scientific Revolution,” which was later published in book form. Snow’s famous lament was that “the intellectual life of the whole of Western society is increasingly being split into two polar groups,” consisting of scientists on the one hand and literary scholars on the other. Snow largely blamed literary types for this “gulf of mutual incomprehension.” These intellectuals, Snow asserted, were shamefully unembarrassed about not grasping, say, the second law of thermodynamics — even though asking if someone knows it, he writes, “is about the scientific equivalent of: Have you read a work of Shakespeare’s?”

In the half-century since, “the two cultures” has become a “bumper-sticker phrase,” as NASA’s administrator, Michael Griffin, said in a 2007 speech. (Naturally, as a scientist, Griffin also declared that Snow had hit on an “essential truth.”) And Snow has certainly been enlisted in some unlikely causes. Writing in Newsweek in 1998, Robert Samuelson warned that our inability to take the Y2K computer bug more seriously “may be the ultimate vindication” of Snow’s thesis. (It wasn’t.) Some prominent voices in academia have also refashioned his complaint. “We live in a society, and dare I say a university, where few would admit — and none would admit proudly — to not having read any plays by Shakespeare,” Lawrence Summers proclaimed in his 2001 inaugural address as president of Harvard, adding that “it is all too common and all too acceptable not to know a gene from a chromosome.” This is Snow for the DNA age, complete with a frosty reception from the faculty.

There is nothing wrong with referring to Snow’s idea, of course. His view that education should not be too specialized remains broadly persuasive. But it is misleading to imagine Snow as the eagle-eyed anthropologist of a fractured intelligentsia, rather than an evangelist of our technological future. The deeper point of “The Two Cultures” is not that we have two cultures. It is that science, above all, will keep us prosperous and secure. Snow’s expression of this optimism is dated, yet his thoughts about progress are more relevant today than his cultural typologies.

After all, Snow’s descriptions of the two cultures are not exactly subtle. Scientists, he asserts, have “the future in their bones,” while “the traditional culture responds by wishing the future did not exist.” Scientists, he adds, are morally “the soundest group of intellectuals we have,” while literary ethics are more suspect. Literary culture has “temporary periods” of moral failure, he argues, quoting a scientist friend who mentions the fascist proclivities of Ezra Pound, William Butler Yeats and Wyndham Lewis, and asks, “Didn’t the influence of all they represent bring Auschwitz that much nearer?” While Snow says those examples are “not to be taken as representative of all writers,” the implication of his partial defense is clear.

Snow’s essay provoked a roaring, ad hominem response from the Cambridge critic F. R. Leavis — who called Snow “intellectually as undistinguished as it is possible to be” — and a more measured one from Lionel Trilling, who nonetheless thought Snow had produced “a book which is mistaken in a very large way indeed.” Snow’s cultural tribalism, Trilling argued, impaired the “possibility of rational discourse.”

Today, others believe science now addresses the human condition in ways Snow did not anticipate. For the past two decades, the editor and agent John Brockman has promoted the notion of a “third culture” to describe scientists — notably evolutionary biologists, psychologists and neuroscientists — who are “rendering visible the deeper meanings in our lives” and superseding literary artists in their ability to “shape the thoughts of their generation.” Snow himself suggested in the 1960s that social scientists could form a “third culture.”

So why did Snow think the supposed gulf between the two cultures was such a problem? Because, he argues in the latter half of his essay, it leads many capable minds to ignore science as a vocation, which prevents us from solving the world’s “main issue,” the wealth gap caused by industrialization, which threatens global stability. “This disparity between the rich and the poor has been noticed . . . most acutely and not unnaturally, by the poor,” Snow explains, adding: “It won’t last for long. Whatever else in the world we know survives to the year 2000, that won’t.” (For some reason, Y2K predictions and Snow did not mix well.) Thus Snow, whose service in World War II involved giving scientists overseas assignments, recommends dispatching a corps of technologists to industrialize the third world.

This brings “The Two Cultures” to its ultimate concern, which has less to do with intellectual life than with geopolitics. If the democracies don’t modernize undeveloped countries, Snow argues, “the Communist countries will,” leaving the West “an enclave in a different world.” Only by erasing the gap between the two cultures can we ensure wealth and self-government, he writes, adding, “We have very little time.”

Some of this sounds familiar; for decades we have regarded science as crucial to global competitiveness, an idea invoked as recently as in Barack Obama’s campaign. But in other ways “The Two Cultures” remains irretrievably a cold war document. The path to industrialization that Snow envisions follows W. W. Rostow’s “take-off into sustained growth,” part of 1950s modernization theory holding that all countries could follow the same trajectory of development. The invocation of popular revolution is similarly date-stamped in the era of decolonization, as is the untroubled embrace of government-dictated growth. “The scale of the operation is such that it would have to be a national one,” Snow writes. “Private industry, even the biggest private industry, can’t touch it, and in no sense is it a fair business risk.”

This is, I think, why Snow’s diagnosis remains popular while his remedy is ignored. We have spent recent decades convincing ourselves that technological progress occurs in unpredictable entrepreneurial floods, allowing us to surf the waves of creative destruction. In this light, a fussy British technocrat touting a massive government aid project appears distinctly uncool.

Yet “The Two Cultures” actually embodies one of the deepest tensions in our ideas about progress. Snow, too, wants to believe the sheer force of science cannot be restrained, that it will change the world — for the better — without a heavy guiding hand. The Industrial Revolution, he writes, occurred “without anyone,” including intellectuals, “noticing what was happening.” But at the same time, he argues that 20th-century progress was being stymied by the indifference of poets and novelists. That’s why he wrote “The Two Cultures.” So which is it? Is science an irrepressible agent of change, or does it need top-down direction?

This question is the aspect of “The Two Cultures” that speaks most directly to us today. Your answer — and many different ones are possible — probably determines how widely and deeply you think we need to spread scientific knowledge. Do we need to produce more scientists and engineers to fight climate change? How should they be deployed? Do we need broader public understanding of the issue to support governmental action? Or do we need something else?

Snow’s own version of this call for action, I believe, finally undercuts his claims. “The Two Cultures” initially asserts the moral distinctiveness of scientists, but ends with a plea for enlisting science to halt the spread of Communism — a concern that was hardly limited to those with a scientific habit of mind. The separateness of his two cultures is a very slippery thing. For all the book’s continuing interest, we should spend less time merely citing “The Two Cultures,” and more time genuinely reconsidering it.

Peter Dizikes is a science journalist based in Boston.

Short takes on three books
AMERICAN SCIENTIST [2.28.09]

WHAT HAVE YOU CHANGED YOUR MIND ABOUT? Today's Leading Minds Rethink Everything. Edited by John Brockman. Harper Perennial, $14.95, paper.

...Last year's question, "What have you changed your mind about?," brought a typically brilliant array of brief essays, by turns provocative, playful and profound. Brockman has collected them into a volume with the question as its title.

In one of the essays, MIT quantum-mechanical engineer Seth Lloyd describes how his students have given him a new appreciation of technology. In another, mathematician Keith Devlin explains his growing conviction that human mathematics is peculiar to the human mind. Nature news editor Oliver Morton has abandoned his support for human spaceflight. And journalist Charles Seife, who once assumed that democracy and science shared the same ideals, now believes that the egalitarian and the skeptic are natural opponents.

These contributions are typically only two or three pages long, which makes them compulsively readable. The only disappointment is that there's no discussion among the participants—but that's what the Web site is for.—Greg Ross

Read the full article →
