Edge 286—May 21, 2009
(11,700 words)

THE THIRD CULTURE

CHIMERAS OF EXPERIENCE
A Conversation with Jonah Lehrer

SEED CELEBRATES THE QUESTIONS C.P. SNOW RAISED 50 YEARS AGO BY ASKING: WHERE ARE WE NOW?


ARTICLES OF NOTE

THE NEW YORK TIMES
What You Don’t Know Makes You Nervous
By Daniel Gilbert

NEW YORK TIMES
Guest Column: Math and the City
By Steven Strogatz

NEW YORK TIMES — TIERNEY LAB
Message in What We Buy, but Nobody's Listening
By John Tierney

THE REASON PROJECT LAUNCHES ITS WEBSITE

PUBLISHERS WEEKLY
Rip My Book, Please
By Andrew Richard Albanese

NEWSWEEK
Science Cult
By John Horgan

NEWSWEEK
I, Robot
By Daniel Lyons

PBS — BILL MOYERS JOURNAL
Daniel Goleman explains to Bill Moyers how better educated consumers can help build a sustainable economy

NATURE
How much reason do you want?
By Philip Ball

LOS ANGELES TIMES
Atheists: No God, no reason, just whining
By Charlotte Allen

SEED
Alison Gopnik Describes New Experiments in Developmental Psychology That Show Everything We Think We Know About Babies Is Wrong
By Evan Lerner

NEW SCIENTIST
How to map the multiverse
by Anil Ananthaswamy

THE WASHINGTON POST
paidContent.org - Condé Nast's Carey And Wired's Anderson: Pursuing The 'Freemium' Model
By David Kaplan

COSMOS
Rage of reason
By Robin McKie



The paradox of modern neuroscience is that the one reality you can't describe as it is presently conceived is the only reality we'll ever know, which is the subjective first-person view of things. Even if you can find the circuit of cells that gives rise to that, and you can construct a good causal demonstration that you knock out this circuit of cells and you create a zombie; even if you do that... and I know Dennett could dismantle this argument very, very quickly... there's still a mystery that persists, and this is the old mind-body problem, but it's an old problem for a reason: we don't simply feel like three pounds of meat.

CHIMERAS OF EXPERIENCE
A Conversation with Jonah Lehrer



Edge Video

INTRODUCTION

"I always thought of myself as a scientist," says Jonah Lehrer "and then I had the privilege of working for several years in the lab of Eric Kandel as a technician, doing the manual labor of science, and what I discovered there was that I was a terrible scientist. As much as I loved the ideas, I excelled at experimental failure, I found new ways to make experiments not work. I would mess up PCRs, add the wrong buffers, northerns, westerns, southerns. I would make them not work in quite ingenious ways, and I realized slowly, over the course of those years, that the secret to being a great scientist is to love the manual labor of it." But there are many ways to contribute to the conversation that is science, and Lehrer is making important contributions as a writer who has internalized the process of the scientific method in asking interesting questions about ourselves and the world around us.

"Neuroscience has contributed so much in just a few decades to how we think about human nature and how we know ourselves," he says. "But how can we take that same rigor, which has made this research so valuable and, at the same time, make it a more realistic representation of what it's actually like to be a human. After all, we're a brain embedded in this larger set of structures."

"You can call it culture, call it society, call it your family, call it your friend, call it whatever it is. It's the stuff that makes people sign onto their Facebook a thousand times a day. It's the reason Twitter exists. We have got all these systems now that really make us fully aware of just how important social interactions are to what it is to be human. The question is, how can we study that? Because that, in essence, is a huge part of what's actually driving these enzymatic pathways in your brain. What's triggering these synaptic transmissions and these squirts of neurotransmitter back and forth is thoughts of other people, what other people say to us, interacting with the world at large. " Read on...

John Brockman

JONAH LEHRER, Contributing Editor at Wired and the author of How We Decide and Proust Was a Neuroscientist, has written for The New Yorker, Nature, Seed, The Washington Post and The Boston Globe.

Jonah Lehrer's Edge Bio Page

[PERMALINK]


CHIMERAS OF EXPERIENCE

[JONAH LEHRER:] The questions I'm asking myself right now are on a couple of different levels. For a long time there's been this necessary drive towards reductionism, towards looking at the brain, these three pounds of gelatinous flesh, as nothing but a loop of kinase enzymes, a trillion synaptic connections. Of course, that's a necessary foundation for trying to understand the mind and the brain, simply trying to decode the wet stuff. And that's essential, and we've made astonishing progress thanks to the work of people like Eric Kandel, who has helped outline the chemistry behind memory and other fundamental mental processes. Yet now we're beginning to know enough about the wet stuff, about these three pounds, to see that this is at best only a partial glance, a glimpse of human nature; that we're not just brains in a vat, but brains that interact with other brains, and we are starting to realize that the fundamental approach we've taken to the mind and the brain, looking at it as a system of chemical ingredients and enzymatic pathways, is actually profoundly limited.

The question now is, how do you then extrapolate it upwards? How do you take this organ, this piece of meat that runs on 10 watts of electricity, and study it in its actual context? It's not a brain in a vat; it's a brain interacting with other brains. How do you study things like social networks and human interactions?

Just think, for instance, about what's now the hottest method in cognitive neuroscience: the fMRI machine, the brain scan. Think about the fundamental limitation of this machine, which is that it's one person by himself in what's essentially a noisy coffin. So you give him the stimulus. He's going through the experimental task, whatever it is: choosing whether or not to buy something, doing a visual memory task. Whatever the protocol is, you're in essence looking at a brain in a vacuum. You're looking at a brain by itself, and we don't think enough about how profoundly abstract that is, what an abstraction that is from the reality we actually inhabit, the reality of being a human and what human nature is all about.

The question now, and this is a fascinating question to think about, is how can we take this research, which is so rigorous, and make it more realistic?

Neuroscience has contributed so much in just a few decades to how we think about human nature and how we know ourselves. But how can we take that same rigor, which has made this research so valuable and, at the same time, make it a more realistic representation of what it's actually like to be a human? After all, we're a brain embedded in this larger set of structures.

You can call it culture, call it society, call it your family, call it your friend, call it whatever it is. It's the stuff that makes people sign onto their Facebook a thousand times a day. It's the reason Twitter exists. We have got all these systems now that really make us fully aware of just how important social interactions are to what it is to be human. The question is, how can we study that? Because that, in essence, is a huge part of what's actually driving these enzymatic pathways in your brain. What's triggering these synaptic transmissions and these squirts of neurotransmitter back and forth is thoughts of other people, what other people say to us, interacting with the world at large.

As someone on the fringes of the field, part of the excitement to me is the fact that at this point there still are these different levels of description, and no one quite knows how they all fit together. There are the electrophysiologists with their galvanic needles, measuring individual dopamine neurons; there's the person looking at different circuitry, different blobs of brain lighting up in a brain scanner. Then there's the EEG machine.

Even if you're just looking at the wet stuff, even if you're just looking at the meat, there are all these different ways of describing it, and the experimental tools you have in large part drive what kinds of questions you can ask and what kind of stuff you're looking for. Even within cognitive neuroscience, there's tremendous variation in how the brain, how the circuitry, is described.

We feel like more than just the sum of a trillion neurons. We feel like more than just three pounds of wet flesh, and so simply describing the brain in terms of its neurotransmitters and neurons and all these chemicals and exciting ingredients doesn't fully grapple with what it feels like to be human, the first-person subjective experience of being a conscious being. When you think about the really grand epic questions of neuroscience... what is consciousness? How can we form a scientific explanation for consciousness, for human experience? That is the holy grail. That question itself requires us to think beyond the strict limitations of reductionism, simply because describing the experience as mere squirts of neurotransmitter, oscillations of electricity in the prefrontal cortex, won't, in itself, fully answer the real question, which is how this meat generates chimeras of experience, chimeras of being a self, in a body.

Consciousness is a very tough thing to explain away, because in the end, the paradox of modern neuroscience stems from the constant drive towards the smallest possible fundamental unit you can study, measure and quantify. Let's grant that one day we can find the circuit of neurons that explains conscious experience: you knock out these 10 cells in the prefrontal cortex generating this binding rhythm, whatever it is, and consciousness goes away. To me, that still leaves the real question, which is how that activity creates the illusion, even if it is just an illusion, an epiphenomenon, as Dennett would say. You still have to explain where the chimera comes from, where the subjective experience comes from, where the taste of an apple comes from.

The paradox of modern neuroscience is that the one reality you can't describe as it is presently conceived is the only reality we'll ever know, which is the subjective first-person view of things. Even if you can find the circuit of cells that gives rise to that, and you can construct a good causal demonstration that you knock out this circuit of cells and you create a zombie; even if you do that... and I know Dennett could dismantle this argument very, very quickly... there's still a mystery that persists, and this is the old mind-body problem, but it's an old problem for a reason: we don't simply feel like three pounds of meat.

We don't simply feel like kinase enzymes and synaptic proteins and all that. We feel like a unified self staring out at a world, watching rain fall on Fifth Avenue. It's hard to imagine neuroscience as it's presently conceived ever explaining this mystery in terms of neurons and glia and all the rest.

I've talked to enough scientists who thought that LSD would be a great way to study consciousness if it weren't so tough to get into the lab. Here are these drugs that profoundly and reliably, and in ways you can actually measure, distort our experience. You can study what LSD does to serotonin in the prefrontal cortex. You can study how it affects dopamine projections from the ventral striatum.

I was talking to a scientist last year who studies 'aha' moments. What happens in your brain when you have an epiphany? He was saying it would be great for him to use LSD in the lab, because when you take a hit of acid, you're a eureka machine. You think you've solved the world. You think you've solved the cosmos. You're just writing down notes on cocktail napkins. Not until you wake up in the morning do you realize you wrote down the most banal things ever — in the moment, you're just having one epiphany after another. Wouldn't it be great for him to give people LSD and study what happens in their brain? It turns out that it's very tough to do. It's very tough to get grant funding. People still remember Tim Leary at Harvard in the early '60s. There's still a stigma attached to it. We joke about it, but it could still be a very, very useful experimental tool.

As far as my background goes, I was a double major at Columbia in Neuroscience and English, and I always thought of myself as a scientist. I always thought I wanted to be a scientist. This was my narrative since I was eight years old, when I read E.O. Wilson and Richard Dawkins. I didn't understand a word of what they were saying, but I still fell in love with their approach to the world, the way they looked at nature and saw it as a puzzle to be decoded.

I always thought of myself as a scientist, and then I had the privilege of working for several years in the lab of Eric Kandel as a technician, doing the manual labor of science, and what I discovered there was that I was a terrible scientist. As much as I loved the ideas, I excelled at experimental failure; I found new ways to make experiments not work. I would mess up PCRs, add the wrong buffers, northerns, westerns, southerns. I would make them not work in quite ingenious ways, and I realized slowly, over the course of those years, that the secret to being a great scientist is to love the manual labor of it.

What you're doing as a scientist is doing experiments. Ninety-nine percent of what you do as a neuroscientist is the act of experimentation, and also thinking with a disciplined thought process: taking very big, grandiose, ambitious questions — "What is memory?" — and breaking them down into testable questions you can study in a sea slug or in a genetically modified mouse. I realized I didn't have that talent, which is such an important talent, to take a very big question and break it down into these empirical units. So I began thinking about science writing.

I love scientists. I love hanging out with scientists. I always think of the W.H. Auden line about how, when he's in a room full of scientists, he feels like a shabby curate in a room of dukes. That's how I feel. They're the most fun people to hang out with. Even on days when I wasn't working in the lab, when I was still an undergrad, I would love to go do my homework in the coffee room just to eavesdrop on these conversations I could barely understand, because there were so many acronyms I couldn't follow. CREB1, CREB2, CPEB. Who knew what they were talking about, but it had the feel of: here are people solving reality one little acronym at a time. But I realized that I didn't have the talent, didn't have the skills, the discipline to be a scientist myself.

I first began thinking about science writing, and then I was lucky enough to get a scholarship to Oxford, where I studied twentieth-century literature for one year and then the history of science in the Theology department for my second degree.

It was a great experience to broaden my mind, and that's when I began thinking about science writing and began working on my first book. I kind of fell into this.

My first goal is simply to translate the science. When there's an amazing new paper that's just come out in Proceedings or Nature or Science, I want to talk to the scientists and get a sense for how they came up with these answers. These aren't the final answers. This is our provisional draft of reality.

Simply translating the act of the scientist is the first job of any science writer. Then, in my more grandiose moments, the other big job of a science writer is to see connections, to see the forest composed of all these trees. The act of being a modern scientist, by definition, requires you to really focus. You have to drill down. You have to spend years studying one brick, one synaptic protein, one kind of thing that turns on the amygdala, one very particular question.

One of the great privileges of being a science writer is you get to zoom out a bit and see connections that maybe the scientists themselves aren't aware of. You get to hopscotch between these different levels of description; you can talk to a psychologist and try to figure out how their paradigm for studying human behavior actually relates back down to dopamine neurons, to human nature at its most minute.

That's the real privilege of being a science writer, and that's part of the reason I love writing about neuroscience, because I feel like there's so much to connect. We still have no idea how it all fits together. We're just at the point where we know enough to know how little we know. That makes it very exciting to be a journalist.

There are a number of people I've been thinking about in this regard, three of whom immediately come to mind: Walter Mischel, Eric Kandel, and Geoffrey West.

~~

Walter Mischel at Columbia University is probably best known for the marshmallow task, a very simple experiment he did at the Bing Nursery School at Stanford University between 1968 and 1972. You bring a four-year-old into the experimental room, and he'd say, "Kid, you can have one marshmallow right now, or if you can wait for about 15 minutes while I run an errand, you can have a second marshmallow." He offered the kids marshmallows, cookies, or pretzel sticks, and what he found was that there's tremendous variation in how long kids can wait. Every kid wants the second marshmallow or the second cookie, but some kids will eat the marshmallow before the scientist leaves the room. Some kids will wait two minutes. The average waiting time is about two and a half minutes, and some kids can wait the full 15 minutes.

The question is, what allowed some kids to wait? And it wasn't that these kids wanted the marshmallow any less or that these kids had more willpower. It's that these kids knew how to distract themselves. These are the kids who would cover their eyes, turn their back, sing songs from Sesame Street, pretend to fall asleep.

My favorite kid is a boy with neatly parted hair who chose the Oreo cookies, and you can watch him. He's just really struggling with it. It's an agonizing, agonizing wait, and he carefully, surreptitiously looks around to make sure no one's watching him. There's a large one-way mirror right to his left that he conveniently ignores. He picks up the Oreo cookie, carefully twists it apart, licks off the white cream filling, puts it back together, puts it on the table, and then he can wait the 15 minutes, no problem. Mischel notes that what the kids who can wait are better at is the strategic allocation of attention. They know their willpower is weak, and that if they're thinking about this yummy, delicious marshmallow, they're going to eat it. What they have to do is not think about it; they need to distract themselves.

Then you do the longitudinal study, and you find that the ability to wait at the age of four — and this is the most predictive test you can give a four-year-old, much more predictive than an IQ test — predicts behavior in school, how likely kids are to do drugs, their body mass index. The SAT score of a kid who could wait is 210 points higher than the SAT score of a kid who couldn't. It's an incredibly predictive test. Here's this very simple experiment, this very simple protocol you give to four-year-olds, and it turns out to explain a lot about their behavior as teenagers and adolescents.

Mischel and his collaborators are now flying 55 of these kids out to Palo Alto — they're now in their 40s — to put them in brain scanners and see the different brain areas that underlie this ability to exert willpower. But the larger lesson is that what we think about willpower is actually completely wrong.

People think about willpower as gritting your teeth, but willpower is actually profoundly weak; no one can really resist a marshmallow while thinking about how sweet the marshmallow is. What these people are better at — and this is how the scientists describe it — is the ability to control their thoughts, to control the contents of working memory.

Some people are much better at that, and it's a crucial life skill. My favorite television show is on, but I need to study for the SAT, I need to do my homework. How can I resist this temptation? It allows you to control your temper, to not lose it when someone calls you a name. It really is a very, very important life skill, and that's what Mischel was able to measure at the age of four.

I've been thinking a lot about that, and now Mischel's trying to go back into the schools to see if he can teach this to kids. Once kids leave kindergarten, we stop thinking about them in terms of character, in terms of these personality traits, but it turns out these are crucial things, and schools shouldn't just be in the business of teaching algebra, teaching literacy, teaching spelling.

They have to be in the business of teaching kids how to think, teaching them these metacognitive rules. Teach kids how to structure their thoughts, how to do a better job of controlling their mind, and that's going to have a huge payoff in terms of academic skills later on. I've been thinking a lot about that. Mischel's just a magnificent and very meticulous scientist.

~~

I was so lucky to have Eric Kandel as a mentor. Even to call him a mentor is a very presumptuous thing, but just working in his lab was such a defining and pivotal experience. Here is this brilliant, brilliant man, the smartest man I've ever met, who defined more than anyone else the theory behind modern neuroscience, which is that you can take this very, very complex mental process — what is memory? — and you can study it by basically annoying a sea slug, by impressing a sea slug, by poking a sea slug, waiting for the sea slug to habituate, to adapt, and that sea slug will use the same ingredients, the same kinase enzymes, the same whatever, to remember the memory as a human does. That was such a fundamental insight. It's easy to forget what a profound shift in thinking that was and how controversial it was at the time.

He's since branched out in so many interesting ways and looked at memory from so many other angles. In other words, he hasn't just defined the way we ask these questions; he's also come up with a staggering number of good answers. But just working in his lab, more than anything else, allowed me to fall in love with the scientific process, with the scientists themselves. I used to love going to lab meetings, and I loved how contentious they were, how people would ask each other questions. This is where the ideas happen. It's not a man sitting by himself. It's not Newton under the apple tree. It's scientists talking to each other, asking each other the hard questions, coming up with alternate explanations. That's when the breakthroughs happen.

Also, of course, Eric is this incredible intellectual. He's so cultured. He can talk about the molecules behind memory with the same eloquence with which he talks about Egon Schiele and Klimt and Freud. I was in the lab and talking to him just when he was starting to go back and visit Vienna again and rediscover his relationship to the city. That was fascinating, watching him grapple with and struggle with his past, because there's a lot of forgiveness involved. But it was such a privilege, such a lucky privilege, to be able to watch him think.

~~

What fascinates me about Geoffrey West's work is not just the sheer ambition of it, trying to come up with the metabolic equations of life. He is the president of the Santa Fe Institute and a theoretical physicist, and one of the questions he's working on is whether you can come up with a set of equations that define the metabolic processes of life from the rat to the elephant: equations that describe how the way animals process energy, how their cells work, varies with the sheer size of the animal.

This would be a kind of universal equation of life, a very grandiose idea, and a very controversial one, and it's hard to know why such a law would exist. There are, of course, outliers on this neat, sloping line, but it's still a very elegant idea, and one of those hypotheses where even if it's wrong it will still teach us so much.
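The conversation doesn't spell the equations out, but the commonly cited form of this kind of law, Kleiber's law, gives the flavor (the 3/4 exponent below is the textbook value, an assumption here rather than anything Lehrer states):

    B = B_0 * M^(3/4)

where B is an organism's resting metabolic rate and M its body mass. An animal 10,000 times more massive then burns energy at only (10^4)^(3/4) = 1,000 times the rate, so each gram of its tissue runs about ten times slower: B/M scales as M^(-1/4). That is the sense in which a single equation might stretch from the rat to the elephant.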

I'm drawn to West now because he's done some really interesting work trying to apply these same laws to cities. As we talked about earlier, one of the big challenges going forward is how we can ask deep and interesting questions about humans in groups, social networks, cities. How has the invention of the city changed human nature?

West has come up with really elegant ways to start to measure, using algorithms, using the mathematical tools and the skill set he has, how the kind of city we live in affects how we think, for instance whether or not the city fosters random interactions. If you bump into a stranger on the subway, that turns out to have all sorts of big consequences. It's like Jane Jacobs, who talked about the serendipity of the sidewalk and bumping into people. West has shown that cities that foster those kinds of random interactions score higher on measures of innovation, like the number of patents they produce. Cities that don't, cities like Phoenix and Las Vegas, score much lower on these innovation scales. Obviously, there's no causation here, just some elegant correlation, but you can begin to see these really elegant connections between urban theory and Jane Jacobs, between who you bump into on the sidewalk and how that actually may foster creativity, for reasons we can't yet begin to explain. It's this whole different way to look at human nature, to study these groups, and it's a really interesting approach.
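Here, as a toy sketch only, is the shape of that kind of scaling analysis in Python. The city figures are invented for illustration (not West's data), and the power law Y = Y0 * N^beta is the assumed model; a fitted beta greater than 1, "superlinear" scaling, is the signature such studies look for in innovation measures:

    import numpy as np

    # Invented figures, for illustration only -- not real city data.
    population = np.array([1e5, 5e5, 1e6, 5e6, 1e7])    # city populations N
    patents = np.array([120, 900, 2100, 14000, 33000])  # innovation measure Y

    # Assume Y = Y0 * N**beta and fit the exponent by least squares
    # on log-log axes: log Y = log Y0 + beta * log N.
    beta, log_y0 = np.polyfit(np.log(population), np.log(patents), 1)
    print(f"estimated scaling exponent beta = {beta:.2f}")  # about 1.2 here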

~~

Part of why there isn't more writing about science in the major publications is that people simply aren't used to asking these questions through that prism. It's not just in journalism. It's in the academy more than anywhere else that there are these separate domains, the sense that if you're asking questions about art, then you shouldn't be asking questions about neuroscience.

Obviously you can take that too far, and you don't want to say Rothko is nothing but oscillations in your visual cortex, because that leaves out what makes Rothko Rothko. But it's still so fun to try to merge these disciplines, to take a novel and ask, "Why does this work? Why am I finding this imaginative narrative so gripping?" By trying to figure out why certain works of art endure, why Hamlet titillates the human brain, you can often learn something quite interesting about both the art and the brain.

In terms of journalism and the challenge of accurately representing science, the thing I struggle with is capturing the process. Too often there is this tendency to ask: what makes this science story interesting? What's the payoff? Is there a new drug in the pipeline? Have they published a big new paper in Nature that's going to solve where human language comes from? We're so focused on the result, on the conclusion, on the abstract of the paper and the last paragraph in the paper, but what a science paper really is, is the methods section; it's the process. That is incredibly hard to translate to the public, partly because it can be pretty tedious. But the job is to get inside how a scientist thinks, to show that what makes science such a valuable, essential, and crucial modern institution is that there is this process: someone had to struggle for years, someone had to sift through ambiguous data and come up with a good tentative answer.

It's not simply that we need more science coverage. What we really need more of, starting now, is writing that gives people a better sense of the scientific process. A lot of the scientific illiteracy out there, a lot of the American public's misunderstanding of science, and the reason many people find it so easy to brush science aside or to believe that science is unweaving the rainbow, comes down to the fact that they don't understand the scientific process, the struggle, and what a beautiful, romantic process it often is. What we need more of, and this is a challenge to the writer, is to convey to people the excitement and the drama of a man or woman trying to take these big, big questions and come up with a new answer. It's a noble pursuit.


The Third Culture has grown beyond Edge, as scientists have become increasingly public — and even famous — figures. Seed approached six thinkers to ask where we are now: Whether the Two Cultures are still divided, and what role the Third Culture is playing.

SEED CELEBRATES THE QUESTIONS C.P. SNOW RAISED 50 YEARS AGO BY ASKING: WHERE ARE WE NOW?

Introduction

"Are we beyond the Two Cultures?" asks Seed Magazine in its May 7 commemoration of the 50th anniversary of C.P. Snow’s Two Cultures lecture. Readers following Edge since it began 12 years, 285 editions, and 2,939,953 words ago, know how to answer this question. Fortunately, Seed follows up and asks "Where are we now?"

It's been clear for several years that the third culture I predicted fifteen years earlier has been in need of an update. "There are encouraging signs," I wrote in "The Expanding Third Culture" (2006), "that the third culture includes scholars in the humanities who think the way scientists do. Like their colleagues in the sciences, they believe there is a real world and their job is to understand it and explain it. They test their ideas in terms of logical coherence, explanatory power, conformity with empirical facts. They do not defer to intellectual authorities: Anyone's ideas can be challenged, and understanding and knowledge accumulate through such challenges. They are not reducing the humanities to biological and physical principles, but they do believe that art, literature, history, politics—a whole panoply of humanist concerns—need to take the sciences into account."

Seed has played in this field of ideas, creating its own kind of culture, one that embraces artists, architects, novelists, designers, musicians, etc., presenting their work in vibrant and imaginative ways.

In the videos below, Seed asks six notable scientists, authors, and thinkers, all early Edge contributors (E.O. Wilson, Janna Levin, Albert-László Barabási, Steven Pinker, Marc D. Hauser, and Rebecca Goldstein), to comment on where the third culture is today.

John Brockman

[PERMALINK]

"May 7 marks the 50th anniversary of C.P. Snow’s Two Cultures lecture. Half a century ago the prominent novelist and speaker, who studied under Lord Rutherford, described a chasm between literary intellectuals and scientists, a gulf that impoverished both sides and impeded efforts to relieve suffering around the world. Science was not understood or respected by the dominant culture, to the detriment of all, he said. At some point scientists had ceased to be considered intellectuals, Snow noted, and though any educated person was required to know Shakespeare, almost none knew the second law of thermodynamics.

"Snow’s words touched off decades of debate on both the existence of the "Two Cultures” and the possibility of a "Third Culture” — a group Snow envisioned as curious non-scientists who could bridge the gap between scientists and humanists. In 1991, literary agent John Brockman wrote an essay entitled "The Third Culture,” which made the point that "scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.” In 1991, his Edge Foundation launched a website  —Edge — explicitly to bring together intellectuals of the Third Culture — many scientists, but also writers and philosophers — with the goal of bringing empirical studies directly to the public. The Third Culture has grown beyond Edge, as scientists have become increasingly public — and even famous  — figures. Seed approached six thinkers to ask where we are now: Whether the Two Cultures are still divided, and what role the Third Culture is playing."

Janna Levin's Edge Bio Page

[...Continue to Seed's "Are We Beyond the Two Cultures"]



THE NEW YORK TIMES
May 21, 2009

What You Don’t Know Makes You Nervous
By DANIEL GILBERT

CAMBRIDGE, Mass. — Seventy-six years ago, Franklin Delano Roosevelt took to the inaugural dais and reminded a nation that its recent troubles “concern, thank God, only material things.” In the midst of the Depression, he urged Americans to remember that “happiness lies not in the mere possession of money” and to recognize “the falsity of material wealth as the standard of success.”

“The only thing we have to fear,” he claimed, “is fear itself.”

As it turned out, Americans had a great deal more to fear than that, and their innocent belief that money buys happiness was entirely correct. Psychologists and economists now know that although the very rich are no happier than the merely rich, for the other 99 percent of us, happiness is greatly enhanced by a few quaint assets, like shelter, sustenance and security. Those who think the material is immaterial have probably never stood in a breadline.

Money matters and today most of us have less of it, so no one will be surprised by new survey results from the Gallup-Healthways Well-Being Index showing that Americans are smiling less and worrying more than they were a year ago, that happiness is down and sadness is up, that we are getting less sleep and smoking more cigarettes, that depression is on the rise.

An uncertain future leaves us stranded in an unhappy present with nothing to do but wait.

But light wallets are not the cause of our heavy hearts. After all, most of us still have more inflation-adjusted dollars than our grandparents had, and they didn't live in an unremitting funk. Middle-class Americans still enjoy more luxury than upper-class Americans enjoyed a century earlier, and the fin de siècle was not an especially gloomy time. Clearly, people can be perfectly happy with less than we had last year and less than we have now.

So if a dearth of dollars isn’t making us miserable, then what is? No one knows. I don’t mean that no one knows the answer to this question. I mean that the answer to this question is that no one knows — and not knowing is making us sick.



NEW YORK TIMES
May 20, 2009

THE WILD SIDE

Guest Column: Math and the City
BY STEVEN STROGATZ

One of the pleasures of looking at the world through mathematical eyes is that you can see certain patterns that would otherwise be hidden. This week's column is about one such pattern. It's a beautiful law of collective organization that links urban studies to zoology. It reveals Manhattan and a mouse to be variations on a single structural theme.

The mathematics of cities was launched in 1949 when George Zipf, a linguist working at Harvard, reported a striking regularity in the size distribution of cities. He noticed that if you tabulate the biggest cities in a given country and rank them according to their populations, the largest city is always about twice as big as the second largest, and three times as big as the third largest, and so on. In other words, the population of a city is, to a good approximation, inversely proportional to its rank. Why this should be true, no one knows.

Even more amazingly, Zipf's law has apparently held for at least 100 years. Given the different social conditions from country to country, the different patterns of migration a century ago and many other variables that you'd think would make a difference, the generality of Zipf's law is astonishing.

Keep in mind that this pattern emerged on its own. No city planner imposed it, and no citizens conspired to make it happen. Something is enforcing this invisible law, but we're still in the dark about what that something might be.
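A quick way to see the rank-size rule at work: if population is inversely proportional to rank, then rank times population should be roughly constant, at about the size of the largest city. A minimal sketch in Python, with invented round numbers standing in for census figures:

    # Illustrative populations only -- not census data.
    cities = {"City A": 8_000_000, "City B": 3_900_000,
              "City C": 2_700_000, "City D": 2_100_000}

    # Under Zipf's law, population ~ C / rank, so rank * population ~ C.
    ranked = sorted(cities.items(), key=lambda kv: -kv[1])
    for rank, (name, pop) in enumerate(ranked, start=1):
        print(f"{name}: rank {rank}, rank * population = {rank * pop:,}")

Each product comes out near 8 million, which is the regularity Zipf noticed.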



NEW YORK TIMES — TIERNEY LAB
May 19, 2009

FINDINGS

Message in What We Buy, but Nobody's Listening

By JOHN TIERNEY

Why does a diploma from Harvard cost $100,000 more than a similar piece of paper from City College? Why might a BMW cost $25,000 more than a Subaru WRX with equally fast acceleration? Why do “sophisticated” consumers demand 16-gigabyte iPhones and “fair trade” coffee from Starbucks?

If you ask market researchers or advertising executives, you might hear about the difference between “rational” and “emotional” buying decisions, or about products falling into categories like “hedonic” or “utilitarian” or “positional.” But Geoffrey Miller, an evolutionary psychologist at the University of New Mexico, says that even the slickest minds on Madison Avenue are still in the prescientific dark ages.

Instead of running focus groups and spinning theories, he says, marketers could learn more by administering scientifically calibrated tests of intelligence and personality traits. If marketers (or their customers) understood biologists' new calculations about animals' “costly signaling,” Dr. Miller says, they'd see that Harvard diplomas and iPhones send the same kind of signal as the ornate tail of a peacock.

Sometimes the message is as simple as “I've got resources to burn,” the classic conspicuous waste demonstrated by the energy expended to lift a peacock's tail or the fuel guzzled by a Hummer. But brand-name products aren't just about flaunting transient wealth. The audience for our signals — prospective mates, friends, rivals — cares more about the permanent traits measured in tests of intelligence and personality, as Dr. Miller explains in his new book, “Spent: Sex, Evolution and Consumer Behavior.” ...


THE REASON PROJECT LAUNCHES ITS WEBSITE

"The Reason Project is a 501(c)(3) nonprofit foundation devoted to spreading scientific knowledge and secular values in society. Drawing on the talents of the most prominent and creative thinkers across a wide range of disciplines, The Reason Project seeks to encourage critical thinking and wise public policy through a variety of interrelated projects. The foundation will convene conferences, produce films, sponsor scientific studies and opinion polls, publish original research, award grants to other charitable organizations, and offer material support to religious dissidents and public intellectuals — all with the purpose of eroding the influence of dogmatism, superstition, and bigotry in our world.

"While the foundation is devoted to fostering critical thinking generally, we believe that religious ideas require a special focus. Both science and the arts are built upon cultures of vigorous self-criticism; religious discourse is not. As a result, unwarranted religious beliefs still reign unchallenged in almost every society on earth—dividing humanity from itself, inflaming conflict, preventing wise public policy, and diverting scarce resources. One of the primary goals of The Reason Project is to change this increasingly unhealthy status quo.

"We are always looking for creative ways to involve the community in our efforts. If you would like to contribute to the work of The Reason Project, please fill out a volunteer application. We encourage you to consider the work of The Reason Project your own.:

Sam Harris is a Co-Founder and Chairman of The Reason Project. Among the members of the Advisory Board are more than a few individuals familiar to Edge readers: Peter Atkins, Jerry Coyne, Richard Dawkins, Daniel Dennett, Rebecca Goldstein, Anthony Grayling, Lawrence Krauss, Harry Kroto, Ian McEwan, Steven Pinker, Ayaan Hirsi Ali, Christopher Hitchens, Salman Rushdie, Craig Venter, and Steven Weinberg.



PUBLISHERS WEEKLY
May 18, 2009

Rip My Book, Please
An interview with The Long Tail's Chris Anderson on the meaning of free

by Andrew Richard Albanese

One day after News Corp. chairman Rupert Murdoch told reporters that his company is exploring how to charge for its online content, proclaiming that an “epochal” moment looms in Web history, PW asked Wired editor-in-chief Chris Anderson, bestselling author of The Long Tail, about the dire state of newspapers in the digital age. “With The Long Tail, people were like, okay, smart guy, fix the music industry,” Anderson quipped. “Now, it is going to be, okay, smart guy, fix the newspaper industry! I have to say, I do not have any answers.”

What Anderson does have is a new and timely book: Free: The Future of a Radical Price, and it comes with an experiment. In July, Hyperion will launch the book with a radical twist, offering free downloads of the book online as well as Amazon Kindle downloads for a limited time around publication. A fascinating treatise on Internet economics and culture [PW Reviews, April 27], Free probes a fundamental question for the digital age—a question, it would seem, opposite to the one Rupert Murdoch is now exploring: “How can so much stuff online be free?” As with all things Internet, the answer to that question is at once simple and complicated, and not altogether comforting.

“To authors and their publishers,” observed New York Times reporter Motoko Rich in a May 11 article on book piracy, the digital realm remains “new and frightening territory.” Understandably so: the music industry is now in its eighth straight year of decline since the public execution of the pioneering file-sharing service, Napster. Newspapers and magazines, meanwhile, are shuttering at an unprecedented rate.

So far, the book industry has avoided such disasters, and the e-book has fallen well short of the hype it generated at the turn of the millennium, when wild predictions flourished. Despite impressive growth rates, e-books currently make up about 1% of total book sales. The benefit of watching others blunder their way into the unfolding digital world, however, might just turn out to be a blessing for book publishers, girding them with more experience about a more mature medium. With a mix of Internet businesses now turning their attention to books—from Google's book scanning program and settlement, to the Amazon Kindle and the Sony Reader, to upstart literary Web site Scribd—it's clear that publishers' turn has come. Books are moving online. Now what? ...



NEWSWEEK
May 18, 2009

Science Cult
Ray Kurzweil's vision of a 'Singularity' has attracted some followers, but don't expect it anytime soon.
By John Horgan

I once believed in the imminence of superhuman intelligence. In 1981, when I was still in college, I took a science-writing class at Columbia University from the journalist Pamela McCorduck. She had just written Machines Who Think (note the mischievous "Who"), a book about the efforts of Marvin Minsky and other artificial-intelligence pioneers to create conscious, autonomous computers that would leave mere humans in their cognitive dust. This research, which McCorduck often enthused over in class, helped persuade me to become a science journalist. What could be cooler than witnessing this giant leap forward in the evolution of consciousness?

My youthful infatuation with AI gives me a somewhat jaded perspective on the prophecies of some modern scientists, notably the computer entrepreneur Ray Kurzweil, that we are on the verge of a "Singularity." In physics, a "singularity" is an event or place, like the big bang or a black hole, where the laws of physics are stretched to the breaking point. Singularitarians (which some call themselves) have adopted the term to describe a radical transformation of consciousness that will result from breakthroughs in artificial intelligence as well as nanotechnology, biotechnology and neuroscience.

At first, Singularitarians say, we may become cyborgs, as WiFi-equipped brain chips, nanobots and genetic modifications soup up our intelligence, perception and memory. Eventually, we may abandon our flesh-and-blood selves entirely and transform our psyches into giant software programs, like Vista but presumably less buggy. We will then "upload" ourselves into computers and dwell forever in cyberspace. Our transformation into immortal, God-like cyberbeings will supposedly take place not millennia or centuries from now but within the next few decades. ...



NEWSWEEK
May 16, 2009

I, Robot
Ray Kurzweil can't wait to be a cyborg—a human mind inside an everlasting machine. But is this the next great leap in human evolution?

By Daniel Lyons

Ray Kurzweil's wildest dream is to be turned into a cyborg—a flesh-and-blood human enhanced with tiny embedded computers, a man-machine hybrid with billions of microscopic nanobots coursing through his bloodstream. And there's a moment, halfway through a conversation in his office in Wellesley, Mass., when I start to think that Kurzweil's transformation has already begun. It's the way he talks—in a flat, robotic monotone. Maybe it's just because he's been giving the same spiel, over and over, for years now. He does 70 speeches annually at $30,000 a pop, and draws crowds of adoring fans who worship him as a kind of prophet. Kurzweil is a legend in the world of computer geeks, an inventor, author and computer scientist who bills himself as a futurist. The ideas he's espousing are as radical as anything you've ever heard. But the strangest thing about Ray Kurzweil is that when you sit down for a one-on-one chat with him, he's absolutely boring.

Listen closely, though, and you may be slightly terrified. Kurzweil believes computer intelligence is advancing so rapidly that in a couple of decades, machines will be as intelligent as humans. Soon after that they will surpass humans and start creating even smarter technology. By the middle of this century, the only way for us to keep up will be to merge with the machines so that their superior intelligence can boost our weak little brains and beef up our pitiful, illness-prone bodies. Some of Kurzweil's fellow futurists believe these superhuman computers will want nothing to do with us—that we will become either their pets or, worse yet, their food. Always an optimist, Kurzweil takes a more upbeat view. He swears these superhuman computers will love us, and honor us, since we'll be their ancestors. He also thinks we'll be able to embed our consciousness into silicon, which means we can live on, inside machines, forever and ever, amen.

Kurzweil calls this moment "The Singularity," and says it represents the next great leap in human evolution, when humans will transcend biology by merging with technology. Kurzweil truly believes this is going to happen—and he can't wait to be part of it. All he has to do is stay alive until 2045, when he believes the necessary technologies will be available. So he lives on a strict diet, and every day he swallows 150 dietary supplements in order to "reprogram" his body's biochemistry. Today he is 61 years old and in very good health. In 2045 he will be 97. In other words, it's doable. ...



PBS — BILL MOYERS JOURNAL
May 15, 2009

Daniel Goleman explains to Bill Moyers how better educated consumers can help build a sustainable economy.

DANIEL GOLEMAN: I think that's well put. The sad fact is that what we see in the store, what we put in our homes, what we use every day, all those objects, all those friendly products that we're so used to, have a hidden legacy, which has to do with their impacts on the environment, on our health, on ecosystems, on the people who made them. It starts from the moment the ingredients are extracted and runs through manufacture, transport, use, and disposal. There's a new methodology that tracks every stage in that progression, over the life cycle of a product. It's called life cycle assessment.

BILL MOYERS: But what's new about this? Because we've all been taught, or learned by osmosis, that each of us leaves a carbon footprint on the sands of time. And we know there are consequences to our presence here. So what's new about what-

DANIEL GOLEMAN: Well, Bill, I think there are two things that are new. One is that life cycle assessment goes way beyond carbon footprint at any given point. For example, a glass bottle, a glass jar: they analyze that, its manufacturing, down to 1,959 discrete steps.

At every step of the way there are myriad impacts on the environment, on health, on the people involved, and so on. So, first, we have a vaster sense, and a much more accurate sense, of really what the impact is. And the second thing is, and this is the big breakthrough, that information is now available to you and me while we're shopping. So that we can use it to make better decisions.

BILL MOYERS: How so?

DANIEL GOLEMAN: Well, there's a fabulous website. It's called GoodGuide. GoodGuide.com. And it summarizes, it draws on about 200 of these databases, and summarizes for us in, you know, ten points — ten is the best, one is the worst — how this product stacks up on its environmental health and social impacts compared to other products of its kind. What this does, Bill, is give us what's called radical transparency. Suddenly, you know, we've all vaguely known, things have carbon fingerprints. Now we can know exactly which is better. ...



NATURE
May 14, 2009


COLUMN: MUSE
How much reason do you want?
The 'war' between science and religion is stuck in a rut. Can we change the record now? asks Philip Ball
By Philip Ball

The 50th anniversary of C. P. Snow's famous 'Two Cultures' lecture has elicited mixed views. Some feel that the divide between the sciences and the humanities is as broad and uncomfortable as it was in 1959; others say the world has moved on. But perhaps we need instead to acknowledge that today's divisions exist between two quite different cultures.

To my mind, the most problematic of these is the distinction between those who believe in the value of knowledge and learning, whether artists, scientists, historians or politicians, and those who reject, even denigrate, intellectualism in world affairs.

But others feel that the most serious disparity is now between those who trust in science and Enlightenment rationalism, and those who are guided by religious dogma. This feeling has apparently motivated the recent launch of the Reason Project, an initiative organized by neuroscientist and writer Sam Harris, which boasts a stellar advisory board that includes Richard Dawkins, Daniel Dennett, Steven Weinberg, Harry Kroto, Craig Venter and Steven Pinker, along with Salman Rushdie, Ayaan Hirsi Ali and Ian McEwan.

The project is aimed at "spreading scientific knowledge and secular values in society" and seeks "to encourage critical thinking and erode the influence of dogmatism, superstition, and bigotry in our world".

War and peace

It's easy to agree that the use (or generally, abuse) of religion to justify suppression of human rights, maltreatment and murder is abhorrent. To the extent that this is in the project's sights, it should be applauded. But with Dawkins (The God Delusion) and Christopher Hitchens (God Is Not Great) on board, one can't help suspecting that the Almighty Himself is the prime target.

This debate now tends to cluster into two camps. One, exemplified by the Reason Project, insists that science and religion are fundamentally incompatible, and that the world ain't big enough for the both of them. The other side is exemplified by another recently launched project, the BioLogos Foundation, established by the former leader of the Human Genome Project, Francis Collins. In this view, science and religion can and should make their peace: there is no reason why they cannot coexist. The mission statement of BioLogos speaks of "America's escalating culture war between science and faith", and explains that the foundation "emphasizes the compatibility of Christian faith with what science has discovered about the origins of the universe and life".

BioLogos is funded by the Templeton Foundation, which similarly seeks to identify common ground between science and religion. To the militant atheists, this is sheer appeasement.

That is what evolutionary biologist Jerry Coyne, a board member of the Reason Project, laments in an essay called Truckling to the faithful: A spoonful of Jesus helps Darwin go down. Coyne accuses the US National Academy of Sciences, and especially its National Center for Science Education, of pandering to the religious masses. ...



LOS ANGELES TIMES
May 17, 2009

OPINION
Atheists: No God, no reason, just whining
Superstar atheists are motivated by anger -- and boohoo victimhood.
By Charlotte Allen

I can't stand atheists -- but it's not because they don't believe in God. It's because they're crashing bores.

Other people, most recently the British cultural critic Terry Eagleton in his new book, "Faith, Reason, and Revolution," take to task such superstar nonbelievers as Oxford biologist Richard Dawkins ("The God Delusion") and political journalist Christopher Hitchens ("God Is Not Great") for indulging in a philosophically primitive opposition of faith and reason that assumes that if science can't prove something, it doesn't exist.

My problem with atheists is their tiresome -- and way old -- insistence that they are being oppressed and their fixation with the fine points of Christianity. What -- did their Sunday school teachers flog their behinds with a Bible when they were kids?

Read Dawkins, or Hitchens, or the works of fellow atheists Sam Harris ("The End of Faith") and Daniel Dennett ("Breaking the Spell"), or visit an atheist website or blog (there are zillions of them, bearing such titles as "God Is for Suckers," "God Is Imaginary" and "God Is Pretend"), and your eyes will glaze over as you peruse -- again and again -- the obsessively tiny range of topics around which atheists circle like water in a drain. ...

...Maybe atheists wouldn't be so unpopular if they stopped beating the drum until the hide splits on their second-favorite topic: How stupid people are who believe in God. This is a favorite Dawkins theme. In a recent interview with Trina Hoaks, the atheist blogger for the Examiner.com website, Dawkins described religious believers as follows: "They feel uneducated, which they are; often rather stupid, which they are; inferior, which they are; and paranoid about pointy-headed intellectuals from the East Coast looking down on them, which, with some justification, they do." Thanks, Richard!

Dennett likes to call atheists "the Brights," in contrast to everybody else, who obviously aren't so bright. In a 2006 essay describing his brush with death after a heart operation, Dennett wrote these thoughts about his religious friends who told him they were praying for his recovery: "Thanks, I appreciate it, but did you also sacrifice a goat?" With friends like Daniel Dennett, you don't need enemies.

Then there's P.Z. Myers, biology professor at the University of Minnesota's Morris campus, whose blog, Pharyngula, is supposedly about Myers' field, evolutionary biology, but is actually about his fanatical propensity to label religious believers as "idiots," "morons," "loony" or "imbecilic" in nearly every post. The university deactivated its link to Myers' blog in July after he posted a photo of a consecrated host from a Mass that he had pierced with a rusty nail and thrown into the garbage ("I hope Jesus' tetanus shots are up to date") in an effort to prove that Catholicism is bunk -- or something. ...



SEED
May 16, 2009

ALISON GOPNIK DESCRIBES NEW EXPERIMENTS IN DEVELOPMENTAL PSYCHOLOGY THAT SHOW EVERYTHING WE THINK WE KNOW ABOUT BABIES IS WRONG.

By Evan Lerner

Thomas Nagel famously asked, “What is it like to be a bat?” That question has become a staple of Philosophy 101 courses, but we might be better served asking a more basic one: What is it like to be a baby? Though all of us experience life as a baby firsthand, we've long held misconceptions about what babies are capable of thinking, feeling, and understanding. Only recently have we overturned dominant theories of development in which very young children were thought to be barely conscious at all.

In The Philosophical Baby developmental psychologist Alison Gopnik compiles the latest in her field's research to paint a new picture of our inner lives at inception — one in which we are, in some ways, more conscious than adults. Gopnik spoke with Seed's Evan Lerner about how babies and young children learn from us and what we can learn from them.

Seed: How does a better understanding of what's going on in the minds of babies help us as adults?

Alison Gopnik: One of the things we discovered is that imagination, which we often think of as a special adult ability, is actually in place in very young children, as early as 18 months old. That ability is very closely related to children's ability to figure out how the world works. Imagination isn't just something we develop for our amusement; it seems to be something innate and connected to how we understand the causal structure of the real world. In fact, the new computational model of development we've created — using what computer scientists call Bayesian networks — shows systematically how understanding causation lets you imagine new possibilities. If children are computing in this way, then we'd expect imagination and learning to go hand in hand. ...
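To make the Bayesian-network point concrete, here is a toy model in Python in the spirit of the field's "blicket detector" experiments. It is an illustration of the general idea, not Gopnik's actual model, and the probabilities are invented:

    # A miniature causal model: each block placed on a machine independently
    # causes it to light with some learned probability (a "noisy-OR" model).
    p_light = {"blicket": 0.9, "plain block": 0.1}  # learned from observation

    def imagine(blocks):
        """P(machine lights) for a combination never actually observed."""
        p_stays_dark = 1.0
        for block in blocks:
            p_stays_dark *= 1.0 - p_light[block]
        return 1.0 - p_stays_dark

    print(imagine(["blicket", "plain block"]))  # 0.91
    print(imagine(["blicket", "blicket"]))      # 0.99

Once the causal strengths are learned, the same model answers questions about combinations the child has never seen, which is one way "understanding causation lets you imagine new possibilities."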



NEW SCIENTIST
May 4, 2009

How to map the multiverse
by Anil Ananthaswamy

BRIAN GREENE spent a good part of the last decade extolling the virtues of string theory. He dreamed that one day it would provide physicists with a theory of everything that would describe our universe - ours and ours alone. His bestselling book The Elegant Universe eloquently captured the quest for this ultimate theory.

"But the fly in the ointment was that string theory allowed for, in principle, many universes," says Greene, who is a theoretical physicist at Columbia University in New York. In other words, string theory seems equally capable of describing universes very different from ours. Greene hoped that something in the theory would eventually rule out most of the possibilities and single out one of these universes as the real one: ours.

So far, it hasn't - though not for any lack of trying. As a result, string theorists are beginning to accept that their ambitions for the theory may have been misguided. Perhaps our universe is not the only one after all. Maybe string theory has been right all along.

Greene, certainly, has had a change of heart. "You walk along a number of pathways in physics far enough and you bang into the possibility that we are one universe of many," he says. "So what do you do? You smack yourself in the head and say, 'Ah, maybe the universe is trying to tell me something.' I have personally undergone a sort of transformation, where I am very warm to this possibility of there being many universes, and that we are in the one where we can survive." ...



THE WASHINGTON POST
May 6, 2009

paidContent.org - Condé Nast's Carey And Wired's Anderson: Pursuing The 'Fremium' Model
By David Kaplan

paidContent: Newspaper publishers are doing pretty badly, and judging by the Publishers Information Bureau's data (mag ad pages dropped 26 percent in Q1), magazines aren't doing that great either. Is it hyperbolic to worry about the death of print these days?

Chris Anderson, Wired: Print is not just about newspapers or magazines; you need to think of books as well. What we're seeing is a distinction between different kinds of print. We're now slowly figuring out what kind of print adds value in the internet age and what kind doesn't. The kind of magazines that we do, which are long-form, photography, all the qualities being celebrated here, are the things that add value to the internet. And while magazine websites are also being honored here, print still does these things better than anything else. You're not going to want to read 8,000 words on your screen. And we just won an award for design. HTML does not do justice to really innovative design, to what we won for tonight. And so, that kind of print is not dead, it is still thriving. And our company, in particular, focuses on mostly monthly, high-production, high-design visual artifacts. That's got a long future. ...

...paidContent: Chris, during your acceptance speech for the design award, you referenced an editorial meeting for what eventually became the February cover. It was a white cover with black letters for an obscure mathematical formula called the "Gaussian copula function." When you presented this at an editorial meeting, you conceded that this might not be the best choice to move magazines off the newsstand. SI Newhouse's approving reaction was to shrug and say "Oh, it doesn't matter." You said whether you believed it or not, it was important to hear that at the meeting. How did the issue do?

CA: As it turned out, it did sell pretty well — better than average, although not our bestseller for the year-to-date. But the important thing for me is that to SI, it was a powerful idea, presented in a novel and innovative way. Who knows what would have happened if it did indeed tank on the newsstand, as conventional wisdom would have had it, but our confidence that SI meant what he said — that swinging for the fences in terms of magazine-making ambition matters most — is what gives us the courage to break convention and experiment with radical concepts, like our current JJ Abrams issue on Mystery.
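[The formula itself goes unstated in the interview. For reference, and not as part of the exchange above, the standard definition of the bivariate Gaussian copula couples two marginal probabilities $u$ and $v$ through the normal distribution:]

$$C_\rho(u, v) = \Phi_\rho\!\left(\Phi^{-1}(u),\, \Phi^{-1}(v)\right)$$

[where $\Phi^{-1}$ is the inverse of the standard normal cumulative distribution function and $\Phi_\rho$ is the bivariate normal CDF with correlation $\rho$. In David X. Li's application, the subject of that Wired cover story, $u$ and $v$ were the marginal default probabilities of two assets, and the single parameter $\rho$ stood in for everything about how their fates were linked.]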



COSMOS
October, 2008

Rage of reason
By Robin McKie

Richard Dawkins is a towering figure in evolution who skewers creationists for sport. He doesn't suffer fools gladly, but was kind enough to talk to Robin McKie.

..."I wanted to do more than just describe how Darwin came to natural selection, but to explore what it means to people today," he says. "The theory was, and remains, the most powerful, revolutionary idea ever put forward by an individual."

The end result is typical Dawkins: an eloquent presentation of how Darwin developed his theory; an uncompromising description of its operation in the wild; and a few barbed anti-religious jibes for good measure.

We see lions hunting down zebras and polar bears slaughtering seals, he says. The weak are killed off, leaving only animals best suited to their environment to pass on their genes to future generations. Slowly these genes accumulate until a new species emerges. This is natural selection – though it is scarcely a pleasant business.

"The total amount of suffering in the natural world is beyond all decent contemplation," Dawkins insists. "For most animals, reality is a business of struggle, suffering and sudden death."...


"For those seeking substance over sheen, the occasional videos released at Edge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures. ... Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. The decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter. — Boston Globe

Mahzarin Banaji, Samuel Barondes, Yochai Benkler, Paul Bloom, Rodney Brooks, Hubert Burda, George Church, Nicholas Christakis, Brian Cox, Iain Couzin, Helena Cronin, Paul Davies, Daniel C. Dennett, David Deutsch, Denis Dutton, Jared Diamond, Freeman Dyson, Drew Endy, Peter Galison, Murray Gell-Mann, David Gelernter, Neil Gershenfeld, Anthony Giddens, Gerd Gigerenzer, Daniel Gilbert, Rebecca Goldstein, John Gottman, Brian Greene, Anthony Greenwald, Alan Guth, David Haig, Marc D. Hauser, Walter Isaacson, Steve Jones, Daniel Kahneman, Stuart Kauffman, Ken Kesey, Stephen Kosslyn, Lawrence Krauss, Ray Kurzweil, Jaron Lanier, Armand Leroi, Seth Lloyd, Gary Marcus, John Markoff, Ernst Mayr, Marvin Minsky, Sendhil Mullainathan, Dennis Overbye, Dean Ornish, Elaine Pagels, Steven Pinker, Jordan Pollack, Lisa Randall, Martin Rees, Matt Ridley, Elisabeth Spelke, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Stephen Schneider, Martin Seligman, Robert Shapiro, Clay Shirky, Lee Smolin, Dan Sperber, Paul Steinhardt, Steven Strogatz, Seirian Sumner, Leonard Susskind, Nassim Nicholas Taleb, Timothy Taylor, Richard Thaler, Robert Trivers, Neil Turok, J. Craig Venter, Edward O. Wilson, Lewis Wolpert, Richard Wrangham, Philip Zimbardo

[Continue to Edge Video]



WHAT HAVE YOU CHANGED YOUR MIND ABOUT?
Edited by John Brockman
With An Introduction By BRIAN ENO

"The world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now."
San Francisco Chronicle

"A great event in the Anglo-Saxon culture."
El Mundo


Praise for the online publication of
What Have You Changed Your Mind About?

"The splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The Independent

"A great event in the Anglo-Saxon culture." El Mundo

"As fascinating and weighty as one would imagine." The Independent

"They are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian

"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The Times

"Provocative ideas put forward today by leading figures."The Telegraph

"As in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer

"A jolt of fresh thinking...The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake—bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail

"Answers ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The Toronto Star

"For an exceptionally high quotient of interesting ideas to words, this is hard to beat. ...What a feast of egg-head opinionating!" National Review Online


WHAT ARE YOU OPTIMISTIC ABOUT?
Today's Leading Thinkers on Why Things Are Good and Getting Better
Edited by John Brockman
Introduction by DANIEL C. DENNETT



[2007]

"The optimistic visions seem not just wonderful but plausible." Wall Street Journal

"Persuasively upbeat." O, The Oprah Magazine

"Our greatest minds provide nutshell insights on how science will help forge a better world ahead." Seed

"Uplifting...an enthralling book." The Mail on Sunday


WHAT IS YOUR DANGEROUS IDEA?
Today's Leading Thinkers on the Unthinkable
Edited by John Brockman
Introduction by STEVEN PINKER
Afterword by RICHARD DAWKINS


[2006]

"Danger – brilliant minds at work...A brilliant bok: exhilarating, hilarious, and chilling." The Evening Standard (London)

"A selection of the most explosive ideas of our age." Sunday Herald

"Provocative" The Independent

"Challenging notions put forward by some of the world's sharpest minds" Sunday Times

"A titillating compilation" The Guardian

"Reads like an intriguing dinner party conversation among great minds in science" Discover


WHAT WE BELIEVE BUT CANNOT PROVE
Today's Leading Thinkers on Science in the Age of Certainty
Edited by John Brockman
Introduction by IAN MCEWAN


[2006]

"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times

"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism -its opposite -it serves a function in science that is playful as well as thought-provoking. not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." The Times

"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer

"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle—a book ro be dog-eared and debated." Seed

"Scientific pipedreams at their very best." The Guardian

"Makes for some astounding reading." Boston Globe

"Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4

"Intellectual and creative magnificence" The Skeptical Inquirer




Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.

