Edge in the News: 2007

To Night [1.31.07]

Dare to question. Most don't. Indeed, many people get alarmed, agitated, when difficult questions are posed.

Questioning settled assumptions forces people to think, which can be a frightening, radical exercise.

Consider the "dangerous ideas" listed here: "Do women, on average, have a different profile of aptitudes and emotions than men?

Were the events in the Bible fictitious — not just the miracles, but those involving kings and empires? Do most victims of sexual abuse suffer lifelong damage? Did Native Americans engage in genocide and despoil the landscape? Do men have an innate tendency to rape?

Are suicide terrorists well-educated, mentally healthy, and morally driven? Are Ashkenazi Jews, on average, smarter than Gentiles because their ancestors were selected for the shrewdness needed in money lending? ...

Steven Pinker, in his introduction, calls these "dangerous ideas - ideas that are denounced not because they are self-evidently false, not because they advocate harmful action, but because they are thought to corrode the prevailing moral order"....

...psychologist Daniel Gilbert employs just 131 words to shoot down the thought "that ideas can be dangerous".

Paradoxically, he states "the most dangerous idea is the only dangerous idea: The idea that ideas can be dangerous."

Whew! I was worried for a moment. Like the meaning of life, there's no simple answer. Which is why so many, desperate for certainty, shy away from books like this.

Personally, I relish such questions, and if you have any sort of an open, enquiring mind, then so will you.

[...]


THE TELEGRAPH [1.30.07]

A method that can precisely measure how well people are connected to one another has been successfully tested for the first time at the nation's most prestigious science party. Most people have encountered the "small world" phenomenon - that striking coincidence that emerges while chatting to a stranger when you discover you have a mutual friend. Scientists believe that almost any two people from anywhere in the world may well be linked by a chain of half a dozen or so people.

The Royal Society, the nation's academy of science, made a fitting venue for a new test of this idea. At a Daily Telegraph party held there last night, sponsored by Novartis, Prof Richard Wiseman, of the University of Hertfordshire, carried out a novel experiment using four "experimenters" and four "target" guests. Each experimenter randomly selected a guest and asked to be directed to a target. In all, they approached 64 guests, and managed to complete 37 chains.

"Sixty per cent of the 500 or so people were linked by only two or three degrees of separation. They are twice as well connected, on average, as other people," said Prof Wiseman. The experiment was inspired by a study carried out in the 1960s by American psychologist Stanley Milgram. "Amazingly, the average chain length was just three people fewer than in Milgram's experiment," said Prof Wiseman. "The longest chain was six people, and about 15 per cent of the chains were completed in one step. We would expect this group of scientists and popularisers to be well connected but, for the first time, we have been able to measure just how well connected," said Prof Wiseman.

The party was attended by a remarkable cross section of people, among them: Subhanu Saxena, CEO of Novartis, Will Lewis, editor of the Telegraph, Fay Weldon, the author, Lord Rees, the Astronomer Royal, Richard Fortey, President of the Geological Society, the naturalist Sir David Attenborough, Phil Campbell, editor of Nature, the television presenters Joan Bakewell, Floella Benjamin, Adam Hart-Davis and Robert Winston, Nobel laureate Sir Tim Hunt, Christmas lecturer Prof Marcus du Sautoy, agent John Brockman and the director of the Royal Institution, Baroness Greenfield.
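For readers curious how such chain lengths behave in principle, here is a minimal simulation sketch. It is not Wiseman's actual protocol: the network model, the library (networkx) and every parameter (500 nodes, 10 neighbours each, 10 per cent rewired links, 64 sampled pairs) are illustrative assumptions chosen to echo the numbers in the article.

```python
# Illustrative sketch only: estimate "degrees of separation" on a synthetic
# small-world network (Watts-Strogatz model) standing in for the party guests.
# All parameters are invented for illustration; this is not Wiseman's protocol.
import random
import networkx as nx

random.seed(1)
G = nx.connected_watts_strogatz_graph(n=500, k=10, p=0.1, seed=1)  # 500 "guests"

# Sample source/target pairs and measure the shortest-path length,
# the graph-theoretic analogue of a completed chain.
pairs = [random.sample(range(500), 2) for _ in range(64)]
chains = [nx.shortest_path_length(G, s, t) for s, t in pairs]

print(f"average separation: {sum(chains) / len(chains):.1f} steps")
print(f"longest chain sampled: {max(chains)} steps")
```

On graphs of this kind, typical separations of roughly three to six steps emerge naturally, which is the qualitative pattern the party experiment set out to measure.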

GENOME TECHNOLOGY ONLINE [1.30.07]

The Edge Foundation, an intellectual group of leaders from various fields, has issued its question of the year: What are you optimistic about? While we might rephrase the question to eliminate that irksome preposition, the point today is that genomic heavyweight George Church has sent in his response, and it's worth a read.

Church predicts that 2007 will be the year of the personal genome, with the mainstream public finally getting involved (and interested) in the field and its consequences. "I am optimistic that while society is not now ready, it will be this year," Church writes. Check out his full response here.

And for the record -- it's people like George Church who keep us optimistic. Thanks, George!

[...]


Natalie Angier, The New York Times [1.29.07]

Valentine’s Day is nearly upon us, that sweet Hallmark holiday when you can have anything your heart desires, so long as it’s red. Red roses, red nighties, red shoes and red socks. Red Oreo filling, red bagels, red lox.

As it happens, red is an exquisite ambassador for love, and in more ways than people may realize. Not only is red the color of the blood that flushes the face and swells the pelvis and that one swears one would spill to save the beloved’s prized hide. It is also a fine metaphoric mate for the complexity and contrariness of love. In red we see shades of life, death, fury, shame, courage, anguish, pride and the occasional overuse of exfoliants designed to combat signs of aging. Red is bright and bold and has a big lipsticked mouth, through which it happily speaks out of all sides at once. Yoo-hoo! yodels red, come close, have a look. Stop right there, red amends, one false move and you’re dead.

Such visual semiotics are not limited to the human race. Red is the premier signaling color in the natural world, variously showcasing a fruitful bounty, warning of a fatal poison or boasting of a sturdy constitution and the genes to match. Red, in other words, is the poster child for the poster, for colors that have something important to say. “Our visual system was shaped by colors already in use among many plants and animals, and red in particular stands out against the green backdrop of nature,” said Dr. Nicholas Humphrey, a philosopher at the London School of Economics and the author of “Seeing Red: A Study in Consciousness.” “If you want to make a point, you make it in red.”

What is it, then, to see red, to see any palette at all? Of our famed rods and cones, the two classes of light-sensing cells with which the retina at the back of each eye is supplied, the rods do the basics of vision, of light versus shadow, tracking every passing photon and allowing us to see by even a star’s feeble flicker, though only in gunmetal shades of black, white and grim. It is up to our cone cells to capture color, and they don’t kick in until the dawn’s earlyish light or its Edisonian equivalent, which is why we have almost no color vision at night.

Cones manage their magic in computational teams of three types, each tuned to a slightly different slice of the electromagnetic spectrum, the sweeping sum of lightwaves that streams from the sun. As full-spectrum sunlight falls on, say, a ripe apple, the physical and chemical properties of the fruit’s skin allow it to absorb much of the light, save for relatively long, reddish lightwaves, which bounce off the surface and into our greedy eyes. On hitting the retina, those red wavelengths stimulate with greatest fervor the cone cells set to receive them, a sensation that the brain interprets as “healthy, low-hanging snack item ahead.”

In fact, human eyes, like those of other great apes, seem to be all-around fabulous fruit-finding devices, for they are more richly endowed with the two cone types set to red and yellow wavelengths than with those sensitive to short, blue-tinged light. That cone apportionment allows us to discriminate among subtle differences in fruit ruddiness and hence readiness, and may also explain why I have at least 40 lipsticks that I never wear compared with only three blue eye shadows.

Whatever the primary spur to the evolution of our rose-colored retinas, we, like most other animals with multichromatic vision, have learned to treat red with respect. “In the evolution of languages,” Dr. Humphrey writes, “red is without exception the first color word to enter the vocabulary,” and in some languages it’s the only color word apart from black and white. It’s also the first color that most children learn to name, and that most adults will cite when asked to think of a color, any color.

Red savors the spice of victory. Analyzing data from Olympic combat sports like boxing and tae kwon do, in which competitors are randomly assigned to wear red shorts or blue, Dr. Russell Hill and his colleagues at the University of Durham in Britain found that the red-shorted won their matches significantly more often than would be expected by chance alone. What the researchers don’t yet know is whether the reds somehow get a subconscious boost from their garb, or their blue opponents are felled by the view. After all, said Dr. Geoffrey Hill, a biology professor at Auburn University in Alabama and no relation to Russell Hill, “I’ve seen some of my biggest, toughest students, these tough, athletic guys, faint right to the floor at the sight of one drop of bird’s blood.”

Red refuses to be penned down or pigeonholed. It has long been the color of revolution, of overthrowing the established order. “Left-wing parties in Europe have all been red,” Dr. Humphrey said, “while the conservatives, in Britain and elsewhere, go for blue.” Yet in the United States, the color scheme lately has been flipped, and the red states are said to be the guardians of traditional values, of mom and pop, of guns and red meat.

Context, too, changes red’s meaning. A female bird may be attracted to the bright scarlet sheen of a male’s feathers or of a baby bird’s begging mouth, but will assiduously avoid eating red ladybugs that she knows are packed with poisons.

Given red’s pushy reputation, design experts long thought people felt uncomfortable and worked poorly when confined to red rooms. But when Dr. Nancy Kwallek, a professor of interior design at the University of Texas at Austin, recently compared the performance of clerical workers randomly assigned for a week to rooms with red, blue-green or white color schemes, she found that red’s story, like the devil, is in the details. Workers who were identified as poor screeners, who have trouble blocking out noise and other distractions during the workday, did indeed prove less productive and more error prone in the red rooms than did their similarly thin-skinned colleagues in the turquoise rooms. For those employees who were rated as good screeners, however, able to focus on their job regardless of any ruckus around them, the results were flipped: these screeners were more productive in the red room than in the blue. “The color red stimulated them,” she said, “and they thrived under its effects.”

And the subjects assigned to the plain-vanilla settings, of a style familiar to the vast majority of the corporate labor force? Deprived of any color, any splash of Matisse, they were disgruntled and brokenhearted and did the poorest of all.

THE NEW YORK TIMES MAGAZINE [1.20.07]

Are you an optimist or a pessimist? Put so starkly, the question has a fatuous ring. Unless you are in the grip of a bipolar disorder, you are probably optimistic about some things and pessimistic about others. Optimism tends to reign when people are imagining how their own plans will turn out. Research shows that we systematically exaggerate our chances of success, believing ourselves to be more competent and more in control than we actually are. Some 80 percent of drivers, for example, think they are better at the wheel than the typical motorist and thus less likely to have an accident. We live in a Lake Wobegon of the mind, it seems, where all the children are above average. Such “optimism bias,” as psychologists have labeled it, is hardly confined to our personal lives. In fact, as Daniel Kahneman, a Nobel laureate in economics, and Jonathan Renshon argue in the current issue of Foreign Policy, it may help explain why hawkishness so often prevails at the national level. Wasn’t the Iraq war expected by proponents to be “fairly easy” (John McCain) or “a cakewalk” (Kenneth Adelman)?

But when it comes to the still bigger picture — the fate of civilization, of the planet, of the cosmos — pessimism has historically been the rule. A sense that things are heading downhill is common to nearly every culture, as Arthur Herman observes in “The Idea of Decline in Western History.” The golden age always lies in the past, never in the future. It’s not hard to find a psychological explanation for this big-picture gloominess. As we age, we become aware of our powers diminishing; we dwell on the happy episodes from our past and forget the wretched ones; moving toward the grave, we are consumed by nostalgia and foreboding. What could be more natural than to project this mixture of attitudes onto history at large?

The very idea of progress, a novelty of the Enlightenment that has been in fashion only fitfully since, can grow wearisome. “Progress might have been all right once,” Ogden Nash said, “but it has gone on too long.”

You might think scientists would be the optimistic exception here. Science, after all, furnishes the model for progress, based as it is on the gradual and irreversible growth of knowledge. At the end of last year, Edge.org, an influential scientific salon, posed the questions “What are you optimistic about? Why?” to a wide range of thinkers. Some 160 responses have now been posted at the Web site. As you might expect, there is a certain amount of agenda-battling, and more than a whiff of optimism bias. A mathematician is optimistic that we will finally get mathematics education right; a psychiatrist is optimistic that we will find more effective drugs to block pessimism (although he is pessimistic that we will use them wisely). But when the scientific thinkers look beyond their own specializations to the big picture, they continue to find cause for cheer — foreseeing an end to war, for example, or the simultaneous solution of our global-warming and energy problems. The most general grounds for optimism offered by these thinkers, though, is that big-picture pessimism so often proves to be unfounded. The perennial belief that our best days are behind us is, it seems, perennially wrong.

Such reflections may or may not ease our tendency toward global pessimism. But what about our contrary tendency to be optimistic — indeed, excessively so — in our local outlook? Is that something we should, in the interests of cold reason, try to disabuse ourselves of? Optimism bias no doubt causes a good deal of mischief, leading us to underestimate the time and trouble of the projects we undertake. But the mere fact that it is so widespread in our species suggests it might have some adaptive value. Perhaps if we calculated our odds in a more cleareyed way, we wouldn’t be able to get out of bed in the morning.

A couple of decades ago, the psychologist Shelley Taylor proposed that “positive illusions” like excessive optimism were critical to mental health. People who saw their abilities and chances realistically, she noted, tended to be in a state of depression. (Other psychologists, taking a closer look at the data, countered that depressives actually show more optimism bias than nondepressives: given the way things turn out for them, they are not pessimistic enough.) And there is new evidence that optimism may in some ways be self-fulfilling. In a recently published study, researchers in the Netherlands found that optimistic people — those who assented to statements like “I often feel that life is full of promises” — tend to live longer than pessimists. Perhaps, it has been speculated, optimism confers a survival advantage by helping people cope with adversity.

But pessimism still appears to have its advantages. Another recently published paper observes that over the last three decades, the people of Denmark have consistently scored higher on life-satisfaction than any other Western nation. Why? Because, say the authors, the Danes are perennial pessimists, always reporting low expectations for the year to come. They then find themselves pleasantly surprised when things turn out rather better than expected.

Americans, too, are lowering their expectations, at least in one respect. According to the Census Bureau’s 2007 Statistical Abstract of the United States, most college freshmen in 1970 said their primary goal was to develop a meaningful life philosophy. In 2005, by contrast, most freshmen said their primary goal was to be comfortably rich — a more modest one, it would seem, given the relative frequency of wealth and wisdom.

As for the minority still seeking a philosophy of life, the Viennese satirist Karl Kraus came up with a formula nearly a century ago that remains the perfect blend of optimism and pessimism: Things are hopeless but not serious.

BOING BOING [1.20.07]


There's probably a great Linux joke in here, but I'm not funny enough to come up with it. Technologist and former Microsoft executive Nathan Myhrvold visited the Falklands[ / Islas Malvinas], and took some amazing photographs of penguins and other creatures there. Dr. Myhrvold is CEO and managing director of Intellectual Ventures, a private entrepreneurial firm he founded with his former Microsoft colleague, Dr. Edward Jung. Snip from an essay about what he observed on the islands:

It turns out that there are some reasonably well developed scientific theories of cuteness.

Penguins look like little people – their bipedal stance, walking gait and proportions look like a tiny toy person. Self-love is something humans are good at, so it is natural to find these animals compelling. Their behaviors also happen to map well to human behavior – or at least one can naively imagine so because they are stereotypically similar to some of our own actions.

That covers penguins, but there are some more universal aspects of cuteness. I once studied to be a cartoonist (alas, I wasn’t funny enough) and in that field they have this very well figured out. The rule of thumb is that if you want a cartoon character to be cute, you draw it so that the total body height is between 2.5 and 3 times the height of the head. This gives you a Mickey Mouse or Tweety Bird sort of character. You then make the eyes a large fraction of head height – little beady eyes are not cute. To make a heroic character – say Superman, Spiderman or Captain America – you want 7.5 to 8 heads high. It always has amused me that being a pinhead looks heroic.

Link. Image: (c) 2007, Nathan Myhrvold. (Thanks, John Brockman)

Reader comment: Jeff says,

You should link to Myhrvold's article on the future of digital photography; it's a must-read. Direct link here. Excerpt:

I'm eagerly awaiting Canon's next move, probably to 25-plus megapixels. I'm what marketing people call an early adopter, but mark my words - you'll own a 16- or even a 25-megapixel point-and-shoot in a few years, and it will not stop there. By some estimates, your eyes have an effective resolution of more than 500 megapixels. If you can see it, why shouldn't a camera record it? The reason many pictures don't turn out is that in daytime the human eye can easily perceive a dynamic range of 10,000:1, while at night it is more like 1,000,000:1. Meanwhile, color slide film can record only about 32:1, and digital cameras, about 64:1.

In many situations, this forces a choice - do you expose for the light parts of the scene and let the dark parts go dead black, or save the shadows and turn the bright parts pure white? Future digital sensors will fix this, with ever broader dynamic range and greater light sensitivity (the ISO rating).

Focus is another problem. How many of your pictures wind up fuzzy? Autofocus technology can help, but cameras today still have a limitation on how much of a scene can be in focus at one time, known as depth of field. If you focus on the flower in front of you, the mountain in the background is apt to be fuzzy. Yet technically there is no reason we can't get essentially infinite depth of field, again by using more digital processing.
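As a rough editorial aside (not from Myhrvold's article), the contrast ratios quoted above can be converted into photographic stops, where each stop is a doubling of the recordable range; the sketch below simply takes base-2 logarithms of those figures.

```python
# Back-of-the-envelope conversion of the quoted contrast ratios into
# photographic stops (one stop = a doubling of the light range).
import math

ratios = {
    "human eye, daytime": 10_000,
    "human eye, night": 1_000_000,
    "color slide film": 32,
    "digital camera (2007-era)": 64,
}

for name, ratio in ratios.items():
    print(f"{name:28s} {ratio:>9,}:1  ~ {math.log2(ratio):4.1f} stops")
```

By that measure the daytime eye spans roughly 13 stops against about 6 for the sensors of the day, which is the gap the excerpt expects future sensors to close.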

Javier Rodruiguez says,

My impression concerning your post would have been much better if you just said "Islas Malvinas" instead of Falklands (...) they always belonged to Argentina, not just a matter of sovereignty but simply geology (it's physically undeniable that they are inside South America's continental platform). I guess you regard colonialism as evil, as much as many of us do.

THE NEWS & OBSERVER [1.20.07]

"What are you optimistic about?" editor John Brockman asked some of the world's leading scientists on his Web site, www.edge.org.

As I've yet to complete my unified theory of the universe, he did not include me in his survey. If he had, I'd have answered: Just about everything.

As I reported in last week's column, Brockman's respondents were forward-looking, describing cutting-edge research that will help combat global warming and other looming problems. My optimism is anchored in the past.

By almost any measure -- greater wealth, better health, diminishing levels of violence -- the world is good and getting better. My only regret is that I am alive today, because tomorrow will be even brighter.

Where to start with the good news? How about with the Big Kahuna: During the 20th century, life spans for the average American rose from 44 years to 77 as we tamed age-old scourges such as smallpox, malaria, polio and plague.

[...]

THE INDEPENDENT [1.20.07]

Richard Dawkins

Evolutionary biologist; Charles Simonyi Professor for the Understanding of Science, Oxford University; author, 'The God Delusion'

The final scientific enlightenment

I am optimistic that the physicists of our species will complete Einstein's dream and discover the final theory of everything before superior creatures, evolved on another world, make contact and tell us the answer. I am optimistic that, although the theory of everything will bring fundamental physics to a convincing closure, the enterprise of physics itself will continue to flourish, just as biology went on growing after Darwin solved its deep problem. I am optimistic that the two theories together will furnish a totally satisfying naturalistic explanation for the existence of the universe and everything that's in it, including ourselves. And I am optimistic that this final scientific enlightenment will deal an overdue death blow to religion and other juvenile superstitions.

Rodney A Brooks

Director, MIT Computer Science and Artificial Intelligence Laboratory (CSAIL); chief technical officer of iRobot Corporation; author, 'Flesh and Machines'

The 22nd century

I am optimistic about many things, especially the future. Just last week I met a number of people from the 22nd century, and they were delightful. We smiled and giggled together a lot but none of them seemed to speak a word of English. Even their Japanese was not so great just yet. But demographic analysis tells us that many of those little girls I saw in Kyoto will end up as citizens of the next century.

I am optimistic that even if none of the people I just met do so, then at least someone who is already alive will be the first person to make their permanent home off Earth. And next century my new young acquaintances will go to sleep at night on this planet knowing that humankind has spread itself out into the solar system. Some people will have done it for wealth. Others, driven by our evolutionarily selected urges, will have done it to once again mediate risks across our gene pool by spreading out into different environmental niches. But the wonder of it all is that those now old, but sprightly, women in Kyoto will be able to revel in the romance of the human spirit, always questing to learn, understand, explore, and be.

Brian Eno

Artist; composer; producer (U2, Talking Heads, Paul Simon); recording artist

Big government

Things change for the better either because something went wrong or because something went right. Recently, we've seen an example of the former, and this failure fills me with optimism.

The acceptance of the reality of global warming has, in the words of Sir Nicholas Stern in his report on climate change to the British government, shown us "the greatest and widest ranging market failure ever seen". The currency of conservatism for the past century has been that markets are smarter than governments: and this creed has reinforced the conservative resistance to anything resembling binding international agreements. The suggestion that global warming represents a failure of the market is therefore important. Technical solutions will hopefully be found, but the process will need to be primed and stoked and enforced by legislation that would be regarded as big-government socialism in the present climate. The future may be a bit more like Sweden and a bit less like America.

If a single first instance of global governance proves successful, it will strengthen its appeal as a way of addressing other problems - such as weapons control, energy management, money-laundering, conflict resolution, people-trafficking, slavery and poverty. It will become increasingly difficult for countries to stay outside of future treaties such as Kyoto - partly because of international pressure but increasingly because of pressure from their own populations.

Steven Pinker

Psychologist, Harvard University; author, 'The Blank Slate'

The decline of violence

In 16th-century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted on a stage and slowly lowered into a fire. According to the historian Norman Davies, "the spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonised". As horrific as present-day events are, such sadism would be unthinkable today in most of the world. This is just one example of the most important and under-appreciated trend in the history of our species: the decline of violence. Cruelty as popular entertainment, human sacrifice to indulge superstition, slavery as a labour-saving device, genocide for convenience, torture and mutilation as routine forms of punishment, execution for trivial crimes and misdemeanors, assassination as a means of political succession, pogroms as an outlet for frustration, and homicide as the major means of conflict resolution - all were unexceptionable features of life for most of human history. Yet today they are statistically rare in the West, less common elsewhere than they used to be, and widely condemned when they do occur.

Most people, sickened by the headlines and the bloody history of the 20th century, find this claim incredible. Yet as far as I know, every systematic attempt to document the prevalence of violence over centuries and millennia (and, for that matter, the past 50 years), particularly in the West, has shown that the overall trend is downward (though of course with many zigzags). What went right? No one knows, possibly because we have been asking the wrong question - "Why is there war?" instead of "Why is there peace?"

There have been some suggestions, all unproven. Perhaps the gradual perfecting of a democratic Leviathan - "a common power to keep [men] in awe" - has removed the incentive to do it to them before they do it to us. James Payne, author of The History of Force, suggests that it's because, for many people, life has become longer and less awful - when pain, tragedy and early death are expected features of one's own life, one feels fewer compunctions about inflicting them on others. The award-winning science writer Robert Wright points to technologies that enhance networks of reciprocity and trade, which make other people more valuable alive than dead. The Australian philosopher Peter Singer attributes it to the inexorable logic of the golden rule: the more one knows and thinks, the harder it is to privilege one's own interests over those of other sentient beings. Perhaps this is amplified by cosmopolitanism, in which history, journalism, memoir and realistic fiction make the inner lives of other people, and the contingent nature of one's own station, more palpable - the feeling that "there but for fortune go I..."

Larry Sanger

Co-founder, Wikipedia

Enlightenment

I am optimistic about humanity's coming enlightenment.

In particular, I am optimistic about humanity's prospects for starting exemplary new collaboratively developed knowledge resources. When we hit upon the correct models for collaborative knowledge-collection online, there will be a jaw-dropping, unprecedented, paradigm-shifting explosion in the availability of high-quality free knowledge.

Lord (Martin) Rees

President, The Royal Society; Professor of Cosmology & Astrophysics; Master, Trinity College, University of Cambridge; author, 'Our Final Century: The 50/50 Threat to Humanity's Survival'

The energy challenge

A few years ago, I wrote a short book entitled 'Our Final Century'. I guessed that, taking all risks into account, there was only a 50 per cent chance that civilisation would get through to 2100 without a disastrous setback. This seemed to me a far from cheerful conclusion. However, I was surprised by the way my colleagues reacted to the book: many thought a catastrophe was even more likely than I did, and regarded me as an optimist. I stand by this optimism.

There are indeed powerful grounds for being a techno-optimist. For most people in most nations, there's never been a better time to be alive. The innovations that will drive economic advance - information technology, biotech and nanotech - can boost the developing as well as the developed world. We're becoming embedded in a cyberspace that can link anyone, anywhere, to all the world's information and culture - and to every other person on the planet. Creativity in science and the arts is open to hugely more people than in the past. Twenty-first century technologies will offer lifestyles that are environmentally benign - involving lower demands on energy or resources than what we'd consider a good life today. And we could readily raise the funds - were there the political will - to lift the world's two billion most deprived people from their extreme poverty.

Later in this century, mind-enhancing drugs, genetics, and "cyborg" techniques may change human beings themselves. That's something qualitatively new in recorded history - and it will pose novel ethical conundrums. Our species could be transformed and diversified (here on Earth and perhaps beyond) within just a few centuries.

My number-one priority would be much-expanded research and development into a whole raft of techniques for storing energy and generating it by "clean" or low-carbon methods. The stakes are high - the world spends nearly three trillion dollars per year on energy and its infrastructure. This effort can engage not just those in privileged technical environments in advanced countries, but a far wider talent pool. Even if we discount climate change completely, the quest for clean energy is worthwhile on grounds of energy security, diversity and efficiency. This goal deserves a priority and commitment from governments akin to that accorded to the Manhattan project or the Apollo moon landing.

Peter Schwartz

Futurist; business strategist; co-founder, Global Business Network, a Monitor Company; author, 'The Long Boom'

Growing older

I turned 60 this year; several decades ago, I would have looked forward to a steady decline in all my physical and mental capabilities, leading to a long and messy death. The accelerating pace of biological and medical advances unfolding in front of us is heavily focused on reducing the infirmities of ageing and on curing the diseases of old age or transforming them from fatal to chronic. It means that 90 really will be the new 60, and there is a good chance that I will be among the vigorous new centenarians of mid-century, with most of my faculties working fairly well.

Vision, hearing, memory, cognition, bone and muscle strength, skin tone, hair and, of course, sexual vigour will all be remediable in the near future. Alzheimer's may be curable and most cancers are likely to be treatable if not curable. And regenerative medicine may truly lead to a real increase in youthfulness as new custom-grown organs replace old, less-functional ones. And, within a few decades, we are likely to be able to slow ageing itself, which could even lead to life beyond 120.

Judith Rich Harris

Independent investigator and theoretician; author, 'No Two Alike: Human Nature and Human Individuality'

Friendship

I am optimistic about human relationships - in particular, about friendship. Perhaps you have heard gloomy predictions about friendship: it's dying out, people no longer have friends they can confide in, loneliness is on the rise.

But friendship isn't dying out: it's just changing, adapting to the changes in the world. People are discovering different ways of getting together. It may be harder to find a bowling partner but it's easier to find someone to chat with, because there are more ways to chat.

When I was a child, people with chronic illnesses were described as "shut-ins". Now, a person can be shut in without being shut out. I have friends whom I know only through email conversations but who are as dear to me as my college roommate and dearer by far than my next-door neighbour.

W Daniel Hillis

Physicist; computer scientist; chairman, Applied Minds, Inc; author, 'The Pattern on the Stone'

Demographics

I am optimistic about humankind's ability to reach a sustainable balance with other life on Earth, in part because the number of humans on Earth will soon start to decrease. This doesn't mean that I think we should ignore our environmental problem - just the opposite: I think we should fight hard now with the confidence that we can win a complete and lasting victory.

We are so accustomed to watching the explosion of human growth and development that it is easy to imagine that this is normal. It is not. We are the first generation in history that has watched the human population double in our own lifetime, and no future generation is likely to see it again. All of those blights of growth that we have come to accept - crowded cities, jammed roads, expanding suburbs, fish-depleted oceans and tree-stripped forests - are symptoms of a one-of-a-kind surge in human expansion. Soon, they will be just memories.

There are currently over six billion people in the world. There will probably never be more than 10 billion. Population forecasts vary, but they all agree that human population growth is slowing. As people become more prosperous, they have smaller families. In every country where women are allowed free access to education and healthcare, rates of population growth are going down. Sometimes the trends are hidden by the delays of demographics, but the real population growth rates are already negative in Europe, China and, if we subtract immigration, the US. The total human population is still growing, but not as fast as it once was. Assuming that these trends continue, the total population of the world will be shrinking well before the end of this century.

There is no doubt that the environmental challenges of the next decades are daunting, and they will require all the power of human striving and creativity to overcome them. Yet I have no doubt that we will succeed. Innovation, good will and determined effort will be enough to handle the next few billion people. Then, as populations shrink, demands on resources will be reduced. Nature will begin to repair itself, reclaiming what we have so hastily taken. I hope we manage to keep the gorillas, elephants and rhinoceroses alive. By the end of the century, they will have room to roam.

Timothy Taylor

Archaeologist, University of Bradford; author, 'The Buried Soul'

Skeuomorphism

In a small wire tidy on my desk I have several corks. But they are not cork. In the 1980s, demand for high-quality cork began to outstrip supply. As low-grade cork often taints (or "corks") wine, substitutes were sought. My corks are synthetic. One is cork-coloured and slightly variegated to make it appear traditional; like real corks in the German Riesling tradition, it is stamped in black with a vine tendril motif. Another is less convincingly mottled, and is mid-yellow in colour with the name of the vintner, Gianni Vescovo, printed in bold black. Both these corks are skeuomorphs - objects that preserve formal vestiges of the constraints of an original no longer strictly necessary in the new material. First-generation skeuomorphs are close mimics, even fakes.

Second-generation skeuomorphs, such as the Vescovo cork, abandon any serious attempt at deception. Its mottling, and the fact that it is still a functional cork, rather than a metal screwtop closure, is a comforting nod to the history of wine. At the same time it signals a new, more consistent, freedom from contamination. As synthetic corks became more familiar, new and more baroque forms arose. These third-generation skeuomorphs are fun: a bright-purple cork that stoppered an Australian red suggests a grape colour, while a black cork has a hi-tech look that draws symbolic attention to the new techniques of low-temperature fermentation in stainless-steel.

I see much of the history of technology as an unplanned trajectory in which emergent skeuomorphic qualities often turn out to have been critical. Corks are a relatively trivial example in an extraordinary history of skeuomorphism, impossible to review here, but which encompasses critical turns in material development from prehistoric flint, via the discovery of metals and alloys, to complex compound objects, of which computers are a modern manifestation.

I grew up with Alan Turing's unsettling vision of a future machine indistinguishable from a human in its reactions. Ray Kurzweil's provocative prediction of the impending "singularity" - the point when computer intelligence would start to leave humans gasping in its intellectual wake - added to my fears.

I have recently become quite relaxed about all this. Computers' eventual power will probably not be in simulation or deception. Instead, by surpassing us in some areas, they will relieve our brains and bodies of repetitive effort. If they behave as other skeuomorphs before them, it will be computers' currently unimagined emergent qualities that we will come to value most, enhancing and complementing our humanity rather than competing with and superseding it.

David Bodanis

Writer; futurist; author, 'Passionate Minds'

Decency

I'm optimistic because there's a core decency in people that even the worst machinations of governments can't entirely hold down. The Evelina Hospital [in Southwark, London] is the first new children's hospital that's been built in the city in a century. There's a giant atrium in the middle, and the contract with the company doing the cleaning says that the window cleaners need to dress up as superheroes. The children in bed - many with grave illnesses - delight in seeing Superman and Spiderman dangling just inches away from them, on the outside of the glass; apparently for the cleaners it's one of the highlights of their week. The government has wasted a fortune on consultants, bureaucracy and reorganisations of the NHS. It's always defended in cold management-speak. This simple arrangement with the window cleaners cuts through all that. Everyone I've mentioned it to recognises that - and in that recognition lies our hope.

Gloria Origgi

Philosopher and researcher, Centre National de la Recherche Scientifique; author, 'Text-E: Text in the Age of the Internet'

Europe

I'm optimistic about Europe. On 30 May 2005, the day after the French rejected in a referendum the project of the European Constitution, I was travelling on the Thalys high-speed train from Paris to Brussels for a committee meeting at the European Community. The train was full of people my age - in their thirties - going to Brussels as "experts" in various domains to attend meetings and participate in various EC projects. I looked around and started chatting with my neighbours. The conversation was light, mainly about restaurants and bars in Brussels or new exhibitions and movies. Most of the people I spoke with came from more than one cultural background, with two or more nationalities in the family: say, father from Germany, mother from Ireland, grown up in Rotterdam. All of us were at least bilingual, many trilingual or more.

I quickly realised that asking the opening question of ordinary train encounters, "Where are you from?", had become patently obsolete. The image was quite at odds with the newspapers' and politicians' cliché of the prototypical EC officer as a grey, square, hideously boring civil servant in a checkered jacket, wasting time inventing useless bureaucratic rules. My neighbours epitomised the deep cultural change that is now taking place in Europe. A new generation has grown up - people born more than a quarter of a century after the end of the Second World War, now moving around Europe to study, work, meet, date, marry and have children with people from other European countries, and doing so as a matter of course.

Walter Isaacson

President & CEO, Aspen Institute; former CEO, CNN; former managing editor, 'Time'; author, 'Benjamin Franklin: An American Life'

Print as a technology

I am very optimistic about print as a technology. Words on paper are a wonderful information storage, retrieval, distribution and consumer product. Imagine if we had been getting our information delivered digitally to our screens for the past 400 years. Then some modern Gutenberg had come up with a technology that was able to transfer these words and pictures on to pages that could be delivered to our doorstep, and we could take them to the backyard, the bath, or the bus. We would be thrilled with this technological leap forward, and we would predict that someday it might replace the internet.

Simon Baron-Cohen

Psychologist, Autism Research Centre, Cambridge University; author, 'The Essential Difference'

The rise of autism

Whichever country I travel to, attending conferences on the subject of autism, I hear the same story: autism is on the increase. Thus, in 1978, the rate of autism was four in 10,000 children, but today (according to a Lancet article in 2006) it is 1 per cent. No one quite knows what this increase is due to, though conservatively it is put down to better recognition, better services, and broadening the diagnostic category to include milder cases such as Asperger's syndrome.

It is neither proven nor disproven that the increase might reflect other factors, such as genetic change or some environmental (eg, hormonal) change. And for scientists to answer the question of what is driving this increase will require imaginative research comparing historical as well as cross-cultural data. Some may throw up their hands at this increase in autism and feel despair and pessimism. They may feel that the future is bleak for all of these newly diagnosed cases of autism. But I remain optimistic that, for a good proportion of them, it has never been a better time to have autism.

Why? Because there is a remarkably good fit between the autistic mind and the digital age. Computers operate on the basis of extreme precision, and so does the autistic mind. Computers are systems, and the autistic mind is the ultimate systemiser. The inherently ambiguous and unpredictable world of people and emotions is a turn-off for someone with autism, but a rapid series of clicks of the mouse that leads to the same result every time that sequence is performed is reassuringly attractive. Many children with autism develop an intuitive understanding of computers in the same way that other children develop an intuitive understanding of people.

So, why am I optimistic? For this new generation of children with autism, I anticipate that many of them will find ways to blossom, using their skills with digital technology to find employment, to find friends, and in some cases to innovate.

George Dyson

Science historian; author, 'Darwin Among the Machines'

The return of commercial sail

I am optimistic about the return of commercial sail. Hybrid sail/electric vessels will proliferate by harvesting energy from the wind. Sailing ships turn wind energy directly into long-distance transport, but the practice was abandoned in an era of cheap fuel. The prospects deserve a second look. It is possible not only to conserve, but even to accumulate, fuel reserves by sailing around the world. Modern sailing-vessel design, so far, has been constrained by two imperatives: racing and the ability to sail upwind. Under favourable conditions, sails produce far more horsepower than is needed to drive a ship. At marginal sacrifice in speed, by running the auxiliary propulsion system in reverse, this energy can be stored. Hybrid vessels, able to store large amounts of energy, would be free to roam the world. The trade winds constitute an enormous engine waiting to be put to use. When oil becomes expensive enough, we will.

David Deutsch

Quantum physicist, Oxford University; author, 'The Fabric of Reality'

Whether solutions are possible

They always are. Why is that important? Firstly, because it is true. There is no anthropocentric spite built into the laws of physics, mandating that human improvement may proceed this far and no further. Nor is the dark, neo-religious fantasy true that Nature abhors human hubris, and always exacts a hidden price that outweighs any apparent success, so that "progress" always has to be in scare quotes. And secondly, because how we explain failure, both prospectively and retrospectively, is itself a major determinant of success. If we are optimistic that failure to improve ourselves means merely that we haven't found the solution yet, then success is never due to divine grace (nowadays known as "natural resources") but always to human effort and creativity, and failure is opportunity.

Matt Ridley

Science writer; founding chairman of the International Centre for Life; author, 'Francis Crick: Discoverer of the Genetic Code'

The future

The historian Macaulay said, in 1830: "We cannot absolutely prove that those are in error who tell us that society has reached a turning point, that we have seen our best days. But so said all who came before us and with just as much apparent reason." The enduring pessimism of human beings about the future does real harm by persuading people, especially the young, to retreat from adventure and enterprise into anomie. Sure, the world has problems: Aids, Islamofascism, carbon dioxide. But I bet we can solve them as we have solved others - such as smallpox, the population explosion and the high price of whale oil.

The full-length versions of these pieces (and many more) can be found at www.edge.org, a website founded by John Brockman. 'What Is Your Dangerous Idea?', by John Brockman (Editor), is published by Simon & Schuster, £12.99; 'What We Believe But Cannot Prove', by John Brockman (Editor), is published by Pocket Books, £7.99.


THE AGE — MELBOURNE [1.19.07]

THE WORD WENT OUT TO some of the world's leading scientists and thinkers: just what is your dangerous idea? Ideas defined as dangerous not because they're assumed to be false but because they might be true. Spanning multi-disciplinary topics including biology, genetics, neuroscience, psychology and physics, this volume is full of provocative, speculative and plain mischievous arguments. Ask people to play devil's advocate and the results are fascinating. Maybe we're all marionettes dancing on genetic strings; maybe we have no souls or perhaps we may all even "house homicidal circuits within our brains". One bright spark even posits that the very notion of disseminating dangerous ideas (even in a safe, playful medium such as this one) is itself dangerous because ideas can be powerful forces. ...

WEEKEND AMERICA [1.19.07]

 
It's time to set our watches again. The Doomsday Clock moved two minutes closer to midnight this week. This is of course just a symbolic clock that's wound every now and then by board members of the Bulletin of the Atomic Scientists. In 1947 the BAS started using the clock to measure how close we are to global nuclear annihilation. The closer the minute hand gets to midnight, the more doomed we are. And as of Wednesday, it's 11:55. We asked Weekend America's Sean Cole to look into how doomed we are and if there's anyone who might be able to offer a second opinion.

THE GUARDIAN [1.19.07]

In a sly joke, the deity who invented music made sure that the mathematical proportions of "pure" acoustic intervals don't quite add up properly. So in order to play harmonically rich music in different keys, you have to skew the tuning in one way or another. Our current method is called "equal temperament", which is what modern pianos have, and in which the major thirds are sharp and the fifths are flat. Lots of people think that Bach's Well-Tempered Clavier was propaganda for equal temperament, and that everyone settled on it soon thereafter. But, as Duffin's scholarly and enjoyably pugnacious book shows, that's not the case: Bach used a different temperament, which slightly favoured some keys over others. Equal temperament was not universally adopted until the 20th century, and Duffin thinks it should still not be the default. He offers cute capsule biographies of major thinkers in the tuning debates and arguments, based on score markings and other evidence, about what composers such as Haydn or Beethoven would have expected. He even imports early records into a software program to analyse the pitches and find out how people were tuning their violins. Most controversial is his argument that string players should play leading notes flatter than in equal temperament (in order to favour harmonic consonance over melodic shape), rather than sharper, as they have traditionally done. None the less, his fine book should make any contemporary musician think differently about tuning.
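The arithmetic behind that claim is easy to check. The following sketch (an editorial illustration, not part of the review) compares the equal-tempered major third and fifth with the pure frequency ratios of 5:4 and 3:2, measured in cents, where 1,200 cents make an octave.

```python
# Compare 12-tone equal-tempered intervals with "pure" (just-intonation) ratios.
# In equal temperament a major third is exactly 400 cents and a fifth 700 cents.
import math

def cents(ratio):
    """Interval size in cents for a given frequency ratio (1200 cents = one octave)."""
    return 1200 * math.log2(ratio)

pure_third = cents(5 / 4)   # ~386.3 cents
pure_fifth = cents(3 / 2)   # ~702.0 cents

print(f"major third: 400.0 tempered vs {pure_third:.1f} pure "
      f"(sharp by {400 - pure_third:.1f} cents)")
print(f"fifth:       700.0 tempered vs {pure_fifth:.1f} pure "
      f"(flat by {pure_fifth - 700:.1f} cents)")
```

The tempered third really is about 14 cents sharp while the fifth is only about 2 cents flat, which is why the compromise is far more audible in thirds than in fifths.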

The Original Accident, by Paul Virilio (Polity, £14.99)

To invent the train, says Virilio, is to invent the train wreck. Our society is predicated on the industrial accident, which only recently leapfrogged the natural disaster in destructive power. From this spiky proposition, Virilio, the ludic French "dromologist" (student of speed), careers off on a kind of intellectual rollercoaster that takes in Aristotle, Chernobyl, the twin towers, genetic engineering, the privatisation of police forces, cosmology, Rabelais, killer asteroids, and a wonderful short story by Ursula Le Guin, written from the point of view of a tree. Virilio's breakneck pattern-recognition method is apt to spark new thoughts in some readers' heads, even if his images are sometimes hostages to pedantry: "If knowledge can be shown as a sphere whose volume is endlessly expanding," Virilio writes, "the area of contact with the unknown is growing out of all proportion." Does it matter that the surface-area-to-volume ratio actually shrinks, not expands, when a sphere grows? I leave it to you to decide.
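For what it's worth, the reviewer's geometry holds up: for a sphere of radius r, the surface-to-volume ratio falls as the sphere grows, even though the surface area itself keeps increasing. A short check (an editorial note, not part of the review):

```latex
% Surface area, volume, and their ratio for a sphere of radius r
S = 4\pi r^{2}, \qquad V = \tfrac{4}{3}\pi r^{3},
\qquad \frac{S}{V} = \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}} = \frac{3}{r}
\;\longrightarrow\; 0 \quad \text{as } r \to \infty .
```

The area of contact still grows absolutely (as r squared), just ever more slowly relative to the volume it encloses.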

What Is Your Dangerous Idea?, edited by John Brockman (Simon & Schuster, £12.99)

The results of the 2005 Question at edge.org, posed by Steven Pinker, are in. Apart from an exasperating section about "memes" (are they still fashionable?) and a few Eeyorish dullards, it's a titillating compilation. Physicist Freeman Dyson predicts that home biotech kits will become common; others posit that democracy may be a blip and "on its way out", that "heroism" is just as banal as evil, and that it will be proven that free will does not exist. There are also far-out but thought-provoking notions: that, given the decadent temptations of virtual reality, the only civilisations of any species that survive to colonise the galaxy will be puritan fundamentalists; or that the internet may already be aware of itself. I particularly enjoyed cognitive scientist Donald D Hoffman's gnomic pronouncement that "a spoon is like a headache", and mathematician Rudy Rucker's robust defence of panpsychism, the idea that "every object has a mind. Stars, hills, chairs, rocks, scraps of paper, flakes of skin, molecules". Careful what you do with this newspaper after you've read it.

Weekend America [1.19.07]

Edge contributors Chris Anderson of TED and Robert Provine of the University of Maryland appear as the proponents of optimism on a program about optimism and the Doomsday Clock.


Science [1.10.07]

Qubits for dollars. Quantum computing guru David Deutsch is the first recipient of the $95,000 Edge of Computation Science Prize for researchers whose computer-related ideas touch on broader questions about life, the universe, and everything.

The 52-year-old Deutsch, at the University of Oxford, U.K., provided the first blueprints for a universal quantum computer in 1985, bringing to life an earlier suggestion from physicist Richard Feynman. Quantum computation, which theoretically is exponentially faster than classical computing, could potentially speed up calculations that currently hamper fields such as physics, biology, and nanotechnology.

"Deutsch clearly deserved the prize because of his seminal role in creating and furthering quantum computation", says physicist and computer scientist Seth Lloyd of the Massachusetts Institute of Technology in Cambridge, who was a judge. But it's an unusual reward that transcends disciplines; other nominees were from fields of computational biology, software development, and communications, he notes."I'll be very interested to see who wins it next," says Lloyd.

The prize is funded by philanthropist Jeffrey Epstein.

Reforma [1.9.07]

The virtual forum Edge proposes looking for reasons, not simply wishes, for optimism. Edge is a club that brings together, by its own account, some of the most interesting minds in the world. Its purpose is to stimulate discussion at the frontiers of knowledge. The intention is to reach the edge of the world's knowledge, seeking out the most complex and refined minds, bringing them together in a forum and having them ask one another the questions they ask themselves. The foundation thus acts as a supplier of problems and a home for replies. Each year it constitutes itself as a World Question Center.

The Times [1.9.07]

Multiverse enthusiasts have in turn accused the unification theorists of promissory triumphalism because nobody has yet demonstrated a credible unique theory, let alone predicted the values of any Goldilocks parameters. This acrimonious wrangling reveals deep divisions concerning the ultimate goal of science, the nature of physical reality and the place of conscious observers in the grand scheme of things. It raises far-reaching and unresolved problems, such as what is life and what is the universe? Over the past couple of decades, physicists, cosmologists, biologists and other scientists have discussed these foundational questions of science at a growing number of conferences and workshops, or expressed their opinions informally through websites such as www.edge.org or the Los Alamos electronic archive.
