2007

"WHAT ARE YOU OPTIMISTIC ABOUT?"


CONTRIBUTORS

Andrew Brown
Keith Devlin
Gerald Holton
Donald Hoffman
Piet Hut
Pamela McCorduck
David G. Myers
James O'Donnell
Howard Rheingold
Michael Shermer



DAVID G. MYERS
Social Psychologist, Hope College (Michigan); Author, A Quiet World: Living with Hearing Loss

Doubling Hearing Aid Functionality

I foresee a friendlier future for us 31 million Americans with hearing loss.  It's no news that cochlear implants, which were unavailable to my late-deafened mother, should spare me her fate.  But few people are aware that many more of us could benefit by doubling the functionality of our hearing aids.

We can dream of a future where hearing aids serve not only as sophisticated microphone amplifiers, but also as wireless loudspeakers that deliver clear, customized sound.  In theatres, auditoriums, worship centers, airport lounges, drive-up order stations, and home TV rooms, sound signals will be transmitted via in-the-ear loudspeakers, much like wi-fi transmissions to laptops.

Good news!  That future has arrived in the UK and much of Scandinavia, and now in more than one hundred West Michigan facilities, and it is coming to several other American cities.  When people experience public address or TV sound via "hearing aid compatible assistive listening" (with the flick of a hearing aid switch), they typically respond with amazed joy.  What's more, they report increased satisfaction with their hearing aids.

It's a challenge to persuade a nation to exchange its current hearing assistive technology (which requires locating, checking out, and wearing conspicuous headsets) for a technology that many more people would actually use.  But the results of our West Michigan experiment, and of another in 1,000 California homes, support my optimism.  Doubling hearing aid functionality will greatly increase hearing aid acceptance and use.

With on-the-horizon technology, we can also foresee music buffs with wireless ear bud loudspeakers.  When that day comes, having something in one's ear will become as mundane as glasses for the eyes, and millions of people with hearing loss will be enjoying fuller and more connected lives.


MICHAEL SHERMER
Publisher of Skeptic magazine, monthly columnist for Scientific American; Author, Why Darwin Matters

Science and The Decline of Magic

I am optimistic that science is winning out over magic and superstition. That may seem irrational, given the data from pollsters on what people believe. For example, a 2005 Pew Research Center poll found that 42 percent of Americans believe that "living things have existed in their present form since the beginning of time." The situation is even worse when we examine other superstitions, such as these percentages of belief published in a 2002 National Science Foundation study:

ESP                    60%
UFOs                   30%
Astrology              40%
Lucky numbers          32%
Magnetic therapy       70%
Alternative medicine   88%

Nevertheless, I take the historian's long view, and compared to what people believed before the Scientific Revolution, there is much cause for optimism. Consider what people believed a mere four centuries ago, just as science began lighting candles in the dark. In 16th- and 17th-century England, for example, almost everyone believed in sorcery, werewolves, hobgoblins, witchcraft, astrology, black magic, demons, prayer, and providence. "A great many of us, when we be in trouble, or sickness, or lose anything, we run hither and thither to witches, or sorcerers, whom we call wise men…seeking aid and comfort at their hands," noted Bishop Latimer in 1552. Saints were worshiped. Liturgical books provided rituals for blessing cattle, crops, houses, tools, ships, wells, and kilns, not to mention the sick, sterile animals, and infertile couples. In his 1621 book, Anatomy of Melancholy, Robert Burton explained, "Sorcerers are too common; cunning men, wizards, and white witches, as they call them, in every village, which, if they be sought unto, will help almost all infirmities of body and mind."

Just as alcohol and tobacco were essential anesthetics for the easing of pain and discomfort, superstition and magic were the basis for the mitigation of misfortune. As the great Oxford historian of the period, Keith Thomas, writes in his classic 1971 work Religion and the Decline of Magic, "No one denied the influence of the heavens upon the weather or disputed the relevance of astrology to medicine or agriculture. Before the seventeenth century, total skepticism about astrological doctrine was highly exceptional, whether in England or elsewhere." And it wasn't just astrology. "Religion, astrology and magic all purported to help men with their daily problems by teaching them how to avoid misfortune and how to account for it when it struck." With such sweeping power over nearly everyone, Thomas concludes, "If magic is to be defined as the employment of ineffective techniques to allay anxiety when effective ones are not available, then we must recognize that no society will ever be free from it." The superstitious we will always have with us.

Nevertheless, the rise of science ineluctably attenuated this near universality of magical thinking by proffering natural explanations where before there were only supernatural ones. Before Darwin, design theory (in the form of William Paley's natural theology, which gave us the "watchmaker" argument) was the only game in town, so everyone believed that life was designed by God. Today fewer than half believe that in America, the most religious nation of the developed democracies, and in most other parts of the world virtually everyone accepts evolution without qualification. That's progress.

The rise of science even led to a struggle to find evidence for superstitious beliefs that previously needed no propping up with facts. Consider the following comment from an early 17th-century book that shows how even then savvy observers grasped the full implications of denying the supernatural altogether: "Atheists abound in these days and witchcraft is called into question. If neither possession nor witchcraft (contrary to what has been so long generally and confidently affirmed), why should we think that there are devils? If no devils, no God."

Magic transitioned into empirical magic and formalized methods of ascertaining causality by connecting events in nature—the very basis of science. As science grew in importance, the analysis of portents was often done meticulously and quantitatively, albeit for purposes both natural and supernatural. As one diarist privately opined on the nature and meaning of comets: "I am not ignorant that such meteors proceed from natural causes, yet are frequently also the presages of imminent calamities."

Science arose out of magic, which it ultimately displaced. By the 18th century, astronomy replaced astrology, chemistry succeeded alchemy, probability theory dislodged belief in luck and fortune, city planning and social hygiene attenuated disease, and the grim vagaries of life became less grim, and less vague. As Francis Bacon concluded in his 1626 work, New Atlantis: "The end of our foundation is the knowledge of causes and the secret motions of things and the enlarging of the bounds of human empire, to the effecting of all things possible."

Sic itur ad astra — Thus do we reach the stars.


PAMELA MCCORDUCK
Writer; Author, Machines Who Think

Understanding What Really Happens To Humans In Groups

At seventeen I saw that contemporary literature—I studied it then, and hoped eventually to be part of it—is an abyss of despair.  No surprise: it reflects the unspeakable circumstances of the 20th century. Even so, it's no good thing to be seventeen and without hope. Luckily, chance brought me together with some scientists, and I discovered that in science, optimism was, and is, abundant. 

Since then, I've spent much of my life trying to persuade my friends in the humanities that optimism on behalf of the human condition is a plausible point of view. It isn't the only point of view—the 20th century's horrors are a fact, and most of them happened through human agency.  But the full life can support several points of view, often simultaneously, and my personal inclination is toward optimism, however qualified it must be.

For a long time my optimism centered on computing in general, and what kinds of benefits it might bring us. Events have shown I entertained far too modest an optimism—I'm embarrassed to say that the impact of the Internet, in particular the World Wide Web, eluded me completely at first. A few years ago, I returned to artificial intelligence, which I'd written about early on, and then gone away from. Press narratives were uncritical about the field's death throes, and I expected to write an elegy. Instead, I found a revelation. Artificial intelligence is not only robustly healthy, building on its very significant gains since I first wrote about it, but the field's present ambitions burst with, well, vitality.

Lately I've been examining a new aspect of computing, the modeling of human behavior in groups, small and large, beginning from the bottom up, playing out dynamically, as only computer models allow. Years ago, in a casual dinner conversation with a social scientist, I wondered aloud if what prevented us from understanding what really happens to humans in groups is that we haven't found the code.  I meant to make a vague comparison between the genetic code and something hypothetical that encoded human behavior. Instead of laughing, she solemnly agreed. 

Such a code is not yet on the horizon, but thanks to some marvelous new work by very gifted social scientists, its intimations are teasing us.   I'm optimistic that it will eventually be found. When it is, it will be a scientific triumph. It will open not just the future to our understanding, but also the past. It will be a human triumph.

What will it mean to have such a code? For one thing we can plan more intelligently. Want to wage a war? Call in the experts to run a few scenarios for you, laid out in bottom-up detail, humans and their interactions with each other and the terrain they're going to fight it out on. 

Watch silicon agents melt away to fight you another day; watch them reach out for help elsewhere. Once you watch the model run its course, maybe you don't want to fight that particular war after all. Want to predict the possible spread of a disease? Good, the silicon model will tell you how many will fall, and where you can intervene to pinch off contagion effectively, where it's a waste of effort. Want to figure out the ebb and flow of urban crime waves? And then how to prevent them? Play it out in silicon first. Why do humans cooperate, at least as much as they compete? Compare identical silicon societies, same people, same resources, but vary the amount of cooperation, the amount of competition.  Which one collapses? Which one survives?  Which one thrives? Where's the tipping point?
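
To make this concrete, here is a minimal sketch of the kind of bottom-up, agent-based simulation described above. It is written in Python with invented parameters and a made-up function name (run_epidemic); it is an illustration of the technique, not any researcher's actual model. Individual agents meet at random, a contagion passes between them with some probability, and an "intervention" knob reruns the identical silicon society under different policies so we can compare how many ultimately fall ill.

    import random

    def run_epidemic(n_agents=1000, contacts_per_step=4, p_transmit=0.05,
                     recovery_steps=10, intervention=0.0, seed=1):
        # Hypothetical toy model. States: 'S' susceptible, 'I' infected
        # (with a recovery countdown), 'R' recovered.
        random.seed(seed)
        state = ['S'] * n_agents
        timer = [0] * n_agents
        patient_zero = random.randrange(n_agents)
        state[patient_zero], timer[patient_zero] = 'I', recovery_steps

        while 'I' in state:
            for i in [k for k, s in enumerate(state) if s == 'I']:
                # The intervention knob scales down each infected agent's contacts.
                for _ in range(round(contacts_per_step * (1.0 - intervention))):
                    j = random.randrange(n_agents)
                    if state[j] == 'S' and random.random() < p_transmit:
                        state[j], timer[j] = 'I', recovery_steps
                timer[i] -= 1
                if timer[i] == 0:
                    state[i] = 'R'
        return state.count('R')  # everyone who was ever infected

    # The same society under three policies: compare the size of the outbreak.
    for level in (0.0, 0.25, 0.5):
        print("intervention", level, "-> total infected:", run_epidemic(intervention=level))

Even a toy like this shows the shape of the promise: vary one assumption, rerun the identical population, and watch where the tipping points lie.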

Perhaps as interesting, we'll be able to reach backward in time. How, really, did Mesopotamia become a desert when once it had supported a network of rich societies?  How much of that collapse was climate change, how much human folly? Build a model of early modern Europe and show what really caused the European Renaissance. Compute in detail how Great Britain came to rule the waves—and then didn't any more.

We assume we've solved some of these problems, though historians dispute one another ferociously, as do epidemiologists, as do economists, sometimes over details, sometimes over emphasis, sometimes over fundamental assumptions. Here comes a chance to nail it down, and these techniques offer us insights we couldn't get any other way. Finding the code I once thought was only hypothetical will revolutionize our view of who we are, how we got that way, and who we might become, the same way cracking the genetic code revolutionized biology.

But of course I'm chronically too modest in my hopes, so you can comfortably hope for more.


HOWARD RHEINGOLD
Communications Expert; Author, Smart Mobs

The tools for cultural production and distribution are in the pockets of 14-year-olds

The tools for cultural production and distribution are in the pockets of 14-year-olds. This does not guarantee that they will do the hard work of democratic self-governance: the tools that enable the free circulation of information and communication of opinion are necessary but not sufficient for the formation of public opinion. Ask yourself this question: Which kind of population seems more likely to become actively engaged in civic affairs — a population of passive consumers, sitting slack-jawed in their darkened rooms, soaking in mass-manufactured culture that is broadcast by a few to an audience of many, or a world of creators who might be misinformed or ill-intentioned, but in any case are actively engaged in producing as well as consuming cultural products? Recent polls indicate that a majority of today's youth — the "digital natives" for whom laptops and wireless Internet connections are part of the environment, like electricity and running water — have created as well as consumed online content. I think this bodes well for the possibility that they will take the repair of the world into their own hands, instead of turning away from civic issues, or turning to nihilistic destruction.

The eager adoption of web publishing, digital video production and online video distribution, social networking services, instant messaging, multiplayer role-playing games, online communities, virtual worlds, and other Internet-based media by millions of young people around the world demonstrates the strength of their desire — unprompted by adults — to learn digital production and communication skills. Whatever else might be said of teenage bloggers, dorm-room video producers, or the millions who maintain pages on social network services like MySpace and Facebook, it cannot be said that they are passive media consumers. They seek, adopt, appropriate, and invent ways to participate in cultural production. While moral panics concentrate the attention of oldsters on lurid fantasies of sexual predation, young people are creating and mobilizing politically active publics online when circumstances arouse them to action. In Los Angeles, 25,000 high school students used MySpace to organize a walk-out from classes to join street demonstrations protesting proposed immigration legislation. Other young people have learned how to use the sophisticated graphic rendering engines of video games as tools for creating their own narratives; in France, disaffected youth, the ones whose riots are televised around the world, but whose voices are rarely heard, used this emerging "machinima" medium to create their own version of the events that triggered their anger (search for "The French Democracy" on video hosting sites). Not every popular YouTube video is a teenage girl in her room (or a bogus teenage girl in her room); increasingly, do-it-yourself video has been used to capture and broadcast police misconduct or express political opinions. Many of the activists who use Indymedia — ad-hoc alternative media organized around political demonstrations — are young.

My optimism about the potential of the generation of digital natives is neither technological determinism nor naive utopianism. Many-to-many communication enables but does not compel or guarantee widespread civic engagement by populations who never before had a chance to express their public voices. And while the grimmest lesson of the twentieth century is to mistrust absolutist utopians, I perceive the problem to be in the absolutism more than the utopia. Those who argued for the abolition of the age-old practice of human slavery were utopians.


GERALD HOLTON
Mallinckrodt Research Professor of Physics and Research Professor of History of Science, Harvard University; Author, Thematic Origins of Scientific Thought

The Increasing Coalescence of Scientific Disciplines

Under our very eyes, research in science has been taking a courageous and promising turn, to realize in our time an ancient dream.

Since Thales and other philosophers of Ionia, over 2,500 years ago, there has been an undying hope that under all the diverse and fluctuating phenomena, there could be found in Nature a grand, majestic order. This fascination, the "Ionian Enchantment," has persisted ever since in various forms.

Thus, Isaac Newton thought the mechanical forces that explained the motions of the solar system would also turn out to run all else, including human senses. After Darwin's magnificent synthesis, many attempts were made to extend it to include all societal phenomena. The influential Austrian polymath Ernst Mach, to whom young Einstein referred as one of his most important influences, taught that the true task of scientific research is to establish a form of fundamental science, an Einheitswissenschaft, on which is based every different specialty. From about 1910 on, an increasing number of scientists in Europe and America gave allegiance to the idea of the "Unity of Science," a widespread movement hoping to find functioning bridges not only between different sciences but also between science and philosophy—Niels Bohr being one of the prominent promoters.

But, by and by, it became clear that such hopes were at best premature, that there was not enough of what William James had called "cash value," in terms of actual accomplishments secured—not least a Unified Field Theory, which was never attained. At one of the last meetings devoted to discussions about the Unity of Science, in 1956, J. Robert Oppenheimer, with typical eloquence, offered a valedictory to the Ionian Enchantment, with these words:

"It may be a question [whether there] is one way of bringing a wider unity in our time. That unity, I think, can only be based on a rather different kind of structure than the one most of us have in mind....The unity we can seek lies really in two things. One is that the knowledge that comes to us in such terrifyingly inhumanly rapid rate has some order in it....The second is simply this: We can have each other to dinner. We ourselves, and with each other by our converse, can create, not an architecture of global scope,but an immense, intricate network of intimacy, illumination, and understanding."

But even as such opinions were accepted with resignation, something new had been born, quietly grew, and in our time has become the source of increasing optimism about the value of the old dream—by turning in a new direction. I mean that scientific research, at first only sporadically during the last century, but more and more in our time, has been successfully reaching out for a new sort of unity—in practice, for an integration among disciplinary fragments. This time the movement is not driven by a philosophy of science or a search for the Ur-science. Rather it is appearing as if spontaneously in the pursuit and progress of research science itself.

There is an increasing coalescence of scientific disciplines in many areas. Thus the discovery of the structure of the genome not only required contributions from parts of biology, physics, chemistry, mathematics, and information technology, but in turn led to further advances in biology, physics, chemistry, technology, medicine, ecology, and even ethics. And all this scientific advance is leading, as it should, to the hopeful betterment of the human condition (as had also been one of the platform promises of the Unity of Science movement, especially in its branch in the Vienna Circle).

Similar developments happen in the physical sciences—a coalescence of particle physics and large-scale astronomy, of physics and biology, and so forth. It is a telling and not merely parochial indicator that about half of my 45 colleagues in the Physics Department, owing to their widespread research interests, now have joint appointments with other departments at the University: with Molecular and Cellular Biology, with Mathematics, with Chemistry, with Applied Sciences and Engineering, with History of Science. Just now, a new building is being erected next to our Physics Department. It has the acronym LISE, which stands for the remarkable name, Laboratory of Integrated Science and Engineering. Although in industry, here and there, equivalent labs have existed for years, the most fervent follower of the Unity of Science movement would not have hoped then for such an indicator of the promise of interdisciplinarity. But as the new saying goes, most of the easy problems have been solved, and the hard ones need to be tackled by a consortium of different competences.

In other parts of this university, plans are under way to set up a program for higher degrees in the new field of Systems Biology, which has the goal of reaching "an integrated understanding" of biological/medical processes; that program is to bring together faculty and students from biology, medicine, chemistry, physics, mathematics, computation and engineering. And these parochial examples are indications of a general trend in many universities. The new passwords to success are "integration" and "interdisciplinarity." If an "official" sacralization of this movement were needed, it would be the 2005 release of a big volume by the National Academy of Sciences, with the title "Facilitating Interdisciplinary Research."

All this is not precisely what the philosophers and scientists, from Thales on, were hoping for. We will not, at least not for a long time, have that grand coalescence of all sciences and more. What has come lacks exalted philosophical pretensions, being instead a turn to weeks and years of many-heads-together, hands-on work on specific, hard problems of intense scientific interest, many of them also of value to society at large.

And, of course, these co-workers can also still have each other to dinner.


DONALD HOFFMAN
Cognitive Scientist, UC Irvine; Author, Visual Intelligence

We Will Soon Devise a Scientific Theory for the Perennial Mind-Body Problem

The enigmatic relation between conscious experiences and the physical world, commonly known as the mind-body problem, has frustrated philosophers at least since Plato, and now stonewalls scientists in their attempts to construct a rigorous theory. Yet I am optimistic that, despite millennia of prior failures, we will soon devise a scientific theory for this perennial problem.

Why such optimism? First, the mind-body problem is now recognized as a legitimate scientific problem. In 2005, the journal Science placed it second in a list of 125 open questions in science. During the twentieth century, a multi-decade detour into behaviorism sidelined scientific investigation of the mind-body problem. But three decades into the cognitive revolution, the problem was dusted off and again given serious scientific attention.

Second, scientists soon rediscovered that the problem is surprisingly hard. Neurophysics, real and artificial neural networks, classical and quantum algorithms, information and complexity—standard tools that prove powerful in the study of perception, cognition and intelligence—have yet to yield a single scientific theory of conscious experiences. We cannot, for instance, answer the basic question: Why must this particular pattern of neural activity or this functional property cause, or be, this particular conscious experience (say, the smell of garlic) instead of that other conscious experience (say, the smell of a truffle), or instead of no conscious experience at all? Precise predictions of this type, de rigueur for genuine scientific explanations, have yet to be fashioned, or even plausibly sketched, with the standard tools.

Third, although science is laudably conservative, when pushed to the wall by recalcitrant data and impotent theories, scientists have repeatedly proved willing to reexamine dearly held presuppositions and to revise or jettison the ineffectual in favor of unorthodox assumptions, provided that these assumptions permit the construction of explanatory theories that answer to data. Aristarchus, then Copernicus, countenanced a heliocentric solar system, Newton action at a distance, Einstein quanta of light and distortions of space-time, Bohr probability waves, superpositions and nonlocality. Theories of quantum gravity now posit eleven dimensions, vibrating membranes, and pixels of space and time. The initial response to such proposals is, invariably, widespread incredulity. But considerations of explanatory power and empirical adequacy, wherever they point, eventually win the day. Scientists revise their offending presuppositions, adjust psychologically as best they can to the new world view, and get on with the business of science in the new framework.

Evidence is mounting that the mind-body problem requires revision of deeply held presuppositions. The most compelling evidence to date is the large and growing set of proposals now on offer. All are nonstarters. They are, to quote Pauli, not even wrong. We have yet to see our first genuine scientific theory of the mind-body problem. This has prompted some to conclude that Homo sapiens has been cheated by evolution and simply lacks the requisite concepts: Those concepts necessary for us to survive long enough to reproduce did not include those necessary to solve the mind-body problem. If so, there is little hope, at present, for swift progress.

I am optimistic, however, that the obstacle is not in our genes but in our presuppositions. Tinkering with presuppositions is more clearly within the purview of current technology than tinkering with our genes. Indeed, tinkering with one's presuppositions requires no technology, just a ruthless reconsideration of what one considers to be obviously true. Science has risen to the task before. It will rise again. But progress will be tortuous and the process psychologically wrenching. It is not easy, even in the light of compelling data and theories, to let go of what once seemed obviously true.

Here are some obvious truths that guide current attempts to solve the mind-body problem: Physical objects have causal powers. Neural activity can cause conscious experiences. The brain exists whether or not it is observed. So too does the moon, and all other physical objects. Consciousness is a relative latecomer in the evolution of the universe. Conscious sensory experiences resemble, or approximate, true properties of an independently existing physical world.

Will we soon be forced to relinquish some of these truths? Probably. If so, the current ontological predilections of science will require dramatic revision. Could science survive? Of course. The fundamental commitments of science are methodological, not ontological. What is essential is the process of constructing rigorous explanatory theories, testing them with careful experiments, and revising them in light of new data. Ontologies can come and go. One might endure sufficiently long that it is taken for a sine qua non of science. But it is not. An ontology breathed into life by the method of science can later be slain by that same method. Therein lies the novel power of science. And therein lies my optimism that science will soon succeed in fashioning its first theory of the mind-body problem. But at the feet of that theory will probably lie the slain carcass of an effete ontology.


ANDREW BROWN
Journalist, The Guardian; Author, The Darwin Wars

A Proper Scientific Understanding of Irrationality In General, and of Religion In Particular

I'm not actually optimistic about anything very much, but it's clear that if civilisation is to survive, we need a proper scientific understanding of irrationality in general, and of religion in particular. To be optimistic about that is a precondition for optimism about anything else. What might such an understanding look like?

For a start, it would be naturalistic and empirical. It would not start from definitions of religion or faith, but from a careful study, in the spirit of William James, of how it is that religious people actually behave and believe. What would be found, again in a Jamesian spirit, is that there are varieties of religious behaviour, as there are varieties of religious experience. We would need to know how these are related to each other, and to other things that are not described as religious. It may well be that "religion" is a concept no more useful than phlogiston.

It would take seriously Dan Dennett's distinction between beliefs and opinions—more seriously, I think, than he sometimes does himself. A belief, in Dennett's sense, is a kind of behaviour or a propensity to behave as if certain things were true. It need not be conscious at all. The kind of conscious, articulable propositions about the world which most people mean by "belief" he calls an "opinion".

In this sense, an enquiry into religious belief would be distinct from an enquiry into religious opinions: Religious "belief" would involve all of the largely unconscious mechanisms which lead people to behave superstitiously, or reverently, or with a disdain for heretics; religious opinions would be the reasons that they give for this behaviour. We need to understand both. It may be that their opinions would correspond to their beliefs but that is something to be established in every case by empirical enquiry. It's obvious that in most cases they don't. Intellectuals are supposed to be motivated by their opinions; some of them actually are. But everyone is motivated by their beliefs and prejudices as well.

In particular, such an enquiry would be very careful about what counts as evidence. A friend of mine who does consciousness research once said sourly that "The problem with the brain is that if you go looking for something in there, you're very liable to find it." Similarly, if you go looking for some particular quality in religious belief you are likely to find it there, as well as its opposite. What's needed is the distinctly scientific attitude that takes disconfirming evidence seriously, and doesn't respond to it by simply repeating the confirming evidence.

Last night I happened to see "On Religion", a play by the British atheist philosopher AC Grayling, which is an excellent dramatisation of some of these issues. The atheist character, a woman lecturer, is given a speech in which she recounts the story of a scientist who has spent fifteen years arguing that the Golgi apparatus does not in fact exist: it is an artifact of the inadequacies of our microscopes. Finally, he attends a lecture by a visiting cell biologist who proves conclusively that the Golgi apparatus does exist. And, just as the whole department is trying to avoid his eye out of sympathetic shame, he rushes up to the lecturer, grabs his hand, and says "My dear fellow, I wish to thank you. I have been wrong these fifteen years." It is an improving and inspiring story, which pitches over into bathos as soon as the atheist spells out the moral. "No religious person could ever say that," she says. Has she really never heard of the phenomenon of conversion? What do the converted say, if not that some evidence has convinced them they were wrong all their lives before?

So, I think, if I am to be optimistic, that there will be a real breakthrough in the empirical study of religion, at the end of which no scientist will ever feel able to assert that "no religious person could ever say" without making a careful enquiry into what religious people actually do say and what they mean by it.


PIET HUT
Professor of Astrophysics, Institute for Advanced Study, Princeton

The Real Purity of Pure Science

I grew up reading heroic stories about progress in science, the absolute superiority of the scientific method, the evil of superstition, and other one-dimensional optimistic views.

Almost half a century later, I have a much more nuanced view of progress, method, and ways of looking at the world. What has been presented as the scientific method, at any given time, has been a simplified snapshot of an intrinsically much more opportunistic enterprise. As such, much damage has been done by suggesting that other areas, from social science and economics to politics, should adopt such a simple and always outdated picture.

The strength of science is not at all its currently accepted method. The strength is the fact that scientists allow the method to change.

The way the method changes is the exact same way through which progress is made by applying the method in doing everyday research. Change of method takes place slowly and carefully, through long and detailed peer discussions, and may be almost imperceptible in any given field during the lifetime of a scientist. The scientific method is like spacetime in general relativity: it provides the stage for matter to play, but the play of matter in turn affects the stage.

The real basis for the success of science is its unique combination of progressive and conservative elements. A scientist gets brownie points for crazy new ideas, as long as they are really interesting and stimulating, and also for being extremely conservative in criticizing any and all new ideas, as long as the criticism can be shown to be valid. What is interesting in new ideas and what is valid in criticism thereof is determined solely by peer review, by the collective opinions of the body of living scientists, not by falling back on some kind of fixed notion of a method.

My optimism is that other areas of human activities can learn from science to combine conservative and progressive approaches, taming the usual black-white duality in a collaborative dance of opposites.

Pure science has been held up as a beacon of hope, as a way to allow scientists to pursue their own intuitions, and thus to find totally new solutions to old problems. This is seen in contrast to applied science, where short-term goals do not allow sufficient room for finding really new approaches. Indeed, the irony here is that the best applications of science are ultimately based on pure, rather than applied, research.

The moral of the story has been to say that long-term research should not focus on goals, but rather it should let the scientific method follow its own course. Purified from goals, the scientific method is held up as the beacon to follow. But I think this story is still misleading. The greatest breakthroughs have come from a doubly pure science, purified from goals and methods alike. In small and large ways, each major breakthrough was exactly a breakthrough because it literally broke the rules, the rules of the scientific method as it had been understood so far. The most spectacular example has been quantum mechanics, which changed dramatically even the notion of experimental verification.

I am optimistic that all areas of human activities can be inspired by the example of science, which has continued to thrive for more than four centuries, without relying on goals, and without even relying on methods. The key ingredients are hyper-critical but non-dogmatic conservatism, combined with wildly unconventional but well-motivated progressiveness. Insofar as there is any meta-method, it is to allow those ingredients to be played off against each other in the enactments of scientific controversies, until consensus is reached.


KEITH DEVLIN
Mathematician; Executive Director, Center for the Study of Language and Information, Stanford; Author, The Millennium Problems

We Will Finally Get Mathematics Education Right

For the first time since Euclid started the mathematics education ball rolling over two thousand years ago, we are within a generation of eradicating innumeracy and being able to bring out the mathematical ability that research has demonstrated conclusively is within (almost) everyone's reach. The key to this development (actually two developments, one in the developing world, the other in affluent, technology-rich societies) is technology (actually two technologies).

First, the developing world. Forget the $100 laptop, which I think has garnered the support it has only because of the track record and charisma of its principal advocate (Nicholas Negroponte); the ubiquitous computing device that will soon be in every home on the planet is the mobile phone. Despite the obvious limitations of a small screen and minimal input capability, with well-crafted instructional materials it will provide the developing world with accessible education in the basic numerical and quantitative reasoning skills that will enable people to escape from the poverty trap by becoming economically self-sufficient. Such a limited delivery system would not work for an affluent consumer who has choices, but for someone highly motivated by the basic desires of survival and betterment, who has no other choice, it will be life-transforming.

At the other end of the economic spectrum, the immersive, three-dimensional virtual environments developed by the gaming industry make it possible to provide basic mathematical education in a form that practically everyone can benefit from.

Because for over two thousand years mathematics had to be communicated, learned, and carried out through written symbols, we may have lost sight of the fact that mathematics is no more about symbols than music is about musical notation. In both cases, specially developed, highly abstract, stylized notations enable us to capture on a page certain patterns of the mind, but in both cases what is actually captured in symbols is a dreadfully meager representation of the real thing, meaningful only to those who master the arcane notation and are able to recreate from the symbols the often profound beauty they represent. Never before in the history of mathematics have we had a technology that is ideally suited to representing and communicating basic mathematics. But now, with the development of manufactured, immersive, 3D environments, we do.

For sure, not all mathematics lends itself to this medium. But by good fortune (actually, it's not luck, but that would be too great a digression to explain) the medium will work, and work well, for the more basic mathematical life-skills that are of the most value to people living in modern developed societies.

Given the current cost of developing these digital environments (budgets run into the millions of dollars), it will take some years before this happens. We can also expect resistance from mathematics textbook publishers (who currently make a large fortune selling a product that has demonstrably failed to work) and from school boards who still think the universe was created by an old guy with a white beard (no, not Daniel Dennett) 6,000 years ago. But as massive sales of videogames drive their production costs down, the technology will soon come within reach of the educational world.

This is not about making the learning of mathematics "fun." Doing math will always be hard work, and not everyone will like it; its aficionados may remain a minority. But everyone will achieve a level of competency adequate for their lives.

Incidentally, I don't think I am being swayed or seduced by the newest technology. Certainly, I never thought that television, or the computer, or even artificial intelligence, offered a path to effective math learning. What makes immersive 3D virtual environments the perfect medium for learning basic math skills is not that they are created digitally on computers. Nor is it that they are the medium of highly seductive videogames. Rather, it is that they provide a means for simulating the real world we live in, out of which mathematics arises, and for doing so in a way that brings out the underlying mathematical structure of our world and confronts the player (i.e., the learner) with it. If Euclid were alive today, this is how he would teach math.


JAMES O'DONNELL
Classicist; Cultural Historian; Provost, Georgetown University; Author, Augustine: A New Biography

Scientific Discoveries Are Surprisingly Durable

Anna Karenina famously begins with the line, "Happy families are all alike; every unhappy family is unhappy in its own way." A little less famously and a great deal more astutely, Nabokov turned the line on its head at the opening of Ada: "All happy families are more or less dissimilar; all unhappy ones are more or less alike." I'm with Nabokov, and that's why I can be an optimist.

Of course in the long run, optimism is impossible. Entropy is unforgiving: even a historian knows that.

And history repeats itself. The same stupidities, the same vengeances, the same brutalities are mindlessly reinvented over and over again. The study of history can help the educated and the wise avoid the mistakes of the past, but alas, it does nothing to help the numbskulls.

But the study of the past and its follies and failures reveals one surprising ground for optimism. In the long run, the idiots are overthrown or at least they die. On the other hand, creativity and achievement are unique, exciting, liberating—and abiding. The discoveries of scientists, the inventions of engineers, the advances in the civility of human behavior are surprisingly durable. They may be thwarted or attacked, and at any given moment it may seem that the cause of women's rights is beleaguered in too many places in the world. But the idea of women's equality with men is not going away. Too few students may master the natural sciences, but the understanding enshrined in Newton's laws of motion and the calculus is not going away. Too many people may eat and smoke their way to early graves, but the accurate understanding of the mechanisms of the human body and how they can be healed and repaired and kept healthy—that's not going away either.

After all, we started out in the African savannah, trying to run fast enough to catch up with things we could eat and fast enough to stay away from things that could eat us. Our natural destiny is to squat in caves and shiver, then die young. We decided we didn't like that, and we figured out how to do better. Even if the numbskulls get their way and we were to wind up back in a cave, we would remember—and we wouldn't be in the cave long. We do not remember everything, and there are losses. But we turn out to be a stubbornly smart, resilient and persistent species, and we do not forget the most important things.



