Edge: THE NEW HUMANISTS



JOHN BROCKMAN is publisher and editor of Edge. His books include The Third Culture; The Next Fifty Years: Science in the First Half of the Twenty-First Century; and The New Humanists: Scientists at the Edge.



TABLE OF CONTENTS

Introduction: The New Humanists

Part I: Homo sapiens


Jared Diamond: A New Scientific Synthesis of Human History

Why did human development proceed at such different rates on different continents for the last 13,000 years? Historians tend to avoid this subject like the plague, because of its apparently racist overtones. Many people, or even most people, assume that the answer involves biological differences in average IQ among the world's populations, despite the fact that there is no evidence for the existence of such IQ differences. In case the stink of racism still makes you feel uncomfortable about exploring this subject, just reflect on the underlying reason that so many people accept racist explanations of history's broad pattern: We don't have a convincing alternative explanation. Until we do, people will continue to gravitate by default to racist theories. That leaves us with a huge moral gap, which constitutes the strongest reason for tackling this uncomfortable subject.

JARED DIAMOND is a professor of geography at UCLA, a MacArthur Fellow, winner of the National Medal of Science, and author of The Third Chimpanzee (awarded the British Science Book Prize and a Los Angeles Times Book Prize) and the Pulitzer Prize-winning Guns, Germs, and Steel.

Steven Pinker: A Biological Understanding of Human Nature

I believe that there is a quasi-religious theory of human nature prevalent among pundits and intellectuals which includes both empirical assumptions about how the mind works and a set of values that people hang on those assumptions. The theory has three parts. The first is the Blank Slate—that we have no inherent talents or temperaments because the mind is shaped completely by the environment (parenting, culture, and society). The second is the myth of the Noble Savage—that evil motives are not inherent in people but spring from corrupting social institutions. The third is the Ghost in the Machine—that the most important part of us is somehow independent of our biology, so that our ability to have experiences and make choices can't be explained by our physiological makeup and evolutionary history.

STEVEN PINKER, a research psychologist, is Peter de Florez Professor in the Department of Brain and Cognitive Sciences at MIT, and the author of Language Learnability and Language Development; Learnability and Cognition; The Language Instinct; How the Mind Works; Words and Rules: The Ingredients of Language; and The Blank Slate: The Modern Denial of Human Nature.

Helena Cronin: Getting Human Nature Right

Certainly, human nature is fixed. It's universal and unchanging, common to every baby that's born, down through the history of our species. But human behavior, which is generated by that nature, is endlessly variable and diverse. After all, fixed rules can give rise to an inexhaustible range of outcomes. Natural selection equipped us with the fixed rules—the rules that constitute our human nature. And it designed those rules to generate behavior that's sensitive to the environment. So, the answer to genetic determinism is simple. If you want to change behavior, just change the environment. And to know which changes would be appropriate and effective, you have to know those Darwinian rules. You need only to understand human nature, not to change it.

HELENA CRONIN is a codirector of the London School of Economics' Centre for Philosophy of Natural and Social Sciences, where she runs the wide-ranging and successful program called Darwin@LSE, which fosters research at the forefront of evolutionary theory. She is the author of The Ant and the Peacock: Altruism and Sexual Selection from Darwin to Today.

Andy Clark: Natural-Born Cyborgs?

Our brains are (by nature) unusually plastic; their biologically proper functioning has always involved the recruitment and exploitation of nonbiological props and scaffolds. More so than any other creature on the planet, we humans emerge as natural-born cyborgs, factory-tweaked and primed so as to be ready to grow into extended cognitive and computational architectures—ones whose systemic boundaries far exceed those of skin and skull.

ANDY CLARK is professor of philosophy and director of the cognitive science program at Indiana University. He was previously professor of philosophy at Sussex University, UK, and director of the Philosophy/Neuroscience/Psychology Program at Washington University in St. Louis. He is the author of Microcognition: Philosophy, Cognitive Science, and Parallel Distributed Processing; Associative Engines; Being There: Putting Brain, Body, and World Together Again; Mindware: An Introduction to the Philosophy of Cognitive Science; and Natural-Born Cyborgs.

Marc D. Hauser: Animal Minds

In my own work, we've begun looking at the kinds of computations that animals and human infants are capable of when they interact with the physical and social world. We want to understand how such capacities evolved and how they constrain thought.

MARC D. HAUSER is a cognitive neuroscientist at Harvard University, where he is a Harvard College Professor, a professor in the Department of Psychology and the Program in Neurosciences, and director of the Mind, Brain, and Behavior Program. He is the author of The Evolution of Communication; Wild Minds: What Animals Really Think; and two forthcoming books—People, Pets, or Property? and Ought: The Inevitability of a Universal Moral Grammar.

Richard Wrangham: The Evolution of Cooking

A lot of people find it hard to live with the idea that we've had a natural history of violence. But if we look at ourselves as animals, it's clear that natural selection has favored emotions in men that predispose them to enjoy competition, to enjoy subordinating other men, to enjoy even killing other men. These are difficult ideas to accept, and there are people who argue that it's inappropriate to write about such ideas, and they look for ways to undermine the evidence. What they seem to fear is that once a biological component in our violent behavior is recognized, then violence may be seen as inevitable.

RICHARD WRANGHAM is a professor of biological anthropology at Harvard University who studies chimpanzees in Uganda with an eye to illuminating human evolution and behavior. One of Wrangham's central ideas is that we should cherish the parallels between humans and other great apes, because they help us to understand our own behavior. "For all our self-consciousness, we humans continue to follow biological rules," he notes. "Life is easier if we understand those rules. Recognition of the deep contradictions in humanity binds us to our past and also lights our future." Wrangham is the author, with Dale Peterson, of Demonic Males: Apes and the Origins of Human Violence.

Daniel C. Dennett: The Computational Perspective

When I go to a workshop or conference and give a talk, I'm actually doing research, because the howls and screeches and frowns that I get from people, the way in which they react to what I suggest, is often diagnostic of how they are picturing the problems in their own minds. And in fact people have very different covert images about what the mind is and how the mind works. The trick is to expose these images, to bring them up into public view and then correct them. That's what I specialize in.

DANIEL C. DENNETT is University Professor, professor of philosophy, and director of the Center for Cognitive Studies at Tufts University. A philosopher by training, he is known as the leading proponent of the computational model of the mind. He has made significant contributions in fields as diverse as evolutionary theory, artificial intelligence, cognitive science, animal studies, and computer science. He is the author of Content and Consciousness, Brainstorms, Elbow Room, The Intentional Stance, Consciousness Explained, Darwin's Dangerous Idea, Kinds of Minds, Brainchildren, and Freedom Evolves. With Douglas Hofstadter, he coedited The Mind's I, and he is the author of over 200 scholarly articles on various aspects of the mind, published in journals ranging from Artificial Intelligence and Behavioral and Brain Sciences to Poetics Today and the Journal of Aesthetics and Art Criticism.

Stephen M. Kosslyn: What Shape Are a German Shepherd's Ears?

There is a gigantic project, yet to be done, that will root psychology in the rest of natural science. Once this is accomplished, you'll be able to go from phenomenology (things like mental imagery) to information processing, to the brain, down through the workings of the neurons, including the biochemistry, all the way to the biophysics and the way that genes are up-regulated and down-regulated. This is going to happen; I have no doubt at all. When it does, we're going to have a vastly better understanding of human nature than at any other time in human history.

STEPHEN M. KOSSLYN, the John Lindsley Professor of Psychology at Harvard University, has published over 250 papers on the nature of visual mental imagery and related topics. He is a cofounder and senior editor of the Journal of Cognitive Neuroscience and has served on several National Research Council committees advising the government on new technologies. His books include Image and Mind; Ghosts in the Mind's Machine; Elements of Graph Design; Wet Mind: The New Cognitive Neuroscience (with Olivier Koenig); Image and Brain: The Resolution of the Imagery Debate; and Psychology: The Brain, the Person, the World (with Robin Rosenberg).


Part II: Machina sapiens?

Jordan B. Pollack: Software Is a Cultural Solvent


I work on developing an understanding of biological complexity and how we can create it, because the limits of software engineering have been clear now for two decades. The biggest programs anyone can build are about 10 million lines of code. A real biological object—a creature, an ecosystem, a brain—is something with the same complexity as 10 billion lines of code. And how do we get there?

JORDAN B. POLLACK is a professor of computer science and complex systems at Brandeis University. His laboratory's work on AI, artificial life, neural networks, evolution, dynamical systems, games, robotics, machine learning, and educational technology has been reported on by the New York Times, Time, Science, NPR, and other media sources worldwide. Pollack is a prolific inventor, advises several start-up companies, and in his spare time runs Thinmail, which makes software to enhance e-mail and wireless telephone communications.

David Gelernter: The Second Coming: A Manifesto

The theme of the Second Age, now approaching, is that computing transcends computers. Information will travel through a sea of anonymous, interchangeable computers like a breeze through tall grass. A desktop computer will be a scooped-out hole in the beach, where information from the cybersphere wells up like seawater.

DAVID GELERNTER, a professor of computer science at Yale University and chief scientist at Mirror Worlds Technologies, is a leading figure in the third generation of artificial intelligence researchers and the inventor of a programming language called Linda, which made it possible to link computers to work on a single problem. He has since emerged as one of the seminal thinkers in the field known as parallel, or distributed, computing. His books include Mirror Worlds; The Muse in the Machine; 1939: The Lost World of the Fair; and Judaism Beyond Words.

Rodney Brooks: Making Living Systems

My midlife research crisis has been to move away from looking at humanoid robots and toward looking at the very simple question of what makes something alive—what the organizing principles are that go on inside living systems. In my lab at MIT, we're trying to build robots that have properties of living systems that robots haven't had before.

RODNEY BROOKS is the director of the MIT Artificial Intelligence Laboratory and Fujitsu Professor of Computer Science at MIT. He is also chairman and chief technical officer of iRobot, a robotics company. Dr. Brooks appeared as one of the four principals in the 1997 Errol Morris movie Fast, Cheap & Out of Control (named after one of Brooks's papers in the Journal of the British Interplanetary Society). He is the author of Flesh and Machines and Cambrian Intelligence: The Early History of the New A.I.

Hans Moravec: Making Minds

Perhaps programs that implement humanlike intelligence in a highly abstract way are possible on existing computers, as AI traditionalists imagine. Perhaps, as they also imagine, devising such programs requires lifetimes of work by world-class geniuses. But it may not be so easy.

HANS MORAVEC is a principal research scientist in the Robotics Institute of Carnegie Mellon University and the author of Mind Children: The Future of Robot and Human Intelligence and Robot: Mere Machine to Transcendent Mind.

David Deutsch: Quantum Computation

For me, the main application of the theory [of quantum computation] is to change our sense of the nature of reality. Regardless of its practical applications in the distant future, the really important thing is the philosophical implications—epistemological and metaphysical—and the implications for theoretical physics itself. One of the most important implications is one that we get before we even build the first qubit [quantum bit]. The very structure of the theory forces upon us a view of physical reality as a multiverse.

DAVID DEUTSCH's papers on quantum computation laid the foundations for that field, breaking new ground both in physics and the theory of computation and triggering an explosion of research efforts worldwide. His work revealed the importance of quantum effects in the physics of time travel, and he is the most prominent contemporary researcher in the quantum theory of parallel universes. In 1998, he was awarded the Paul Dirac Prize by Britain's Institute of Physics "for pioneering work in quantum computation leading to the concept of a quantum computer and for contributing to the understanding of how such devices might be constructed from quantum logic gates in quantum networks." He is a founding member of the Centre for Quantum Computation at the Clarendon Laboratory, University of Oxford, and the author of The Fabric of Reality.

Marvin Minsky: What Comes After Minds?

Tens of thousands of researchers today, in the field called artificial intelligence, are striving to endow machines with humanlike abilities. They've developed programs that outperform people in many specialized domains. Some solve hard mathematical problems or skillfully pilot ships and planes. Others can recognize voices and faces or objects on assembly lines. But none of them yet can dress themselves, or understand the sorts of things that young children can. Why don't any computers yet have what we call everyday, commonsense knowledge or do the sorts of reasoning that we regard as obvious?

MARVIN MINSKY is Toshiba Professor of Media Arts and Sciences and professor of electrical engineering and computer science at the Massachusetts Institute of Technology. His research has led to both theoretical and practical advances in mathematics, computer science, physics, psychology, and artificial intelligence, with notable contributions in the domains of computational semantics and knowledge representation, machine perception and learning, and theories of human problem solving. Minsky is also the inventor of the popular confocal scanning microscope, which revolutionized our ability to see dense microscopic structures. He is the author of The Society of Mind and the forthcoming book The Emotion Machine.

Ray Kurzweil: The Singularity

We are entering a new era. I call it the Singularity. It's a merger between human intelligence and machine intelligence which is going to create something bigger than itself. It's the cutting edge of evolution on our planet. One can make a strong case that it's actually the cutting edge of the evolution of intelligence in general, because there's no indication that it has occurred anywhere else. To me that is what human civilization is all about. It is part of our destiny, and part of the destiny of evolution, to continue to progress ever faster and to grow the power of intelligence exponentially.

RAY KURZWEIL, an inventor and entrepreneur, has been pushing the technological envelope for years in his field of pattern recognition. He was the principal developer of the first omni-font optical character recognition machine, the first print-to-speech reading machine for the blind, the first CCD flat-bed scanner, the first text-to-speech synthesizer, the first music synthesizer capable of recreating the grand piano and other orchestral instruments, and the first commercially marketed large-vocabulary speech recognition system. In 1999 he received the National Medal of Technology from President Clinton. He has received scores of other national and international awards, eleven honorary doctorates, and honors from two other U.S. presidents. In 2002 he was inducted into the U.S. Patent Office's National Inventors Hall of Fame. He is the author of The Age of Intelligent Machines and The Age of Spiritual Machines: When Computers Exceed Human Intelligence.

Jaron Lanier: One Half of a Manifesto

We imagine "pure" cybernetic systems, but we can prove only that we know how to build fairly dysfunctional ones. We kid ourselves when we think we understand something, even a computer, merely because we can model or digitize it.

JARON LANIER, a computer scientist and musician, is lead scientist of the National Tele-Immersion Initiative, a coalition of research universities studying advanced applications for Internet2. Best known for his work in virtual reality, a term he coined, Lanier helped develop the first implementations of multiperson virtual worlds using head-mounted displays. He also codeveloped the first implementations of virtual reality in surgical simulation, vehicle design prototyping, and various other applications. As a musician, he writes for orchestra, plays a very large number of instruments from around the world, and has performed with a wide variety of collaborators, from Philip Glass to George Clinton.

Part III: And Beyond...

Seth Lloyd: How Fast, How Small, How Powerful? Moore's Law and the Ultimate Laptop

Now we have created devices called computers, which can register and process huge amounts of information—a significant fraction of the amount of information that human beings themselves, as a species, can process. When I think of all the information being processed in that way, I see our species at a very interesting point in its history, which is the point at which our artifacts will soon be processing more information than we physically will be able to process.

SETH LLOYD is an associate professor of mechanical engineering at MIT and a principal investigator at MIT's Research Laboratory of Electronics. He works on problems having to do with information and complex systems, from the very small (How do atoms process information? How can you make them compute?) to the very large (How does society process information? And how can we understand society in terms of its ability to process information?).

Alan Guth: A Golden Age of Cosmology

The classical theory was never really a theory of a bang; it was a theory about the aftermath of a bang. It started with all of the matter in the universe already in place, already undergoing rapid expansion, already incredibly hot. There was no explanation of how the universe got that way. Inflation is an attempt to answer the question of what made the universe bang, and now it looks as though it's almost certainly the right answer.

ALAN GUTH, the father of the inflationary theory of the universe, is the Victor F. Weisskopf Professor of Physics at MIT. His research interests are in the area of elementary particle theory and the application of particle theory to the early universe. In 2002 he was awarded the Dirac Medal of the International Centre for Theoretical Physics, along with Paul Steinhardt and Andrei Linde, for the development of the concept of inflation in cosmology. He is the author of The Inflationary Universe: The Quest for a New Theory of Cosmic Origins.

Paul Steinhardt: The Cyclic Universe

[F]or the past year I've been involved in the development of an alternative theory that turns cosmic history topsy-turvy. In it, all the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales—and yet this model seems capable of reproducing all of the successful predictions of the consensus picture with the same exquisite detail.

PAUL STEINHARDT is the Albert Einstein Professor in Science and a professor in both the Department of Physics and the Department of Astrophysical Sciences at Princeton University. He is one of the leading theorists responsible for inflationary theory, having been involved in constructing the first workable model of inflation and the theory of how inflation could produce seeds for galaxy formation. He was also among the first to show evidence for dark energy and cosmic acceleration, introducing the term "quintessence" to refer to dynamical forms of dark energy. In 2002 he was awarded the Dirac Medal of the International Centre for Theoretical Physics, along with Alan Guth and Andrei Linde, for the development of the concept of inflation in cosmology.

Lisa Randall: Theories of the Brane

Additional spatial dimensions may seem like a wild and crazy idea at first, but there are powerful reasons to believe that there really are extra dimensions of space. One reason resides in string theory, in which it is postulated that the particles are not themselves fundamental but are oscillation modes of a fundamental string.

LISA RANDALL is a professor of physics at Harvard University, where she also earned her PhD (1987). She was a President's Fellow at the University of California at Berkeley, a postdoctoral fellow at Lawrence Berkeley Laboratory, and a junior fellow at Harvard before joining the MIT faculty in 1991. Between 1998 and 2000, she had a joint appointment at Princeton and MIT as a full professor, and she moved to Harvard as a full professor in 2001. Her research in theoretical high energy physics is primarily related to exploring the physics underlying the standard model of particle physics. This has involved studies of supersymmetry and, most recently, extra dimensions of space.

Lee Smolin: Loop Quantum Gravity

It's only since the middle 1980s that real progress began to be made on unifying relativity and quantum theory. The turning point was the invention of not one but two approaches: loop quantum gravity and string theory. Since then, we have been making steady progress on both of these approaches. In each case, we are able to do calculations that predict surprising new phenomena. Still, we are not done. Neither is yet in final form; there are still things to understand. But the really important news is that there is now a real chance of doing experiments that will test the new predictions of these theories. This is important, because we're in the uncomfortable situation of having two well-developed candidates for the quantum theory of gravity. We need to reduce these to one theory. We can do this either by finding that one is wrong and the other right, or by finding that the two theories can themselves be unified.

LEE SMOLIN, a theoretical physicist, is concerned with quantum gravity, "the name we give to the theory that unifies all the physics now under construction." More specifically, he is a co-inventor of an approach called loop quantum gravity. In 2001, he became a founding member and research physicist of the Perimeter Institute for Theoretical Physics, in Waterloo, Ontario. Smolin is the author of The Life of the Cosmos and Three Roads to Quantum Gravity.

Martin Rees: A Look Ahead

The challenge is to understand how complexity emerges. This is just as fundamental as the challenge to come up with the so-called theory of everything—and it is independent of it. The theoretical physicist Steven Weinberg says that if you go on asking "Why? Why? Why?" you get back to a question in particle physics or cosmology. That's true to a degree, but only in a limited sense.

SIR MARTIN REES is Royal Society Professor at Cambridge University, a fellow of King's College, and the U.K.'s Astronomer Royal. He was previously Plumian Professor of Astronomy and Experimental Philosophy at Cambridge, having been elected to this chair at the age of thirty, succeeding Fred Hoyle. He is the author of several books, including Gravity's Fatal Attraction (with Mitchell Begelman); Before the Beginning; Just Six Numbers; Our Cosmic Habitat; and Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century—On Earth and Beyond (forthcoming, March 2003).

Epilogue: Responses to "The New Humanists"

Nicholas Humphrey, Jaron Lanier, Joseph LeDoux, John Horgan, Timothy Taylor, Carlo Rovelli, Steven Johnson, Lee Smolin, Douglas Rushkoff, Piet Hut, Marc D. Hauser, Mihaly Csikszentmihalyi, Denis Dutton, Daniel C. Dennett, Howard Rheingold, Chris Anderson

Suggested Reading


THE NEW HUMANISTS

By John Brockman

In 1991, in an essay entitled "The Emerging Third Culture," I put forward the following argument:

In the past few years, the playing field of American intellectual life has shifted, and the traditional intellectual has become increasingly marginalized. A 1950s education in Freud, Marx, and modernism is not a sufficient qualification for a thinking person today. Indeed, the traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time. Their culture, which dismisses science, is often nonempirical. It uses its own jargon and washes its own laundry. It is chiefly characterized by comment on comments, the swelling spiral of commentary eventually reaching the point where the real world gets lost.

Twelve years later, that fossil culture has been essentially replaced by the "third culture" of the essay's title—a reference to C. P. Snow's celebrated division of the thinking world into two cultures, that of the literary intellectual and that of the scientist. This new culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, have taken the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.

The scientists of the third culture share their work and ideas not just with each other but with a newly educated public, through their books. Focusing on the real world, they have led us into one of the most dazzling periods of intellectual activity in human history. The achievements of the third culture are not the marginal disputes of a quarrelsome mandarin class; they affect the lives of everybody on the planet. The emergence of this new culture is evidence of a great intellectual hunger, a desire for the new and important ideas that drive our times: revolutionary developments in molecular biology, genetic engineering, nanotechnology, artificial intelligence, artificial life, chaos theory, massive parallelism, neural nets, the inflationary universe, fractals, complex adaptive systems, linguistics, superstrings, biodiversity, the human genome, expert systems, punctuated equilibrium, cellular automata, fuzzy logic, virtual reality, cyberspace, and teraflop machines. Among others.

Humanism and the Intellectual Whole

Around the fifteenth century, the word "humanism" was tied in with the idea of one intellectual whole. A Florentine nobleman knew that to read Dante but ignore science was ridiculous. Leonardo was a great artist, a great scientist, a great technologist. Michelangelo was an even greater artist and engineer. These men were intellectually holistic giants. To them, the idea of embracing humanism while remaining ignorant of the latest scientific and technological achievements would have been incomprehensible. The time has come to reestablish that holistic definition.

In the twentieth century, a period of great scientific advancement, instead of having science and technology at the center of the intellectual world—of having a unity in which scholarship included science and technology along with literature and art—the official culture kicked them out. Traditional humanities scholars looked at science and technology as some sort of technical special product. Elite universities nudged science out of the liberal arts undergraduate curriculum—and out of the minds of many young people, who, as the new academic establishment, so marginalized themselves that they are no longer within shouting distance of the action.

In too much of academia, intellectual debate tends to center on such matters as who was or was not a Stalinist in 1937, or what the sleeping arrangements were for guests at a Bloomsbury weekend in the early part of the twentieth century. This is not to suggest that studying history is a waste of time: History illuminates our origins and keeps us from reinventing the wheel. But the question arises: History of what? Do we want the center of culture to be based on a closed system, a process of text in/text out, and no empirical contact with the real world? One can only marvel at, for example, art critics who know nothing about visual perception; "social constructionist" literary critics uninterested in the human universals documented by anthropologists; opponents of genetically modified foods, additives, and pesticide residues who are ignorant of genetics and evolutionary biology.

Cultural Pessimism vs. Scientific Optimism

A fundamental distinction exists between the literature of science and that of disciplines whose subjects are self-referential and most often concerned with the exegesis of earlier thinkers. Unlike those disciplines in which there is no expectation of systematic progress and in which one reflects on and recycles the ideas of others, science, on its frontiers, poses more and better questions, better put. They are questions phrased to elicit answers; science finds the answers and moves on. Meanwhile the traditional humanities establishment continues its exhaustive insular hermeneutics, indulging itself in cultural pessimism, clinging to its fashionably glum outlook on world events.

"We live in an era in which pessimism has become the norm," writes Arthur Herman, in The Idea of Decline in Western History. Herman, who coordinates the Western Civilization Program at the Smithsonian, argues that the decline of the West, with its view of our "sick society," has become the dominant theme in intellectual discourse, to the point where the very idea of civilization has changed. He continues:

This new order might take the shape of the Unabomber's radical environmental utopia. It might also be Nietzsche's Overman, or Hitler's Aryan National Socialism, or Marcuse's utopian union of technology and Eros, or Frantz Fanon's revolutionary fellahin. Its carriers might be the ecologist's "friends of the earth," or the multiculturalist's "persons of color," or the radical feminist's New Amazons, or Robert Bly's New Men. The particular shape of the new order will vary according to taste; however, its most important virtue will be its totally non-, or even anti-Western character. In the end, what matters to the cultural pessimist is less what is going to be created than what is going to be destroyed—namely, our "sick" modern society.... [T]he sowing of despair and self-doubt has become so pervasive that we accept it as a normal intellectual stance—even when it is directly contradicted by our own reality.

Key to this cultural pessimism is a belief in the myth of the Noble Savage—that before we had science and technology, people lived in ecological harmony and bliss. Quite the opposite is the case. That the greatest change continues to be the rate of change must be hard to deal with, if you're still looking at the world through the eyes of Spengler and Nietzsche. In their almost religious devotion to a pessimistic worldview, the academic humanists have created a culture of previous "isms" that turn on themselves and endlessly cycle. How many times have you seen the name of an academic humanist icon in a newspaper or magazine article and immediately stopped reading? You know what's coming. Why waste the time?

As a counternarrative to this cultural pessimism, consider the twofold optimism of science.

First, the more science you do, the more there is to do. Scientists are constantly acquiring and processing new information. This is the reality of Moore's Law—just as there has been a doubling of computer processing power every eighteen months for the past twenty years, so too do scientists acquire information exponentially. They can't help but be optimistic.
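As a minimal sketch of the arithmetic (assuming nothing beyond the essay's two figures, the eighteen-month doubling time and the twenty-year span), a doubling every eighteen months sustained for twenty years compounds to roughly ten-thousandfold growth:

    # Compound growth implied by the essay's figures:
    # one doubling every 18 months, sustained over 20 years.
    months_per_doubling = 18
    years = 20
    doublings = years * 12 / months_per_doubling  # about 13.3 doublings
    growth = 2 ** doublings                       # about 10,000-fold
    print(f"{doublings:.1f} doublings -> roughly {growth:,.0f}x growth")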

And second, much of the new information is either good news or news that can be made good thanks to ever deepening knowledge and ever more efficient and powerful tools and techniques.

Scientists debate continually, and reality is the check. They may have egos as large as those possessed by the iconic figures of the academic humanities, but they handle their hubris in a very different way. They can be moved by arguments, because they work in an empirical world of facts, a world based on reality. There are no fixed, unalterable positions. They are both the creators and the critics of their shared enterprise. Ideas come from them, and they also criticize one another's ideas. Through this process of creativity, criticism, and debate, they decide which ideas get weeded out and which become part of the consensus that leads to the next level of discovery. Unlike the humanities academicians, who talk about each other, scientists talk about the universe. Moreover, there's not much difference between the style of thinking of a cosmologist trying to understand the physical world by studying the origins of atoms, stars, and galaxies and that of an evolutionary biologist trying to understand the emergence of complex systems from simple beginnings or trying to see patterns in nature. Both exercises entail the same mixture of observation, theoretical modeling, and computer simulation found in most other scientific fields. The worlds of science are convergent. The frame of reference is shared across disciplines.

Science is still near the beginning. As the frontiers advance, the horizon gets wider and comes into focus. And these advances have changed the way we see our place in nature. The idea that we are an integral part of this universe—a universe governed by physical and mathematical laws that our brains are attuned to understand—causes us to see our place in the unfolding of natural history differently. We have come to realize, through developments in astronomy and cosmology, that we are still quite near the beginning. The history of creation has been enormously expanded—from 6,000 years back to the 13.7 billion years of Big Bang cosmology. But the future has expanded even more—perhaps to infinity. In the seventeenth century, people not only believed in that constricted past but thought that history was near its end: The apocalypse was coming. A realization that time may well be endless leads us to a new view of the human species—as not being in any sense the culmination but perhaps a fairly early stage of the process of evolution. We arrive at this concept through detailed observation and analysis, through science-based thinking; it allows us to see life playing an ever greater role in the future of the universe.

There are encouraging signs that the third culture now includes scholars in the humanities who think the way scientists do. Like their colleagues in the sciences, they believe there is a real world and their job is to understand it and explain it. They test their ideas in terms of logical coherence, explanatory power, conformity with empirical facts. They do not defer to intellectual authorities: Anyone's ideas can be challenged, and understanding and knowledge accumulate through such challenges. They are not reducing the humanities to biological and physical principles, but they do believe that art, literature, history, politics—a whole panoply of humanist concerns—need to take the sciences into account.

Connections do exist: Our arts, our philosophies, our literature are the product of human minds interacting with one another, and the human mind is a product of the human brain, which is organized in part by the human genome and evolved by the physical processes of evolution. Like scientists, the science-based humanities scholars are intellectually eclectic, seeking ideas from a variety of sources and adopting the ones that prove their worth, rather than working within "systems" or "schools." As such, they are not Marxist scholars or Freudian scholars or Catholic scholars. They think like scientists, know science, and easily communicate with scientists; their principal difference from scientists is in the subject matter they write about, not their intellectual style. Science-based thinking among enlightened humanities scholars is now part of public culture.

In short, something radically new is in the air: new ways of understanding physical systems, new ways of thinking about thinking that call into question many of our basic assumptions. A realistic biology of the mind, advances in physics, information technology, genetics, neurobiology, engineering, the chemistry of materials—all are challenging basic assumptions of who and what we are, of what it means to be human. The arts and the sciences are again joining together as one culture, the third culture. Those involved in this effort—on either side of C. P. Snow's old divide—are at the center of today's intellectual action. They are the new humanists.


Responses to "The New Humanists" from Nicholas Humphrey, Jaron Lanier, Joseph LeDoux, John Horgan, Timothy Taylor, Carlo Rovelli, Steven Johnson, Lee Smolin, Douglas Rushkoff, Piet Hut, Marc D. Hauser, Mihaly Csikszentmihalyi, Denis Dutton, Daniel C. Dennett, Howard Rheingold, Chris Anderson



From: Nicholas Humphrey

I have major problems with the essay. In particular, I don't find the identification of science and optimism at all convincing—on either of your two counts.

1. I don't think scientists do (or should) expect an exponential Moore's-Law-like expansion of interesting problems. In fact, just the opposite: I think we are—or soon will be—exhausting the mine of deep and interesting problems. We'll have a "theory of everything," we'll have proved Riemann's hypothesis, we'll have got to the bottom of consciousness, and so on. This is indeed the Golden Age of Science. But it has to be self-closing, at least as far as the big, the hard, problems are concerned. I wrote about just this issue in my essay "Scientific Shakespeare." The point I made there is that the "arts" continue to have opportunities that the "sciences" soon will not have. I think we scientists had better be prepared for—and even humble in the face of—the next phase of human culture, which may well revert to the traditional province of the arts.

2. I don't think scientific discoveries can be counted on, necessarily, to bring about a net increase in human happiness—either through what they reveal about the course of nature or through the tools they potentially give us with which to intervene in it. Many scientists, from Bertrand Russell to Jacques Monod to Martin Rees, have been and are deeply pessimistic about what science tells us about the way the world is headed. And, as a separate issue, many still have anxieties about the use to which scientific discoveries will be put—from weapons of mass destruction, to eugenics, to thought control.

This isn't to question your main point that today science is the only game in town. I do of course agree there's more hope in science than there is in anything else. But the problem, as I see it, for this essay is that you already made this point years ago as convincingly as could be, in your introduction to The Third Culture, and it really doesn't need making again. In fact, if I were you, I would now adopt a totally different tack. Instead of repeating your attack on the Bloomsbury-obsessed intellectuals of the second half of the twentieth century, I think you should be drawing attention to the way they have already become marginalized—partly through your own efforts. The evidence for the triumph of science in the intellectual culture is all around. In literature (e.g., Ian McEwan's Enduring Love), in film (e.g., A Beautiful Mind), in theatre (e.g., Michael Frayn's Copenhagen), and so on. What we're seeing is an astonishing turnaround from the old values to the new. Your essay, as it is, is curiously paranoid. You no longer need to be! You've largely won. But the next task is to provide a sober assessment of the nature of the victory.

NICHOLAS HUMPHREY is a theoretical psychologist at LSE and The New School and author of The Mind Made Flesh.



From: Jaron Lanier

Bravo, John! You are playing a vital role in moving the sciences beyond a defensive posture in response to turf attacks from the "postmodernists" and other leeches on the academies. You celebrate science and technology as our most pragmatic expressions of optimism.

I wonder, though, if it's enough to merely point out how hopelessly lost those encrusted arts and humanities intellectuals have become in their petty arms race of cynicism. If we scientists and technologists are to be the new humanists, we must recognize that there are questions that must be addressed by any thinking person which do not lie within our established methods and dialogs. Indeed your edge.org Web site has provided one of the few forums where scientists can exchange ideas about some of these questions.

Maybe there's a community of scientists who have become "the new humanists," but that isn't good enough. We technical people must either learn to talk about certain things with a greater sensitivity, for a larger popular audience with needs that might seem extraordinary to many of us, or we will continue to cede much influence by default to whoever else is more willing to rise to the occasion.

While "postmodern" academics and "Second Culture" celebrity figures are perhaps the most insufferable enemies of science, they are certainly not the most dangerous. Even as we are beginning to peer at biology's deepest foundations for the first time, we find ourselves in a situation in which vast portions of the educated population have turned against the project of science in favor of pop alternatives usually billed as being more "spiritual." These range from the merely silly (such as astrology) to the archaic, mean, and often violent religious orthodoxies that seem to be gaining power within many of the world's religious traditions. What is it that drives vast numbers of people into superstition and the inevitable exploitation that follows from it? What is it, for instance, that has made medicine informed by science (often derided as being merely "western" or "allopathic") so unattractive to so many smart people, when it is clear that it has been an overwhelming success? Perhaps the science culture elite has not sufficiently appreciated the task it must take on if it is to be its own advocate. Postmodern critics of science are mostly merely ridiculous, while the mainstream enemies of science are something much worse: They are winning.

What does that word "spirituality" mean? Let me propose a definition: One's spirituality is the range of one's emotional relationships to those questions that cannot be answered. Scientists and technologists naturally gravitate away from such questions. "What happens when you die?" for instance. About what we cannot speak we remain silent. We have made peace with the big questions every child asks by finding the limits to our abilities to answer them. Many of us have grown comfortable with a few familiar and eternal splotches of ignorance, even though they are centrally placed in our field of view, because there have been compensations for our disappointments. We're delighted that the universe can be understood so well in so many ways, and specifically that we've been able to make personal contributions to that understanding. We're often enchanted with the beauty we see in nature—beauty that's harder for nonspecialized people to appreciate. Some among us have even found faith of one sort or another, but usually only faith that's precisely coincident with those splotches of ignorance.

But what we forget is that many people, probably most, haven't had life experiences that lubricate such intellectual bargains. Most people are uncomfortable with accepting a little unfortunately placed ignorance—or uncertainty leading to rigorously bounded zones of faith—in exchange for robust specialized knowledge in other areas. There is every reason in the world to ridicule stupid elitist cultural figures who use trendy pessimism as a cover for narcissism. Yes, please, let's have fun with them. But that approach won't do much for the hugely larger number of people who suffer from sincere anxiety about the unanswerable big questions.

I'd like to focus on one particular cultural pathway that I believe is driving much of the public away from the sciences, because some members of the edge.org community are central to it. It goes like this: A scientist or technologist is sought out by the media because she is articulate about life beyond the lab. She appears on TV talking about items with human interest, using the intellectual framework of her research. Suppose she likes to think in terms of artificial intelligence, evolutionary psychology, or some of the other intellectual frameworks that refute the "specialness" of people in order to clarify investigations. An idea arising from such a framework which might serve a purpose in the lab often falls flat out in the open environment. For instance, if she's an artificial intelligence researcher, she might in passing wonder if a lonely childless couple could raise a robotic child for comfort in the future. This was an idea in a popular science fiction movie recently, but it was also espoused as a reasonable and realistic eventuality by an MIT scientist on National Public Radio.

Within the informed scientific and technological community, it's possible to have a nuanced debate about such a remark. It's possible to ask if the scale of complexity in a real child can really be approximated by a digital device anytime in the relevantly near future. One might point out that even if the hardware gets vast and fast enough, we don't seem to be able to write stable giant programs, so some unforeseen advances would at a minimum be required on the software front. But that's not what happens out in the wide world of nonscientists. "Soft," or "spiritual" people, for instance, are often disturbed and become more likely to cancel doctors' appointments in favor of aromatherapy sessions. If scientists think robots and children are the same, then a pox on them! When the artificial intelligence researcher equated, even in a very narrow sense, information systems and human beings, she inadvertently answered some of the big questions of childhood in a particular way. I fear the message ends up being heard as something like "Not only is there no soul, no afterlife, no nothing magical about you at all, but I'm an elite scientist who can see into your circuitry and make another thing like you, thus making you in a fundamental way subordinate to me."

The arts and humanities (and let's not forget the religions!) have been perpetually faced with the challenge of making simple things complicated. So there exist preposterously garbled academic books about philosophy and art. This is a little like that old trope about cargo cults. When I was trained as a composer, I was made to study ridiculously arcane academic music that only a small number of people could understand. This simulated the situation in physics, in the hopes that similar prestige, budgets, and even parking spaces on campus might be forthcoming for the most celebrated and cryptic elite. In this case, the cargo cult approach worked!

Science faces the opposite problem. Most scientists would be delighted if the inherent elitism of a hard discipline would suddenly drop away, so that there could be an army of new collaborators. Sadly, this future is not to be. Instead, we have to learn new ways to improve the interactions between the scientific community and the world at large.

This is where I think the Third Culture still needs to mature. Science must learn to be better at communicating its limits nonapologetically, as strengths. And scientists might have to learn to communicate in public about how we, too, are sometimes troubled at night by the unanswerable questions.

JARON LANIER, a computer scientist and musician, is a pioneer of virtual reality, and currently the lead scientist for the National Tele-Immersion Initiative.



From: Joseph LeDoux

It's great to seek some sort of fusion across diverse fields, but I'm concerned that things are not as black and white as you imply in the piece. There are of course some vocal "relativists" in academic circles, but I think most people who are actually making culture (artists, writers, musicians) are open to and very interested in what science has to say. Unfortunately, the same is less true of some scientists. It is shocking to see how ignorant and dismissive of the arts scientists can be. As I see it, the broader view of culture which you propose is going to require some mind expansion in the sciences as well.

JOSEPH LEDOUX is a neuroscientist at New York University and author of Synaptic Self: How Our Brains Become Who We Are.



From: John Horgan

If your essay was meant to provoke, it obviously succeeded. But it really works more as a kind of Nike ad for science than a serious analysis of science's relation to the humanities or culture as a whole. It reminds me of Wired rhetoric (pre-Nasdaq crash), or of the jacket copy for books about the Santa Fe Institute in its giddy early days. Science rules! You are brave indeed to resurrect this kind of scientistic triumphalism now that the e-business bubble has burst and the world is roiling with conflicts that science has little or no hope of illuminating, let alone ameliorating.

A few more cantankerous thoughts: You say scientists confront the "real world," as opposed to these humanist ignorami. I wish you had named names, so we could judge whether your targets match your cartoon description. But let's take Judith Butler, who does deconstruction of sexual identity and is a favorite whipping-girl of those bemoaning the decadence of the humanities. I would submit that she's far more engaged with reality—our human reality—than are string theorists or inflationary cosmologists. Certainly some science trade books—such as E.O. Wilson's latest, The Future of Life—address issues that should concern any thoughtful person. But tell me, John, is there any science book as important for someone today to read as, say, Samuel Huntington's Clash of Civilizations?

And lots of popular trade books in science are peddling sci-fi escapism, geared especially toward socially awkward, adolescent males. What does Lee Smolin's evolutionary cosmology have to do with the real world, honestly? Or Ray Kurzweil's fantasies about what it would be like to be transformed into pure software? I'm a science geek, so I find this sort of stuff entertaining when well done, but I certainly can't blame others who have no taste for it. Let's face it, trade science books are best understood as a minuscule subniche of the entertainment industry. If people would rather read about Virginia Woolf's sex life—or watch Friends, for that matter—than wrestle with A Brief History of Time or The Origins of Order, I don't think they should have to feel like second-class citizens.

I agree with you that we would all be better off if more people were scientifically literate. But to me "scientific literacy" does not mean getting all excited over the latest scientific "breakthrough," whether brane theory or monoclonal antibodies or nanotech. It means knowing enough to distinguish genuine advances from the hype surrounding Prozac or evolutionary psychology or Star Wars or gene therapy.

Science has enriched modern life in countless ways, both materially and intellectually. But our infatuation with scientific and technological progress for their own sake has also had adverse consequences: pollution, weapons of mass destruction—you know the old bugaboos. And great harm was committed in the last century because people got carried away by such pseudoscientific fads as Marxism, social Darwinism, eugenics, and psychopharmacology. History teaches us that science is limited in what it can do for us. This is realism, not pessimism. And the last thing we need nowadays is another ideology or faith.

JOHN HORGAN is a freelance writer and the author of Rational Mysticism: Dispatches from the Border Between Science and Spirituality and The Undiscovered Mind.



From: Timothy Taylor

Certainly I recognize some of what John [Brockman] diagnoses as frustrating (and worse) in the social sciences—"text-in, text-out" bubbles of inconsequential, content-free activity only blasphemously given the name of scholarship. But we must also recognize that there has been an extraordinary—and often extraordinarily arrogant—underestimation of the complexity of the humanities by some hard scientists who extend themselves across the arts-sciences divide. Personally, I have no doubt that to do moral philosophy well, for instance, requires a longer intellectual training than is typically needed to make advances in, say, plasma physics or genetics. But I also know that some physicists and geneticists are prone not to recognize this. I do not mean to say that what they do is simpleminded (emphatically it is not), just that some (perhaps much) of what they do is epistemologically more straightforward.

The dangers of scientists attempting to become the new humanists are best illustrated by specific examples. For instance, Richard Dawkins's idea of "memes"—proposed cultural counterparts to genes—has not been adopted in archaeology, precisely the discipline where it should have succeeded had it been useful. It is unsurprising (and no real discredit to him) that a top-notch evolutionary biologist does not cut the mustard when it comes to theorizing cultural transmission: After all, Richard Dawkins may have no more training in cultural theory than I have in evolutionary biology. A problem arises, however, if people who may know no better think that memes must be a good idea, and interpret the paucity of critical discussion of them as evidence of the acceptance of the concept.

Similar kinds of concerns arise in relation to the psychologist Steven Pinker's formulation of a "language instinct." This is not a bad idea in theory, but it is elaborated with—apparently—total disregard for an extensive body of work by Russian, French, and German philosophical linguists which has reached very different conclusions. That is to say, whether or not one accepts Pinker's linguistic judgments, his work has come out from a cognitive psychology background into the glare of public attention (and has been widely accepted to be true by the media) without engaging with those humanistic debates of most central relevance to the plausibility or otherwise of his most dramatic claims (as expressed by followers of L. S. Vygotsky, to take one example).

One has to confront the tricky problem that popular science often either preaches to the converted or, when it strays into more "humanistic" domains, makes an unwitting ass of itself. The United States has an excellent tradition of scientists writing for a broader audience, but a scarily growing third of the national population shares a metaphysics which cannot accommodate Darwinian evolution, let alone understand what it entails. The rise of creationism in the United States is an unfolding intellectual tragedy that will be turned around only when there is greater respect—among scientists, in particular—for the sophistication and unpredictability of human social and cultural formations. This will require a renewed humility in addressing the true complexities of our behavioral wellsprings. The prospect of a great nation intellectually split between religious fundamentalism and an equally assertive, dogmatic, and unreflectively narrow scientism is not pretty.

A real victory for science would consist not in sweeping away other aspects of existence, such as religion (not that it has any hope of doing so), but in respectfully deepening understanding of what it is to live and die as a human being and observing the universe from that perspective. Many dimensions of nonrational, symbolic, or ritual behaviors can, of course, be partially or wholly analyzed within a scientific framework, but other aspects will never be amenable to such analysis. There are places where experiment and verification cannot go, and we have to observe, interpret, reflect, and explain perceived phenomena in a qualitatively different way.

TIMOTHY TAYLOR is an archaeologist at the University of Bradford, UK, and author of The Buried Soul: How We Invented Death and The Prehistory of Sex.



From: Carlo Rovelli

We are certainly far away from obscurantism, but there are also signs of reaction against scientific thinking, and John's optimistic essay is a warning. There are serious signs of irrationalism all over the planet and also in the words of our top leaders. Our guarantee against obscurantism is not democracy alone: Peoples have often voted into power forces that openly adhered to irrationalism, such as the Nazis and some current governments. Our guarantee against obscurantism is the widespread recognition of the vital and clear force of rational scientific thinking. When I talk with cultivated people who happily claim they know nothing about math and science, I get even more scared than when powerful people say they do not read books.

Scientific thinking is at the core of our knowledge-based civilization. We can add to this our thirst for justice, our faith in dreams, our deep awareness of the emptiness of life, our faith in humanity as a value, our desire for beauty, our sense of mystery, and all else that the wonders of the human adventure have given us. None of this is challenged by science, or challenges science. To the contrary. The scientific quest for knowledge is deeply emotional in its ways and motivations. If we resist it, we resist reality. Reality, however complex and unknowable in its depths, is there, and fights back.

CARLO ROVELLI, a specialist in quantum gravity, is a theoretical physicist at the Centre de Physique Théorique, in Marseille.



From: Steven Johnson

I think Nicholas Humphrey may have a point when he says that "you've already won." One brief piece of anecdotal evidence: I attended a dinner party last weekend that was populated entirely by people who had spent their undergraduate—and in some cases graduate—years in the trenches of postmodernist theory. These were all people who, like me, had sworn allegiance to Baudrillard, Derrida, Foucault, Jameson, et al. in their early twenties. (A number were semiotics majors with me at Brown.) Any science courses we'd taken in those days we took in order to archly deconstruct the underlying "paradigm of research," or expose one of any number of "centrisms" lurking behind the scientific text and its illusory claims of empirical truth.

What struck me over dinner, though, was how readily the conversation drifted—without my pushing it along—to precisely the realm you describe in "The New Humanists," largely focused around brain issues. None of these people had returned to grad school in neuroscience, mind you, but they were all clearly versed in, and fascinated by, the latest news from the brain sciences. They talked casually about neurotransmitters and "other-mindedness"; they leaned readily on evolutionary psychological explanations for the behavior they were discussing; they talked about the role of the "god spot" in the evolution of religious belief. There wasn't a scare quote or a relativist aside in the entire conversation. I couldn't help thinking that if any one of them had made a comparable argument ten or fifteen years ago, he or she would have been heckled out of the room.

I don't think my dinner survey was anomalous. It seems to me that the most interesting work right now is work that tries to bridge the two worlds, that looks for connections rather than divisions. I think that's what E. O. Wilson was proposing in Consilience: not the annexing of the humanities by the sciences but a kind of conceptual bridge-building. In fact, I would say that the most consilient work today has come from folks trained as cultural critics—books like Michael Pollan's The Botany of Desire, with its mix of Nietzsche and Richard Dawkins, or Manuel De Landa's A Thousand Years of Nonlinear History, with its unique combination of Deleuze and chaos theory.

I suspect there are other bridges to build in the coming years, but the traffic along those bridges will have to be two-way for the interaction to pay off. Obviously, the postmodernists have made a lot of noise trashing the empirical claims of the sciences, but if you tune out much of that bombast, there's quite a bit in the structuralist and poststructuralist tradition that dovetails with new developments in the sciences. To give just a few examples: The underlying premise of deconstruction—that our systems of thought are fundamentally shaped and limited by the structure of language—resonates with many chapters of a book like The Language Instinct. (I tried to persuade Pinker of this when I interviewed him years ago for Feed.) The postmodern assumption of a "constructed reality" goes nicely with the idea of consciousness as a kind of artificial theater and not a direct apprehension of things in themselves. Semiotics and structuralism both began with Lévi-Strauss's research into universal mythology, which obviously has deep connections to the project of evolutionary psychology.

STEVEN JOHNSON is cofounder of the pioneering Web magazine Feed. He is the author of Emergence: The Connected Lives of Ants, Brains, Cities, and Software; and Interface Culture.



From: Lee Smolin

What Third Culture and New Humanist intellectuals have to offer society is far more than just being in touch with science. They represent the vanguard of a broad intellectual movement that already has representatives in diverse fields of the sciences, social sciences, and humanities. I think the deepest characterization of this new movement is epistemological, because it is about the kinds of questions people are asking and the kinds of answers they are searching for. It is indicated by the emergence of new styles of explanation that reject the notion of an eternal "ultimate reality," perceived by God alone, in favor of more rational and accessible tenets. The old-style explanation relies on the hypothesis that behind ever changing appearances there is an ultimate reality that is eternal and unchanging. This eternal reality may be God, it may be principles of justice or aesthetics, or it may be the ultimate laws of nature. The new style of explanation rejects such ideas as being, in the end, little different from mysticism, since the alleged ultimate reality is unknown and unknowable. As pointed out long ago by C. S. Peirce, any explanation that rests on an appeal to the existence of ultimate and unchanging eternal laws of nature is fundamentally irrational, because there can be no further explanation of why those laws of nature hold rather than some others. Such an explanation is logically no different from an appeal to "the mind of God."

The new style of explanation rejects the Platonic myth of an eternal realm of true ideas in favor of the idea that knowledge has no meaning apart from what human beings, as part of the natural world, can perceive and agree on. It also rejects the transcendent fantasies according to which scientists used to picture themselves outside reality and outside any society, in the place of God, surveying all that exists without being a part of it. Instead, many scientists are now happy to see themselves as individuals working within communities of living beings and seeking knowledge by sharing their observations and debating their ideas.

At the same time, this new style of explanation holds that there is a truth to things and that human beings are capable of finding it. It just rejects (as irrational mythology) the idea that truth is possible because of the existence of an imagined Platonic realm of eternal absolute ideas. Instead, this new movement grounds the notion and possibility of truth on the human ability to argue rationally and in good faith from shared evidence and, by doing so, to arrive at agreement. To accept this is to accept also the notion that rationality is situational and pluralistic. By accepting the idea that there will be things that appear different from different viewpoints, we strengthen the importance of those things we can agree on.

A contributing factor to this shift is that our cosmological picture has changed drastically, in a way that makes the search for an eternal "ultimate reality" incoherent. Relativity and quantum theory tell us that science must be based on relational quantities, which have to do with relationships between things in the universe, and that no appeal to anything transcendent or eternal or otherwise outside the universe is possible, or even meaningful. Observations tell us that we live in a young universe, one that was born a relatively short time ago and has been evolving ever since. It is far from clear what eternal laws of physics can mean when the universe itself is only a few billion years old.

An aspect of this shift is the attitude toward reductionism. Everyone can agree that when something is made of parts it is useful to explain it in terms of its parts. That's fine, but the problem is that there is a natural limitation to how far such a reductionist explanation can be pushed. When it succeeds, reductionism must lead to an explanation in terms of some set of elementary particles and forces. But then there is a problem, because if the elementary particles are truly fundamental, their properties cannot be explained by a further appeal to reductionism. So the question "Why these fundamental particles and laws and not others?" must be answered in some way that is not itself reductionist. If we truly want a rational understanding of why things are as they are and not otherwise, we must follow the path of reductionism until we find out what the fundamental parts are. After that, we must find new, nonreductionist modes of explanation.

Once a science reaches the point where naïve reductionism can take us no further, there are three moves we can make. The first is to deny the existence of a crisis with reductionism and continue in a hopeless search for the eternal ultimate reality. Unfortunately, this characterizes some (but by no means all) recent work in fundamental physics. Physicists who align themselves with the "many worlds" interpretation of quantum mechanics or "eternal inflation," or who believe that theoretical physics is about to end with the discovery of "M theory," are operating from what may be called a nostalgia for the absolute. There are similar nostalgic movements in other fields.

The second response is what can be called the postmodernist move. This begins by denying the use of reductionism and the importance of rational understandings altogether. Truth is held to be nothing but a social construction, and a thoroughgoing relativism is embraced. This is even worse than the nostalgic response, because it undermines the very reasons for the crisis and leaves us suspended in an impotent haze, from within which we cannot even remember how useful rational thought has been in improving our world politically, scientifically, and humanly.

There is, however, a third, progressive response to the crisis in the search for ultimate reality. This is to accept the strengths and limitations of reductionism and seek to go beyond it to a more comprehensive and powerful kind of explanation. Evolution by natural selection is a paradigmatic example of such a theory: It is consistent with reductionism but transcends it in being ultimately historical and allowing causation to go both ways—from the more to less complex and the reverse. By attributing order to self-organization rather than to design from the outside, evolution by natural selection offers an essentially rational mode of understanding that avoids any mystical appeal to ultimate causes of things.

Another characteristic of such enlightened explanations is that they may be applied to whole systems, which contain both all their causes and all their observers. Such whole systems include the universe, societies, and ecologies. That is, there is no useful view from outside the system; instead, description and explanation are both pluralistic and relational, because they must take into account that any observer is situated inside the system. Rather than denying objectivity, this kind of approach rationalizes it, by rooting objectivity in what may be observed from many distinct viewpoints rather than in a mythical appeal to an "ultimate reality" or an imaginary viewpoint from outside the system. This makes possible both science—that is, knowledge without appeal to authority—and democracy in a pluralistic, multi-ethnic society.

This new kind of explanation characterizes much of modern biology, as well as recent approaches to complex and self-organized systems, whether economic, sociological, physical, or biological. Into this category also go new approaches to the foundations of quantum mechanics, which have been called relational quantum theory, and new approaches to explanation in cosmology, such as cosmological natural selection, the notion of internal observables, and varying speed-of-light cosmologies.

I believe that what John has called the Third Culture and the New Humanism is ultimately rooted in this pluralistic, relational approach to knowledge. But the divide between the older absolutes-seeking styles of thought and the newer pluralistic and relational approach does not run cleanly between the sciences and the humanities. Many of the key debates now animating science are between specialists whose philosophical predilections put them on either side of this divide. The debates between "many worlds" and relational approaches to quantum mechanics, or between string theorists and loop-quantum-gravity theorists, clearly reflect this larger division. So do the debates in evolutionary theory about the level and mechanisms of natural selection, and the debates among computer scientists concerning the possibility of strong artificial intelligence. At the same time, there are artists, philosophers, scholars, architects, and legal theorists whose work is an exploration of the implications of the new attitude toward knowledge. Among them are legal theorists such as Roberto Unger and Drucilla Cornell and artists and writers as diverse as Brian Eno and Pico Iyer.

Finally, it must be mentioned that what I have called a new approach to knowledge has very old roots. The seventeenth-century philosopher Leibniz was keenly aware that the world is a system of relations, and the American pragmatists (such as Peirce) were already confronting the implications of Darwinism for epistemology and philosophy in general a century ago. (Indeed, the simplest way to divide Old from New Humanists is to ask whether their writing shows an awareness of how radically Darwinian evolution changes the background for doing new work in philosophy.) But Leibniz's worldview was to a large extent put aside in favor of Newtonian physics, until it was revived in the twentieth century, while the pragmatists have never achieved the influence in the American academy that the deconstructionists have enjoyed. When graduate students in the humanities embrace Peirce and Dewey rather than Foucault and Derrida, and when they read Darwin rather than Hegel, we will be able to say that the New Humanism has come of age.

LEE SMOLIN, a theoretical physicist, is a founding member and a research physicist at the Perimeter Institute for Theoretical Physics, in Waterloo, Ontario. He is the author of The Life of the Cosmos and Three Roads to Quantum Gravity.



From: Douglas Rushkoff

I have lately been thinking about the lasting effects of modernism and science on religious narrative. Some cultural theorists may think we're in the age of "post-postmodernism," but our theologians are still contending with Descartes, Copernicus, Darwin, and Freud. The most profound impact of modernity is that we can no longer base the authority of our religious testaments on history; our myths and our gods are refuted by scientific reality. We lose our absolutes and the sense of certainty they afforded us.

So in march the postmodernists, from James Joyce to MTV, who learn to play in the house of mirrors, creating compositions and worldviews out of relativities. Entirely less satisfying (feels more like a Slurpee than hot oatmeal that actually fills you). We cultural theorists tried to make sense out of this world of self-references, as if it mattered. What we ended up with was a culture of inside jokes, cynicism, and detachment. Detachment was considered cool, and then "cool" itself was replaced by objectification. So all our kids walk around emulating the models in a Calvin Klein catalog, posturing through their lives, as if getting photographed were the supreme human achievement. Appearing in an ad or on a billboard could transform a person into an absolute—the benchmark against which others would define themselves.

But I believe this whole Vanity Fair culture, beginning with Joan Didion or Tom Wolfe and ending with David Sedaris or Dave Eggers, has run its course. We've grown sick of living in a vacuum and struggling to remain detached. It's no fun to read magazines through squinty, knowing smirks. We realize that detachment is a booby prize. We want to engage, meaningfully, in the stuff of life. In comes science. And with it comes good old-fashioned innocent awe. Science is not the force that corrupts our nature; it is the open-minded wonder that returns us to it. It is being welcomed back into the culture of narcissism, because we've finally grown tired enough of ourselves to care about something real. We ache to let go of our postured pretentiousness and surrender to that sensation a kid gets at Epcot Center or the planetarium.

The jaw drops, the eyes widen, the mind opens.

DOUGLAS RUSHKOFF, a professor of media culture at New York University's Interactive Telecommunications Program, is the author of Coercion: Why We Listen to What "They" Say and, most recently, Nothing Sacred: The Truth About Judaism.



From: Piet Hut

I, too, expect science to be able to deal with any aspect of reality, in due time. The only catch is that we don't have much of an idea what this future science will look like. This means that we can be proud of the method of science and the results that have been obtained so far, but we'd better be modest about claims that our current results more or less describe the world "as it really is." There are two directions in which to argue for this position.

1. Argument from the past. Remember how self-assured many of the leading physicists were toward the end of the nineteenth century? Fundamental physics seemed almost completed—and then suddenly relativity theory and quantum mechanics came along, offering a vastly different understanding of physical reality. Today we still admire the great contributions from people like Maxwell and Kelvin, but we have completely dropped their pictures of what the world really is like.

2. Argument from the future. Imagine living in the year 100,000 (in an optimistic picture in which civilization has not completely destroyed itself). Would it really be plausible that history books would tell you then that science developed in 500 years, from Galileo in 1600 to the year 2100, when the structure of reality was understood—with the rest being 97,900 years of footnotes? I find this extremely hard to believe. I consider it far more likely that we will continue to see "jaws dropping, eyes widening, minds opening," not only in popular presentations but at the very frontier of science as well.

This is why I don't expect science to be able to provide a valid alternative to a full worldview anytime soon. Whether we are looking for an ethical, humanistic, religious, or spiritual view of the world, including our own presence, science just isn't far enough along to address that quest. It makes more sense to use the scientific method to sift through the knowledge that has come down to us through the ages, to try to separate dogma and specific cultural trappings, while highlighting that which seems to be based most on empirical investigations.

Whatever will be discovered with our tools in, say, the year 52,003 already applies to the real world. And the question is, From the vantage point of 52,003, will our current scientific knowledge be seen as more helpful in leading a full life than our current religious and spiritual views? If we distill from the latter what is most closely related to experiential insights into the human mind, my guess would be that these will provide us with the more useful tools for quite a few centuries to come.

PIET HUT, a professor of interdisciplinary studies at the Institute for Advanced Study, in Princeton, is a founding member of the Kira Institute.



From: Marc D. Hauser

I read "The New Humanists" with interest, but actually think you have painted a caricature of both scientists and humanists. Somehow you have convinced yourself that the goals of humanists should be more closely aligned with those of science. I think this is a mistake. I think the problem with your essay is that in trying to make the argument that scientists have swallowed up the positions long held by humanists, you have actually blurred two important issues.

The first point concerns what any self-respecting intellectual should know about the world. You argue, and I concur, that one cannot be an educated member of the species Homo sapiens without knowing about the sciences. What the new humanists, as you call them, have done is open the door on some of the mysteries of science by making such information accessible to a general public. Making information accessible is of course all to the good. One might argue, and sometimes I have, that some of the information disseminated by scientists is presented in a way that is almost mischievously irresponsible. But that's another story. Returning to the main point, I fully agree that to remain woefully ignorant of the sciences is to remain in the bleachers of an intellectual life. But one could equally well accuse many scientists of remaining woefully ignorant of the humanities. I am often shocked and appalled by scientists who have never read some of the classics of literature, who know little about history, who continue to ignore insights from philosophy. The finger can be pointed both ways.

This brings us to the second point, which is unfortunately fused with the first. You seem to suggest that the humanities ought to have the same, or at least similar, goals as the sciences. You applaud humanists who think like scientists and you point the schoolmarm's finger at those who don't. The humanities can and should have different goals. Take, for instance, philosophy. Although I personally have a great affinity for the empirical philosophers such as Dennett, Fodor, Block, Stich, and Sober, I also enjoy reading work in the philosophy of ethics that toys with interesting moral (fantasy) dilemmas, in the philosophy of language that presents interesting twists on meaning and metaphor, and in the philosophy of mind that simply engages one to think about possible worlds. Many of these philosophical discussions explicitly ignore empirical work, because that is not the underlying mission. I don't think this is bad at all. It's healthy.

There is plenty of room for scientists to do their thing, humanists to do theirs, and for fertile interactions to arise between the two. I agree that the most fertile ground is in the interface zone, but that is a matter of taste!

Two smaller points:

1. You claim that science is an open system. I think you are very much wrong. There are significant constraints on science. Although science may well "move on," it is often constrained by dominant paradigms and often dominated by particularly powerful individuals. There are also ethical constraints, as evidenced recently by heated discussions concerning the use of information from the Human Genome Project to explore biomedical issues related to ethnic background.

2. On science, information, and quantity. The contrast with Moore's Law fails, in my opinion. I have never heard a scientist speak of the quantity of information. Sure enough, there are more journals now than at any time in the past, and all of us complain about keeping up. But I would rather think of science changing as a function of radical new ideas that open the door to looking at problems in new and exciting ways, as opposed to simply gaining new information. Each new paradigm shift changes the game. Sure, there is more information. But it is the new information guided by the new paradigm which is of interest. When Darwin provided his lightning bolt of intuition, he turned people around and caused them to look at problems in a new light. Yes, it led to more information, but quantity wasn't the issue. Similarly, when Noam Chomsky provided his lightning bolt of intuition into the structure of language, it generated immense data sets on the similarities among languages. But critically it provided a new way of looking for new information; again, quantity wasn't the issue.

MARC D. HAUSER is a cognitive neuroscientist at Harvard University, and author of Wild Minds: What Animals Really Think.



From: Mihaly Csikszentmihalyi

John, I do share with you the almost petulant impatience concerning what passes for scholarship in the humanities and the social sciences. The isolation from the rest of the world, the navel-gazing, the faddish swings and inbred coteries are not a pretty sight. But is this situation due to the perversity of humanists, or is it a temporary disease that just now happens to afflict the humanities? You seem to blame mostly the individuals involved, whereas I would hope that the problem resides with the way the humanities have been practiced in the past few generations.

The mandate of the sciences is to explore, discover, and create new ways of looking at the world and new ways of controlling physical processes. Some of this will be useful to humankind; some—such as nuclear waste, greenhouse gases, genetic changes—might yet be our bane. But because every culture (first, second, third...) tends towards hegemony and values dogma, we must pretend that science is an unmitigated blessing. And in the meantime, it is true, as you say, that the pursuit of science and its sexy daughter, technology, are a lot of fun for those involved in the chase.

What we expect from the humanities is something different. It is not the production of novelty but the selection, the evaluation of what is important, meaningful (dare I say “good”?)—and then the transmission of the selected human achievements to the next generation. And the next. Thus the role of the humanities is conservative, bridging the present and the future, with a view to the past. As you know, there cannot be evolution without a mechanism for screening novelties that improve life from inferior novelties: Producing novelty alone does not lead to adaptive change. To help in this process should be the role of the humanities.

Of course, by and large the humanities have abandoned that task. Why? There are surely many reasons, but one of the major ones is that the same criteria that make sense in science have been applied to the humanities. Assistant professors in philosophy or English are hired and promoted on the basis of the “originality” of their contributions—which forces them to come up with ever cuter novelties rather than to reflect on what is valuable and permanent. Young scholars are not rewarded for being good humanists but for applying the “explore, discover, create” approach to texts, in a superficial imitation of the sciences. If there is blame to assign, it is the recent success of the sciences that has helped erode the uniqueness of the humanities.

The domains of the humanities are in trouble. But there is less of a distinction between “scientists” and “humanists” than between the institutional structures and the social reward systems within which the two groups operate. As you report, there are now humanists who think like scientists. It is also probably true that the number of scientists who are provincial in their outlook, who ignore the long-term implications of their work, who disdain anyone outside their circle, is at least as large as that of the benighted humanists. The difference is that the scientists are doing a job appreciated by the majority, while the humanists are not.

My solution to this problem is in some ways the opposite of yours: The humanities need to rediscover their true calling and stick by it. Of course, this does mean that in order to evaluate, select, and transmit valuable knowledge, the humanist has to be acquainted with the products of science and understand their implications. It may no longer be possible for an artist to be at the forefront of science as Leonardo was, but the insularity of both camps ought to decrease. With a common fund of knowledge, the two endeavors can then proceed toward their respective goals.

MIHALY CSIKSZENTMIHALYI is the Davidson Professor of Management at the Claremont Graduate University, and author of Flow: The Psychology of Optimal Experience; and Finding Flow.



From: Denis Dutton

It may be tempting to regard as passé the triumph of your New Humanists over the decayed scholarship that has come to count as academic humanism in the last generation. But your thesis needs to be reiterated and elaborated. It touches a nerve, not least because you are talking about how whole careers have been wrought and rewarded in the century past, and how the organized pursuit of knowledge will be undertaken in the century ahead.

We cannot identify what the New Humanists are for without having some notion of what they stand against. As academic disciplines, the humanities, particularly studies of culture and the arts, have arrived at a dead end. If they were merely moribund it would be bad enough, but they have become a general laughing-stock. The annual meeting of the Modern Language Association is now a standard target for smart journalists looking to deflate the pompous and foolish; giggle-inducing jargon proliferates; and political tendentiousness replaces aesthetic insight in what modestly used to be called literary criticism.

The social reasons why the traditional humanities have split off from the rest of creative, productive thought are complex. Anyone who teaches at a university will know the difficulty of trying to get students to read the long fictional works that used to be the centerpieces of the curriculum of English and literature departments. It's easier to rename an area "cultural studies" and start watching movies and soap operas. The shallowness of discussions of popular culture requires that they be dressed up in impenetrable jargon. While no one would deny the need for a technical vocabulary in genetics, neuroscience, or physics, the jargon encountered in academic cultural studies has become a smokescreen for a lack of thinking: the emperor's clothing of choice.

The humanities have, in adopting jargon, tried to ape the sciences without grasping the actual nature of scientific thinking. In other respects, they have consciously and dogmatically rejected the scientific model altogether. Either way, the result, as you say, is that humanities academics have "so marginalized themselves that they are no longer within shouting distance of the action." E. M. Forster's celebrated phrase "Only connect" has become the misleading slogan of much academic research in the humanities. It misleads because useful thinking does so much more than "only connect" anything with anything else. In science, making connections is a matter of using observation to disclose the mechanisms that lie under experience and produce it: This means disregarding some classes of connection (my star sign and my personality) and deeply analyzing others (my genetic makeup and my eye color). Science advances by using experiment and observation to learn which connections are worth studying and which are pointless to pursue.

Deconstruction owed its popularity as a humanistic methodology to the free rein it gave this "only connect" mentality. As the connections are between words and ideas, the humanities are thus made into a closed system in which any connection (symbolic, metaphorical, however whimsical) is possible and valid. The system is shut off from any external regulation or constraint: Anything really does go. You're therefore right to say that the literary humanities have become self-referential: not only in the sense that they refer constantly to their own history but that they are unchecked by any external standard of reality. From this flows not only the hollowness (and therefore the jargon-mongering) but also the tiresome appeals to authority (name-dropping references replacing argument in scholarship) and the impulse to politicize questions (find the victim, name the oppressor) in order to imbue them with a sense of importance.

As the cheap fireworks of the "theory years" in the humanities fizzle and sputter, your New Humanists do in fact offer a revival of productive, creative thought for anyone who wants to understand better the nature of the human race. Science of the kind you champion supports itself on an independently existing reality: the physical and biological (evolved) universe as it is, independent of human will, including the wishful thinking of English professors. Even when dealing with traditional social and cultural achievements of human history, we need not fall into a "social constructionist" view of the human world. It is a historical fact that human beings have found myriad ways to construct their social and political arrangements, and endless avenues to express themselves artistically. It is equally true that history and anthropology both reveal universal human tendencies in societies and arts, and that the discovery of these universals is not just another social construction but has in principle an epistemic status equivalent to the discoveries of astronomy or genetics. It may be harder to count universal human values and tendencies than to count the planets, but that does not mean that it is pointless or impossible.

Yes, there is something new in the air, after two or three generations of humanistic scholarship that has run itself into the ground. You call it a "realistic biology of the mind." It is a view of humanity that takes the best from physics, biochemistry, evolutionary research and theory, genetics, anthropology, and even rigorous philosophy. It is keen to find an experimental and empirical basis for its general conclusions. It is, frankly, exciting. And best of all, it is just getting started.

DENIS DUTTON, a philosopher, is founder and editor of the Web publication Arts & Letters Daily, and editor of the journal Philosophy and Literature.


From: Daniel C. Dennett

I'm happy to join in the Third Culture victory dance, and I agree with most of what you have to say in your essay, but I also share some of the misgivings expressed, and would like to add a few of my own.

As Nick Humphrey urges, you should drop the paranoia. You've—we've—won. And as usual, there's a danger of squandering the spoils and ignoring some of the problems created or exacerbated by victory. As Mihaly Csikszentmihalyi notes, many of the problems in the humanities these days are due to misplaced science envy, misbegotten attempts to make the humanities more like the natural sciences. And as Marc Hauser says, your essay does contain some self-congratulatory caricatures.

Contrary to what you say, there are "systems" and "schools" in science, every bit as ruthless in the suppression of heresy as their counterparts in the humanities. Science abounds in received doctrines and authorities that one questions at the risk of being branded a fool or worse, and for every young humanities scholar writing fashionably formulaic drivel about one deservedly obscure poet or critic or another, there are several young scientists uncritically doing cookbook science, filling in the blanks of data tables that nobody will ever care to consult. I'm told that Sturgeon's Law is that 90 percent of everything is crap, and while I would be inclined to adjust that percentage to about 50 percent (I'm a softie, I guess), as far as I can see, the percentage, whatever it is, is not markedly lower in neuroscience than it is in literary theory. Don't make the mistake of comparing some of the best examples on one side with some of the worst on the other. Hebb's Rule—that if it isn't worth doing, it isn't worth doing well—could put a lot of scientists out of work along with their make-work colleagues in the humanities.

It's a two-way street. When scientists decide to "settle" the hard questions of ethics and meaning, for instance, they usually manage to make fools of themselves, for a simple reason: They are smart but ignorant. The reason philosophers spend so much of their time and energy raking over the history of the field is that the history of philosophy consists, in large measure, of very tempting mistakes, and the only way to avoid making them again and again is to study how the great thinkers of the past got snared by them. Scientists who think their up-to-date scientific knowledge renders them immune to the illusions that lured Aristotle and Hume and Kant and the others into such difficulties are in for a rude awakening.

DANIEL C. DENNETT is a philosopher at Tufts University and the author of Darwin's Dangerous Idea; Kinds of Minds; and Freedom Evolves.



From: Howard Rheingold

Because scientific propositions must be testable, and because questions of humanism versus science come down to how these ways of knowing affect our lives, I propose a test for the role of scientific understanding in human affairs: Can science improve life for most people alive today, and for our heirs, by understanding the nature of cooperation as profoundly as physicists understand matter and biologists understand the processes of life and evolution?

I suspect that if this question, above all others, is not answered soon by some method, all other questions are likely to become moot. Even if we stipulate the advent of a technological singularity in the manner of Vernor Vinge and Ray Kurzweil several decades hence, who today does not harbor at least a reasonable doubt that machine intelligence will mature and take over soon enough to prevent human intelligence from beating itself to death with its own creations?

I pose this as a scientific, not a philosophical question. Certainly the attempt to apply scientific methods to psyches, societies, markets, and civilizations has been less successful to this point than scientific probes into the nature of the cosmos, matter, and life itself. Does this mean that the atom or DNA of cooperation, the fundamental element of human collective good, is eternally elusive, perhaps in some Heisenbergian-Gödelian-Zen sense? Or does it mean that current scientific knowledge of human cooperation and conflict remains inadequate? This is a key question, because we know that science did move beyond age-old inadequate understandings of the physical world when the "new methods" of rational, empirical inquiry emerged from the work of Descartes, Newton, Galileo, and Bacon centuries ago. Is human social behavior beyond the understanding of science, or has science simply not caught up yet?

It isn't necessary to make a case to anyone who follows world events that some serious new thinking about solving the problems of genocide, warfare, terrorism, murder, assault—violent human conflict on all scales—is urgently needed. Traditionally, discourse about this aspect of human nature has been the province of the humanities. Can any scientist say with certainty, however, that such questions are forever beyond the reach of scientific inquiry? Investigations into the nature of disease meandered for centuries in unsupported theory and superstition. When optics and experimentation made possible the knowledge of the germ theory of disease, discovery and application of scientific knowledge directly alleviated human suffering.

Some general characteristics of cooperation among living organisms in general, and humans in particular, have emerged from biological and economic experiments using game theory and from sociobiological theories explaining the behavior of organisms. The use of computer simulations in Prisoner's Dilemma and other public-goods games and the application of public-goods games to human subjects have begun to provide the first pieces of the puzzle of how cooperation has evolved up to the present—and, most important, small clues to how it might continue to evolve in the future. Sociological studies of the way that some groups successfully manage common resources have illuminated a few general characteristics of cooperative groups. Recent economic studies of online markets have demonstrated the power of reputation systems. Social network analysis, experimental economics, and complex-adaptive-systems theory all provide relevant evidence. The evolution of social cooperation, aided and abetted by the evolution of technologies, has been the subject of meta-theories of social evolution.
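[To make concrete what such simulations look like, here is a minimal sketch, in Python, of an iterated Prisoner's Dilemma tournament of the general kind Rheingold refers to. It is illustrative only, not a reconstruction of any specific published model: the payoff values are the standard ones from Axelrod's tournaments, while the strategy set, the 200-round horizon, and all function names are assumptions chosen for brevity.

# Minimal sketch of an iterated Prisoner's Dilemma tournament (illustrative only).
# Payoffs follow Axelrod's standard values; the strategies and round count are
# arbitrary choices, not any particular study's design.

PAYOFF = {  # (my move, their move) -> my score; "C" = cooperate, "D" = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def always_defect(mine, theirs):
    return "D"

def always_cooperate(mine, theirs):
    return "C"

def tit_for_tat(mine, theirs):
    # Cooperate first, then mirror the opponent's previous move.
    return theirs[-1] if theirs else "C"

def grudger(mine, theirs):
    # Cooperate until the opponent defects once, then defect forever.
    return "D" if "D" in theirs else "C"

def play_match(strat_a, strat_b, rounds=200):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    strategies = [always_defect, always_cooperate, tit_for_tat, grudger]
    totals = {s.__name__: 0 for s in strategies}
    # Round-robin: every strategy meets every strategy, including itself.
    for a in strategies:
        for b in strategies:
            score_a, _ = play_match(a, b)
            totals[a.__name__] += score_a
    for name, total in sorted(totals.items(), key=lambda item: -item[1]):
        print(name, total)

Run as written, the reciprocating strategies (tit_for_tat and grudger) finish ahead of unconditional defection, echoing in miniature the qualitative result of Axelrod's much larger tournaments; the open scientific question Rheingold poses is whether this style of analysis can be scaled up to real societies.]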

The entire puzzle of how groups of different sizes agree to cooperate, why and how cooperation breaks down, and how conflicts arise, intensify, and resolve remains largely unsolved. But the puzzle pieces from a dozen different disciplines are beginning to fit together to reveal larger patterns. Part of the current lack of understanding may stem from the nature of specialized scientific inquiry: Biologists, economists, psychologists, sociologists, anthropologists, computer scientists, game theorists, and political scientists have only recently begun to suspect that they hold parts of the same puzzle. It has taken some time for those studying cooperation, reputation, and conflict to recognize the need for interdisciplinary syntheses.

The practical chances of this proposed test of the power of science to do what the humanities have tried to do for centuries depend on whether someone marshals resources and spurs organizational motivation for a full-scale, cross-disciplinary effort to understand cooperation. Unlike knowledge that might lead to new weapons, new media, or new medicines, no organizational or economic structure currently exists to support an Apollo Program of cooperation. And even the best organized and funded effort can't guarantee that an answer exists, or that it won't take a century to discover. The consequences of failure might or might not be the end of all cultures, but if scientific inquiry does succeed in elucidating the nature and dynamics of social cooperation, it will have proved its superiority as a way of knowing that can improve the way most people live. Curing diseases was impressive. Curing conflict would be proof.

HOWARD RHEINGOLD is a communications theorist; his books include The Virtual Community and Smart Mobs: The Next Social Revolution.



From: Chris Anderson

First off, the philosopher in me suspects there is some language confusion seeping into this discussion. Both Marc Hauser and Mihaly Csikszentmihalyi seem to characterize your essay as championing the cause of "scientists" over "humanists." But I think in fact you are arguing that Third Culture scientists have now been joined by enlightened new thinkers from the humanities and that together they can lay claim to the term "humanists."

So I have two questions.

1. Are you sure you want to use the term "humanist" as the banner to unite under? In his controversial speech at TED in 2002, Richard Dawkins pointed out that there is a kind of species-ism inherent in the term that runs counter to some of the most profound insights of the Third Culture revolution...that we are special, but still just part of a much bigger, mind-bogglingly complex evolutionary process that (in your own words) is at an early stage. Dawkins's preferred banner of "atheist" has its own problems. (Why use a negative to define something that is profoundly positive?) If the goal is to reference Michelangelo and Leonardo da Vinci, how about "Renaissance thinker"? Actually this would be a great forum for you to canvass alternatives. "Rationalist"? "Universalist"? There's a lot of historical baggage whichever way you turn.

2. How far can the revolution go without the "humanists" providing something to replace the role of religion? Suppose it turns out that religious instinct and consequent religious group behavior have been a part of our species since sentience first arose? Then the assumption of some scientists that the new intellectual framework they've provided means religion can be abandoned may be as mistaken as the now discredited belief that cultures can simply reinvent sexual and moral norms. Maybe most societies just need religious expression as part of being human. What's interesting is that science, or at least the breathtakingly mysterious world unveiled by science, is potentially capable of filling that role. As Douglas Rushkoff says: "The jaw drops, the eyes widen, the mind opens." But so far this is typically experienced by an individual alone. There is no venue for a group celebration of the mystery of our planet and universe. The very idea seems embarrassing. Yet without the group experience, it is possible that the psychological appeal of church, mosque, and synagogue will be too strong for the revolution you believe in ever to become more than the conviction of an enlightened minority. Howard Rheingold asks whether science can crack the problem of "cooperation." It's a key question. But even more important may be whether it can ever inspire cooperation.

CHRIS ANDERSON, a philosopher by training, is the chairman and host of the TED (for Technology, Entertainment, and Design) Conference held each February in Monterey, California.



John Brockman, Editor and Publisher
Russell Weinberger, Associate Publisher

contact: [email protected]
Copyright © 2003 by Edge Foundation, Inc. All Rights Reserved.