Edge 304—November 13, 2009
(German-language original: Führende Wissenschafter beantworten die Frage: Was ist Ihre gefährlichste Idee? ["Leading scientists answer the question: What is your most dangerous idea?"])
By Robert Buchacher
He is "a kind of thinker, that does not exist in Europe," said La Stampa, the international Turin newspaper. The New York writer, literary agent, corporate and political advisor, John Brockman, 68, is a sore thumb, a maverick brings people and ideas under the same roof, which at first glance don't go together at all. One of his many books is titled: Einstein, Gertrude Stein, Wittgenstein and Frankenstein (1993). Brockman loves the challenge, he is a great lover of art, science, technology, media and the Internet. An intellectual catalyst.
Fascinated by new and unusual ideas, he is an assiduous networker. According to his friend Richard Dawkins, he has "the most enviable address book in the English-speaking world." In 1997 he created the Internet platform Edge (www.edge.org), a sort of Facebook of thinkers, where minds present not just their own ideas and projects but also comment on the thoughts of others, "deliberately in a spirit of provocation," as Brockman says. In his own words, Edge presents "speculative ideas, explores new territory in the fields of evolutionary biology, genetics, computer science, neurophysiology, psychology and physics, and answers questions like: What are the origins of the universe, of life, the mind?" From the most exciting answers Brockman has compiled a book that has recently appeared in German. Supplemented by contributions from high-profile Austrian scientists, we publish excerpts from the book entitled "What Is Your Dangerous Idea? The Leading Scientists of Our Time Think the Unthinkable," edited by John Brockman.
J. Craig Venter; Paul C. W. Davies; Rodney Brooks; Paul W. Ewald; Martin Rees; Samuel Barondes; John Horgan; Peter C. Aichelburg; Ray Kurzweil; Mihaly Csikszentmihalyi; Josef Smolen; Georg Wick; Clifford Pickover; Lawrence M. Krauss; Michael Freissmuth; Jordan Pollack; Haim Harari.
We make technology, but our technology also makes us. At the online science/culture journal Edge, BB pal John Brockman went deep -- very deep -- into this concept. Frank Schirrmacher is co-publisher of the national German newspaper FAZ and a very, very big thinker. Schirrmacher has raised public awareness and discussion about some of the most controversial topics in science research today, from genetic engineering to the aging population to the impacts of neuroscience. At Edge, Schirrmacher riffs on the notion of the "informavore," an organism that devours information like it's food. After posting Schirrmacher's thoughts, Brockman invited other bright folks to respond, including the likes of George Dyson, Steven Pinker, John Perry Barlow, Doug Rushkoff, and Nick Bilton. Here's a taste of Schirrmacher, from "The Age of the Informavore" [...]
JOHN BROCKMAN: What exactly is "the cybernetic idea"? Well, it's not to be confused with the discipline of cybernetics, which hit a wall, and stopped evolving during the 1950s. And it's not your usual kind of idea. The cybernetic idea is an invention. A very big invention. The late evolutionary biologist Gregory Bateson called it the most important idea since the idea of Jesus Christ. [...]
ANNALENA MCAFEE: Unlike your best friend, or the long-vanished bookstore owner, or the former manager of the defunct record shop — all of whom made a number of unintentionally insulting errors of taste — these predictive programs get it right 90 per cent of the time. I am willing to trade my free will — surely already compromised by my birthplace, my parents' religion and circumstances, my genetic inheritance — for these time-saving and life-enriching programs. [...]
GEORGE DYSON: Response to John Bargh ... First we had digital representations of existing ideas. Then we had digital expressions of new, previously unrepresented ideas. And now we have network processes (including human collaboration) that might actually be ideas. ... [...]
At a dinner in the mid-sixties, the composer John Cage handed me a copy of Norbert Wiener's book, Cybernetics. He was talking about "the mind we all share" in the context of "the cybernetic idea". He was not talking about Teilhard de Chardin, the Noosphere, or any kind of metaphysics.
The cybernetic idea was built from Turing's Universal Machine in the late thirties; Norbert Wiener's work during World War II on automatic aiming and firing of anti-aircraft guns; John von Neumann's theory of automata and its applications (mid-forties); Claude Shannon's landmark paper founding information theory in 1948.
What exactly is "the cybernetic idea"? Well, it's not to be confused with the discipline of cybernetics, which hit a wall, and stopped evolving during the 1950s. And it's not your usual kind of idea. The cybernetic idea is an invention. A very big invention. The late evolutionary biologist Gregory Bateson called it the most important idea since the idea of Jesus Christ.
The most important inventions involve the grasping of a conceptual whole, a set of relationships which had not been previously recognized. This necessarily involves a backward look. We don't notice it. An example of this is the "invention" of talking. Humans did not notice that they were talking until the day someone said, "We're talking." No doubt the first person to utter such words was considered crazy. But that moment was the invention of talking, the recognition of pattern which, once perceived, had always been there.
So how does this fit in with the cybernetic idea?
It's the recognition that reality itself is communicable. It's the perception that the nonlinear extension of the brain's experience — the socialization of mind — is a process that involves the transmission of neural pattern — electrical, not mental — that's part of a system of communication and control that functions without individual awareness or consent.
This cybernetic explanation tears apart the fabric of our habitual thinking. Subject and object fuse. The individual self decreates. It is a world of pattern, of order, of resonances. It's an undone world of language, communication, and pattern. By understanding that the experience of the brain is continually communicated through the process of information, we can now recognize the extensions of man as communication, not as a means for the flow of communication. As such they provide the information for the continual process of neural coding.
How is this playing out in terms of the scenarios presented by Frank Schirrmacher in his comments about the effect of the Internet on our neural processes? Here are some random thoughts inspired by the piece and the discussion:
Danny Hillis once said that "the web is the slime mold of the Internet. In the long run, the Internet will arrive at a much richer infrastructure, in which ideas can potentially evolve outside of human minds. You can imagine something happening on the Internet along evolutionary lines, as in the simulations I run on my parallel computers. It already happens in trivial ways, with viruses, but that's just the beginning. I can imagine nontrivial forms of organization evolving on the Internet. Ideas could evolve on the Internet that are much too complicated to hold in any human mind." He suggested that "new forms of organization that go beyond humans may be evolving. In the short term, forms of human organization are enabled."
Schirrmacher reports on Gerd Gigerenzer's idea that "thinking itself somehow leaves the brain and uses a platform outside of the human body. And that's the Internet and it's the cloud. And very soon we will have the brain in the cloud. And this raises the question of the importance of thoughts. For centuries, what was important for me was decided in my brain. But now, apparently, it will be decided somewhere else."
John Bargh notes that research on the prediction and control of human judgment and behavior has become democratized. "This has indeed produced (and is still producing) an explosion of knowledge of the IF-THEN contingencies of human responses to the physical and social environment … we are so rapidly building a database or atlas of unconscious influences and effects that could well be exploited by ever-faster computing devices, as the knowledge is accumulating at an exponential rate." The import of Bargh's thinking is that the mere existence of a social network becomes an unconscious influence on human judgment and behavior.
George Dyson traces how numbers have changed from representing things, to meaning things, to doing things. He points out that the very activity involved in the socialization of mind means that "we have network processes (including human collaboration) that might actually be ideas."
What does all this add up to?
Schirrmacher is correct when he points out that in this digital age we are going through a fundamental change, one that includes how our brains function. But the presence or absence of free will is a trivial concern next to the big challenge confronting us: to recognize the radical nature of the changes that are occurring and to grasp an understanding of the process as our empirical advances blow apart our epistemological bases for thinking about who and what we are. "We're talking."
The editor of the Wall Street Journal is reported to have accused Google of "encouraging promiscuity", which, if true, is a serious charge. His complaint, however, refers not to any advocacy of lax sexual behaviour but to Google News, whose daily aggregate of enticing headlines from around the world discourages old-style loyalty among readers. Once we plighted our troth for life, for better or worse, circulation rise or slump, to one newspaper delivered regularly to our door or bought at newsagents. Now, dazzled by the daily digital passeggiata, we've turned our backs on the stale pleasures of familiarity and spend heady hours communing with whichever passing news providers take our fancy. All for free. We barely pause to ask their names. Monetary issues apart, is this a bad thing? Or even new? Isn't it about choice? Survival of the slickest. I believe they call it the market. Once Rupert Murdoch has worked out a way of actually charging these fickle punters for his services, surely everyone will be happy and it will be business as usual.
So in the absence of hard evidence about the baleful effects of the Internet, one has to resort to anecdote and ill-informed personal observation. These are mine. Young people are on the whole nicer in the digital age. They are happier to spend time with, or at least tolerate the company of, adults than was my own generation of posturing post-soixante-huitards. The digital generation values friendship and understands reciprocity more than its earlier counterparts and is more emotionally insightful and expressive, qualities which I speculate may be enhanced by social networking and texting. These young people may not necessarily be more literate or informed but, unlike previous generations, they know exactly where to go when information is required.
Yes, the personal blogs and social network sites have unleashed an embarrassing pandemic of exhibitionism. But, and here is the liberating thing, we don't have to read it. Look away. And this compulsive urge to tell all in blush-makingly dull detail is no departure in terms of human behaviour. People have been writing painstaking accounts of their inconsequential lives since they first learned the alphabet and got hold of cheap stationery. For every Samuel Pepys, James Boswell or Virginia Woolf there were legions of tedious self-regarding monologuists, showing off to an imagined posterity.
As to predictive programs and iTunes "Genius", don't they serve a useful and very old-fashioned purpose? Like your best friend, or the cheerful proprietor of the neighbourhood bookstore, or the dreadlocked manager of your favourite record shop, they know what you like and will helpfully alert you when that artist or writer, or someone operating in a similar field, is about to produce.
I'm alarmed, though, at the prospect that I might have to understand exactly how they work; I cannot imagine buckling down to reviews of "the structures of software", alongside critiques of the new biographies of John Cheever or Ayn Rand. I had always assumed that one of the pleasures of civilization was being relieved of our ancestors' obligation to know how things work. I open my fridge in the morning and, while I am aware that it achieves its effect with compressed gas and electricity, I could not tell you how it works, only that it is pleasingly cold. I reach for a pint of milk, untroubled by my ignorance of modern milking techniques, and my landline rings, by a process which, for all I know, might as well be sorcery.
There are many things I do not know and do not care to know, and I am sure refrigerator engineers, cattle farmers and telephone operatives would have no desire to acquaint themselves with my own small area of expertise. Granted, I find computers more intrinsically interesting than fridges or phones or cows and I did spend some time learning MS-DOS and HTML. But in our fast-moving age, these skills are now as useful to me as fluency in Greenlandic Norse or Crimean Gothic. I no longer have the time or patience to find out how it works; just show me what it does.
There is an anxiety that we're all like fat frat boys gorging ourselves at the free, 24-hour, all-you-can-read information buffet. Even here, though, in bouts of online gluttony, we display timeless human traits: the urge to binge in times of plenty, feasting till we're queasy on roasted mammoth, since instinct tells us there might be nothing to eat out there again for a month or two. But our systems are their own regulators. We can only take so much. After a while we long for a simple glass of water, an indigestion pill and wholesome human pleasures, which may or may not involve a book (electronic or paper), music (iPod or live), sport, landscape, love. And as one of your correspondents writes, the young — for whom digital innovation is an unremarkable fact of life — are better at handling the screen-life balance than their seniors, who are too often awestruck by innovation and waylaid by serendipity. The young take for granted today's surfeit of mammoths and they moderate their intakes accordingly.
To baulk at this ease of access, to pathologise online abuse and ignorance, is to behave like a medieval monk, horrified by the dawn of secular literacy and fearful that his illuminated manuscripts will fall into unclean hands. There will always be more penny dreadfuls than priceless masterworks; there is only one Book of Kells but there are many, many Da Vinci Codes. And there is room for them all. No shortage of shelf space here. No shortage of readers, either.
You might be frowning into your screen over findings on the absence of retroviral restriction in cells of the domestic cat, he might be surfing Google News for updates on Britney's latest tour, she might be assessing commercial possibilities of hydrogen-producing algae, while they browse the UFO sites, book a cheap flight, and check the last stanza of a John Donne sonnet. It's all there for the taking. How you use it is the point. And who knows? The Britney fan, who might never have strayed into a library in his life, could one day find himself momentarily sidetracked by a website about the New Zealand poet Charles Edgar Spear, click through a hyperlink to TS Eliot and develop a passion for modernist verse. Or, more likely, drawn by a wittily-worded account of Britney's current Australian tour in one of Mr Murdoch's publications, he might eschew his libertine ways, shell out for an online subscription to the newspaper, and settle down to a life of blameless monogamy.
In the beginning, numbers represented things. Digital encoding then gave numbers the power of meaning things. Finally, the order codes were unleashed, and numbers began doing things.
The cultural and intellectual transitions described by Schirrmacher and his commentators are higher-level manifestations of this. First we had digital representations of existing ideas. Then we had digital expressions of new, previously unrepresented ideas. And now we have network processes (including human collaboration) that might actually be ideas.
Is free will ours to lose?
Gigerenzer v. Thaler. Decision-making: "Risk School." Can the general public learn to evaluate risks accurately, or do authorities need to steer it towards correct decisions? Michael Bond talks to the two opposing camps. Nature [...]
Inflation does not provide a natural explanation for why the early universe looks like it does unless you can give me an answer for why inflation ever started in the first place. That is not a question we know the answer to right now. That is why we need to go back before inflation, to before the Big Bang, into a different part of the universe, to understand why inflation happened rather than something else. There you get into branes and the cyclic universe. ... I really don't like any of the models that are on the market right now. We really need to think harder about what the universe should look like.
WHY DOES THE UNIVERSE LOOK THE WAY IT DOES?
SEAN CARROLL, a theoretical physicist, is a senior research associate at Caltech. His research interests include theoretical aspects of cosmology, field theory, and gravitation. He is the author of Spacetime and Geometry: An Introduction to General Relativity and From Eternity to Here: The Quest for the Ultimate Theory of Time, and he is a cofounder of and contributor to the Cosmic Variance blog.
WHY DOES THE UNIVERSE LOOK THE WAY IT DOES?
[SEAN CARROLL:] Why does the universe look the way it does?
We are in a very unusual situation in the history of science where physics has become slightly a victim of its own success. We have theories that fit the data, which is a terrible thing to have when you are a theoretical physicist. You want to be the one who invents those theories, but you don't want to live in a world where those theories have already been invented, because then it becomes harder to improve upon them when they just fit the data. What you want are anomalies given to us by the data that we don't know how to explain.
One of the interesting things about the string theory situation, where we are victims of our own success, where we have models that fit the data very well but we are trying to move beyond them, is that the criteria for success have changed a little bit. It's not that one theory or another makes a prediction that you can go out and test tomorrow. We all want to test our ideas eventually, but it becomes a more long-term goal when it's hard to find data that doesn't already agree with the existing theories. We know that the existing theories aren't right and we need to move beyond them.
I have an opinion, which is slightly heterodox, about the standard ideas in cosmology. The inflationary universe scenario, which Alan Guth pioneered and which people like Andrei Linde and Paul Steinhardt pushed very hard, is a wonderful idea, and I suspect it is right. I suspect that some part of the history of the universe is correctly explained by the idea of inflation, the idea that we start in this little tiny region that expanded and accelerated at this super-fast rate. However, I think that the way most people, including the people who invented the idea, think about inflation is wrong. They are too sanguine about the idea that inflation gets rid of all the problems that the early universe might have had. There is this feeling that inflation is like confession — that it wipes away all prior sins. I don't think that is right. We haven't explained what needs to be explained until we take seriously the question of why inflation ever started in the first place. It's actually a mistake on the part of many of the people who buy into inflation to think that inflation doesn't need to answer that question, because once it starts it answers all the questions that you have.