
Edge 303—November 5, 2009
[7,050 words]

THE REALITY CLUB

On "THE AGE OF THE INFORMAVORE": A Talk with Frank Schirrmacher
Jaron Lanier, Nick Bilton, Nick Carr, Douglas Rushkoff, Jesse Dylan, Virginia Heffernan, Gerd Gigerenzer, John Perry Barlow, Steven Pinker, John Bargh

On ARE THE DISCIPLINARY BOUNDARIES PERMEABLE? (STUTTGARTER ZEITUNG)
Gábor Paál responds to
Michael Naumann


THIRD CULTURE NEWS
TERCERA CULTURA — CHILE
A podcast popularizing contemporary Cognitive Science

BEYOND EDGE
Jerry Adler, Alison Gopnik, Stewart Brand, Jerry Coyne, Scott Atran


On "THE AGE OF THE INFORMAVORE": A Talk with Frank Schirrmacher

Jaron Lanier, Nick Bilton, Nick Carr, Douglas Rushkoff, Jesse Dylan, Virginia Heffernan, Gerd Gigerenzer, John Perry Barlow, Steven Pinker, John Bargh



JOHN BARGH: The discovery of the pervasiveness of situational priming influences for all of the higher mental processes in humans does say something fundamentally new about human nature (for example, how tightly tied and responsive is our functioning to our particular physical and social surroundings).  It removes consciousness or free will as the bottleneck that exclusively generates choices and behavioral impulses, replacing it with the physical and social world itself as the source of these impulses. [...]

STEVEN PINKER: I would suggest another way to look at the effects of technology on our collective intelligence. Take the intellectual values that are timeless and indisputable: objectivity, truth, factual discovery, soundness of argument, insight, explanatory depth, openness to challenging ideas, scrutiny of received dogma, overturning of myth and superstition. Now ask, are new technologies enhancing or undermining those values? [...]

JOHN PERRY BARLOW: I have always wanted to convey to every human being the Right to Know — the protected technical means to fulfill all curiosities with the best answers human beings had yet derived — but the Ability to Know (Everything) is a capacity we don't and won't possess individually. [...]

GERD GIGERENZER: We might think of mentality and technology as two sides of the same coin, as a system in which knowledge, skills, and values are distributed.  This requires a new type of psychology that goes beyond the individual and studies the dynamics of human adaptation to the very tools humans create. [...]

VIRGINIA HEFFERNAN: ... there is a great deal of anxiety, irritation, unease and impatience in Internet use. There is even some self-loathing. What am I doing on the Web—when I used to read books bound in Moroccan leather; stroll in the sunshine; spend hours in focused contemplation of Hegel or Coleridge? [...]

JESSE DYLAN: How the human brain must adapt to the modern era and where those changes will take us are a mystery. What knowledge will a person need in the future when information is ubiquitous and all around us? Will predictive technologies do away with free will? Google will be able to predict whether you are enjoying the Neil Young concert you are attending before you yourself know. Science fiction becomes reality. [...]

DOUGLAS RUSHKOFF: We continue to build and accept new technologies into our lives with little or no understanding of how these devices have been programmed. We do not know how to program our computers. We spend much more time and energy trying to figure out how to program one another, instead. And this is potentially a grave mistake. [...]

NICHOLAS CARR: "Importance is individualism," says Nick Bilton, reassuringly. We'll create and consume whatever information makes us happy, fulfills us, and leave the rest by the wayside. Maybe. Or maybe we'll school like fish in the Web's algorithmic currents, little Nemos, each of us convinced we're going our own way because, well, we never stop talking, never stop sharing the minutiae of our lives and thoughts. Look at me! Am I not an individual? [...]

NICK BILTON: The new generation, born connected, does not feel the need to consume all the information available at their fingertips. They consume what they want and then affect or change it, they add to it or negate it, they share it and then swiftly move along the path. They rely on their community, their swarm, to filter and share information and in turn they do the same; it's a communism of content. True ideology at its best. [...]

JARON LANIER: To continue to perceive almost supernatural powers in the Internet (an ascendant perception, as Schirrmacher accurately reports) is to cede the future to reactive religious fanatics. [...]

GEORGE DYSON: When you are an informavore drowning in digital data, analog looks good. [...]

DANIEL KAHNEMAN: The link with Bargh is also interesting, because John pushes the idea that we are driven from the outside and controlled by a multitude of cues of which we are only vaguely aware — we are bathing in primes. [...]


JOHN BARGH
Social Psychologist, Yale University; Director of the ACME (Automaticity in Cognition, Motivation and Evaluation) Lab

I tend to worry less about information overload at the personal, individual level and more about it at the societal and governmental level. The human brain is long used to being overloaded with sensory information, throwing most input away in the first half-second after sensing it; we are constantly bombarded by 'primes' or implicit suggestions as to what to think, feel, and do — yet we usually manage to do one thing at a time, stably.  The brain is used to dealing with conflicting messages too, and to managing and integrating the activity of so many physiological and nervous subsystems — but, as the work of Ezequiel Morsella is showing, it keeps all of that management out of conscious view so that we never experience it.

We are already and have long been multitaskers, in other words, we just do it (so well) unconsciously, not consciously.  It is conscious multitasking (talking on the phone while driving) that we are so bad at because of the limits of conscious attention, but multitasking per se — we are built for that. As we gain skills those skills require less and less of that conscious attention so that an expert such as Michael Jordan, or today, Kobe or Lebron, can consciously plot his strategy for weaving through a maze of defenders down the court because his limited conscious attention is no longer needed for dribbling, body movements, head fakes, and so on.  Driving a car requires incredible multitasking at first but is soon much less difficult because the multitasking 'moves downstairs' and out of the main office, over time.

But Schirrmacher is quite right to worry about the consequences of a universally available digitized knowledge base, especially if it concerns predicting what people will do.  And most especially if artificial intelligence agents can begin to search and put together the burgeoning database about what situation (or prime) X will cause a person to do. The discovery of the pervasiveness of situational priming influences for all of the higher mental processes in humans does say something fundamentally new about human nature (for example, how tightly tied and responsive is our functioning to our particular physical and social surroundings).  It removes consciousness or free will as the bottleneck that exclusively generates choices and behavioral impulses, replacing it with the physical and social world itself as the source of these impulses.

But the discovery that people are actually rather easy to influence and predict (once we know the triggering environmental cues or prompts) is in fact today being exploited as a research tool, because we now know that we can activate and study complex human psychological systems with very easy priming manipulations.  A quarter century ago the methods to activate (to then study) aggressive or cooperative tendencies were more expensive and difficult, involving elaborate deceptions, confederates, and staged theatrics.  It is said that the early cognitive dissonance theorists such as Elliot Aronson used to routinely have their graduate students take theater classes.  And other social psychologists of that generation, such as Richard Nisbett, have publicly complained (in a good-natured way) about 'rinky-dink' priming manipulations that somehow produce such strong effects. (This reminds me of Kahneman and Tversky's representativeness heuristic; here, the belief that complex human outputs must require complex causes.)

It is because priming studies are so relatively easy to perform that this method has opened up research on the prediction and control of human judgment and behavior, 'democratized' it, basically, because studies can be done much more quickly and efficiently, and done well even by relatively untrained undergraduate and graduate students.  This has indeed produced (and is still producing) an explosion of knowledge of the IF-THEN contingencies of human responses to the physical and social environment.  And so I do worry with Schirrmacher on this score, because we are so rapidly building a database or atlas of unconscious influences and effects that could well be exploited by ever-faster computing devices, as the knowledge is accumulating at an exponential rate.

More frightening to me still is Schirrmacher's postulated intelligent artificial agents who can, as in the Google Books example, search and access this knowledge base so quickly, and then integrate it to be used in real-time applications to manipulate the target individual to think or feel or behave in ways that suit the agent's (or its owner's) agenda of purposes. (Of course this is already being done in a crude way through advertising, both commercial and political; we have just shown for example that television snack food ads increase automatic consumption behavior in the viewer by nearly 50%, in children and adults alike.)


STEVEN PINKER
Harvard College Professor and Johnstone Family Professor of Psychology, Harvard University; Author, The Stuff of Thought

You're at a dinner in a restaurant, and various things come up in conversation — who starred in a movie, who was president when some event happened, what some religious denomination believes, what the exact wording is of a dimly remembered quotation. Just as likely as not, people around the table will pull out their iPhones, their Blackberries, their Androids, and search for the answer. The instant verification not only eases the frustration of the countless tip-of-the-tongue states that bog down a conversation, but offers a sobering lesson on how mistaken most of us are most of the time.

You'll be amazed at the number of things you remember that never happened, at the number of facts you were certain of that are plainly false. Everyday conversation, even among educated people, is largely grounded in urban legends and misremembered half-truths. It makes you wonder about the soundness of conventional wisdom and democratic decision-making — and whether the increasing availability of fact-checking on demand might improve them. 

I mention this because so many discussions of the effects of new information technologies take the status quo as self-evidently good and bemoan how intellectual standards are being corroded (the "google-makes-us-stoopid" mindset). They fall into the tradition of other technologically driven moral panics of the past two centuries, like the fears that the telephone, the telegraph, the typewriter, the postcard, radio, and so on, would spell the end of civilized society.

Other commentaries are nonjudgmentally fatalistic, and assume that we’re powerless to evaluate or steer the effects of those technologies — that the Internet has a mind and a will of its own that’s supplanting the human counterparts. But you don’t have to believe in "free will" in the sense of an immaterial soul to believe in "free will" in the sense of a goal-directed, intermittently unified, knowledge-sensitive decision-making system. Natural selection has wired that functionality into the human prefrontal cortex, and as long as the internet is a decentralized network, any analogies to human intentionality are going to be superficial.

Frank Schirrmacher’s reflections thankfully avoid both extremes, and I would suggest another way to look at the effects of technology on our collective intelligence. Take the intellectual values that are timeless and indisputable: objectivity, truth, factual discovery, soundness of argument, insight, explanatory depth, openness to challenging ideas, scrutiny of received dogma, overturning of myth and superstition. Now ask, are new technologies enhancing or undermining those values? And as you answer, take care to judge the old and new eras objectively, rather than giving a free pass to whatever you got used to when you were in your 20s.

One way to attain this objectivity is to run the clock backwards and imagine that old technologies are new and vice versa. Suppose someone announced: "Here is a development that will replace the way you’ve been doing things. From now on, you won’t be able to use Wikipedia. Instead you’ll use an invention called The Encyclopedia Britannica. You pay several thousand dollars for a shelf-groaning collection of hard copies whose articles are restricted to academic topics, commissioned by a small committee, written by a single author, searchable only by their titles, and never change until you throw out the entire set and buy new ones." Would anyone argue that this scenario would make us collectively smarter?

If social critics started to scrutinize the immediate past and obsolescing present and not just the impending future, our understanding of the effects of technology on intellectual quality would be very different. The fact is that most of our longstanding, prestigious informational institutions are, despite their pretensions, systematically counter-intellectual. In the spirit of the technophobe screeds, let me describe them in blunt, indeed hyperbolic terms.

Many of the articles in printed encyclopedias stink — they are incomprehensible, incoherent, and instantly obsolete. The vaunted length of the news articles in our daily papers is generally plumped out by filler that is worse than useless: personal-interest anecdotes, commentary by ignoramuses, pointless interviews with bystanders ("My serial killer neighbor was always polite and quiet"). Precious real estate in op-ed pages is franchised to a handful of pundits who repeatedly pound their agenda or indulge in innumerate riffing (such as interpreting a "trend" consisting of a single observation). The concept of "science" in many traditional literary-cultural-intellectual magazines (when they are not openly contemptuous of it) is personal reflections by belletristic doctors. And the policy that a serious book should be evaluated in a publication of record by a single reviewer (with idiosyncratic agendas, hobbyhorses, jealousies, tastes, and blind spots) would be risible if we hadn’t grown up with it.

For all their flaws, media such as Wikipedia, news feeds, blogs, website aggregators, and reader reviews offer the potential for great advances over the status quo — not just in convenience but in intellectual desiderata like breadth, rigor, diversity of viewpoints, and responsibility to the factual record. Our intellectual culture today reflects this advance — contrary to the Cassandras, scientific progress is dizzying; serious commentary on the internet exceeds the capacity of any mortal reader; the flow of philosophical, historical, and literary books (many of doorstop length) has not ebbed; and there is probably more fact-checking, from TV news to dinner tables, than at any time in history. Our collective challenge in dealing with the Internet is to nurture these kinds of progress.


JOHN PERRY BARLOW
Co-founder, Co-Chair, Electronic Frontier Foundation; Cyberspace pioneer ("The Jefferson of the Internet")

I am the very definition of fiercely mixed feelings on this subject.

I have always wanted to convey to every human being the Right to Know — the protected technical means to fulfill all curiosities with the best answers human beings had yet derived — but the Ability to Know (Everything) is a capacity we don't and won't possess individually.

Even as we can drill deeper into the collectively-known, our ability to know the collective becomes more superficial.

More than ever, we have to trust the formation of Collective Consciousness, the real Ecosystem of Mind.


GERD GIGERENZER
Psychologist; Director of the Center for Adaptive Behavior and Cognition, Max Planck Institute for Human Development, Berlin; Author, Gut Feelings: The Intelligence of the Unconscious

Technology and Mentality

Frank Schirrmacher asks, does new technology change human cognition and behavior, and if so, how? This question is a true wake-up question, but its answer is far from obvious. The technophobe might conjecture that new technologies grow smarter while humans grow dumber, like my bank accountant yesterday, who could not calculate 20% of 500 euros without a pocket calculator. The technophile would respond that everything simply gets better, just as eyesight improves with glasses and friendship becomes easier with Facebook.

But there is a more interesting answer: the dynamic symbiosis of technology and mentality. A symbiosis is to the mutual benefit of two different species but requires mutual adaptation. Consider the invention that has changed human mental life more than anything else, writing and, subsequently, the printing press. Writing made analysis possible: One can compare texts, which is difficult in an oral tradition.

Writing also made exactitude possible, as in higher-order arithmetic; without a written form, these mental skills quickly reach their limits. But writing makes long-term memory less important than it once was, and schools have largely replaced the art of memorization with training in reading and writing. So it’s neither loss nor gain, but both. And this means new adaptations between mentality and technology. In turn, new abilities create new tools that support new abilities, and so the spiral evolves.

The computer is another instance. The invention of the computer has been described as the third information revolution, after the advent of writing and the printing press. As early as the 1960s, the electrical engineer Doug Engelbart had designed the first interactive computer tools, including the mouse, on-screen editing, screen windows, hypertext, and electronic mail. However, at that time, human-computer interaction still seemed like science fiction; computers were for processing punched cards, not for interacting with humans. The impact computers had on society and science was difficult to imagine, and it went in both directions: computers and humans coevolve.

The first computer was a group of human beings: the large-scale division of labor, as evidenced in the English machine-tool factories and in the French government's manufacturing of logarithmic and trigonometric tables for the new decimal system in the 1790s.

Inspired by Adam Smith's praise of the division of labor, the French engineer Prony organized the project in a hierarchy of tasks. At the top were a handful of first-rank mathematicians who devised the formulas; in the middle, seven or eight persons trained in analysis; and at the bottom, 70 or 80 unskilled persons who performed millions of additions and subtractions. Once it was shown that elaborate calculations could be carried out by an assemblage of unskilled workers, each knowing very little about the larger computation, rather than by a genius such as Gauss, Charles Babbage was able to conceive of replacing these workers with machinery.

Babbage, an enthusiastic "factory tourist," explicitly referred to this division of mental labor as the inspiration for his mechanical computer, using terms from the textile industry, such as 'mill' and 'store' to describe its parts. Similarly, he borrowed the use of punched cards from the Jacquard loom, the programmable weaving machines that used removable cards to weave different patterns. Thus, initially there was a new social system of work, and the computer was created in its image.
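A hedged aside: Gigerenzer does not spell out the arithmetic, but the standard trick that reduced such table-making to pure addition is the method of finite differences, the same technique Babbage's Difference Engine later mechanized. The short Python sketch below is illustrative only; the function name and the example polynomial are mine, not from the text. A few computed values seed the table (the mathematicians' job at the top of the hierarchy), and every further entry is produced by nothing but addition (the job of the unskilled workers at the bottom).

def difference_table(poly_values, n_more):
    """Extend a table of polynomial values using nothing but addition."""
    # The "mathematicians'" step: derive the columns of finite differences
    # from the seed values (a degree-d polynomial needs d+1 of them).
    diffs = [list(poly_values)]
    while len(diffs[-1]) > 1:
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])
    # Keep only the last entry of each difference order as the running state.
    state = [col[-1] for col in diffs]

    # The "unskilled workers'" step: each new entry is a cascade of simple
    # additions, from the highest-order difference down to the value itself.
    table = list(poly_values)
    for _ in range(n_more):
        for i in range(len(state) - 2, -1, -1):
            state[i] += state[i + 1]
        table.append(state[0])
    return table

# Example: seed with f(x) = x*x + x + 41 for x = 0..3, then produce six more
# entries by addition alone; the output matches f(x) for x = 0..9.
seed = [x * x + x + 41 for x in range(4)]
print(difference_table(seed, 6))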

Through dramatic improvements in hardware and speed, the computer became the basis for a fresh understanding of the human mind. Herbert Simon and Allen Newell proposed that human thought and problem solving were to be understood as a hierarchical organization of processes, with subroutines, stores, and intermediate goal states that decomposed a complex problem into simple tasks.

In fact, a social system rather than a computer performed the trial run for the Logic Theorist, their first computer program. Simon's wife, children, and graduate students were assembled in a room, and each of them became a subroutine of the program, handling and storing information. It was the same in the Manhattan Project, where calculations were done by an unskilled workforce of mostly women, at low pay.

Similarly, Marvin Minsky, one of the founders of artificial intelligence, regarded the mind as a society of dumb agents, collectively creating true intelligence. Anthropologists, likewise, have begun to use computer analogies to understand how social groups make decisions "in the wild," such as how the crew of a large ship solves the problem of navigation by storing, processing, and exchanging information. The direction of the analogy thus eventually became reversed: Originally, the computer was modeled on a new social system of work; now social systems of work are modeled on the computer.

We might think of mentality and technology as two sides of the same coin, as a system in which knowledge, skills, and values are distributed. This requires a new type of psychology that goes beyond the individual and studies the dynamics of human adaptation to the very tools humans create.


VIRGINIA HEFFERNAN
Columnist ("The Medium"), The New York Times

The metaphor that seems most alive to me in Frank Schirrmacher's disquisition is one of eating. On the one hand, the title of the interview — "The Age of the Informavore" — suggests a model of man as an eater of information. On the other, Schirrmacher speaks provocatively of information that battens on human attention (and dies when starved of it); of information, in other words, that eats us. This two-way model of consumption in the Internet age — we consume information, information consumes us — ought to be kept before us, lest we repress it and be made anxious that way.

Because — right? — there is a great deal of anxiety, irritation, unease and impatience in Internet use. There is even some self-loathing. What am I doing on the Web—when I used to read books bound in Moroccan leather; stroll in the sunshine; spend hours in focused contemplation of Hegel or Coleridge?

If the Internet is a massive work of art, as I believe it is, it has modernist properties: it regularly promotes a feeling of unease and inadequacy (rather than jubilation, satisfaction, smugness, serenity, etc). As Schirrmacher's interview suggests, perhaps this is because the Internet user feels as though he is forever trying to eat or be eaten, and he's both undernourished and afraid.

A critic of the Internet attuned to its aesthetic properties might ask: How does it generate this effect? I'm inclined to believe there's a long and fascinating answer to this question. I'm also inclined to believe that, in time, consumers and producers of the Internet — and we are all both at once — will find ways to leave off apocalyptic thinking and generate and savor the other sensory-emotional effects of the Web.


JESSE DYLAN
Film-Maker; Founder, free-form.tv; Lybba.org

How the human brain must adapt to the modern era and where those changes will take us are a mystery. What knowledge will a person need in the future when information is ubiquitous and all around us? Will predictive technologies do away with free will? Google will be able to predict whether you are enjoying the Neil Young concert you are attending before you yourself know. Science fiction becomes reality.

Schirrmacher speaks about Kafka and Shakespeare reflecting the societies they lived in, and about the importance of artists in translating the computer age.

This lecture is a warning to us to be aware of the forces that shape us. The pace of change in new technologies is so rapid it makes me wonder whether it's already too late.


DOUGLAS RUSHKOFF
Media Analyst; Documentary Writer; Author, Life, Inc.


These are refreshingly disturbing reflections on the digital, from the mind of a caring individual who would hate to see human cognition overrun before its time. As one who once extolled the virtues of the digital to the uninitiated, I can't help but look back and wonder if we adopted certain systems too rapidly and unthinkingly. Or even irreversibly.

But I suspect Schirrmacher and most of us cheering for humanity also get unsettled a bit too easily — drawn into obsessing over the disconnecting possibilities of technology, which makes us no better than an equal and opposite force to the techno-libertarians celebrating the Darwinian wisdom of hive economics. Both extremes of thought and prediction are a symptom of thinking too little rather than too much about all this.

This is why Schirrmacher's thinking is, at its heart, a call to do more thinking — the kind of real reflection that happens inside and among human brains relating to one another in small groups, however elitist that may sound to the technomob. ("Any small group will do" is the answer to their objections, of course. Freedom means freedom to choose your fellow conversants, and not everything needs to be posted for the entire world with "comments on" and "copyright off".)

It's the inability to draw these boundaries and distinctions — or the political incorrectness of suggesting the possibility — that paints us into corners, and prevents meaningful discussion. And I believe it's this meaning we are most in danger of losing.

I would argue we humans are not informavores at all, but rather consumers of meaning. My computer can digest and parse more information than I ever will, but I dare it to contend with the meaning. Meaning is not trivial, even though we have not yet found metrics capable of representing it. This does not mean it does not exist, or shouldn't.

Faced with a networked future that seems to favor the distracted over the focused and the automatic over the considered, it's no wonder we should want to press the pause button and ask what this all means to the future of our species. And while the questions this inquiry raises may be similar in shape to those facing humans passing through other great technological shifts, I think they are fundamentally different this time around.

For instance, the unease in pondering what it might mean to have some of our thinking done out of body by an external device is in some ways just a computer-era version of the challenges to "proprioception" posed by industrial machinery. "Where does my body or hand really end?" becomes "What are the boundaries of my cognition?"

But while machines replaced and usurped the value of human labor, computers do more than usurp the value of human thought. They not only copy our intellectual processes — our repeatable programs — but they discourage our more complex processes — our higher order cognition, contemplation, innovation, and meaning making that should be the reward of "outsourcing" our arithmetics to silicon chips.

The way to get on top of all this, of course, would be to have some inkling of how these "thinking" devices were programmed — or even to have some input into the way they do so. Unlike with our calculators, we don't even know what we are asking our machines to do, much less how they are going to go about doing it. Every Google search is — at least for most of us — a Hail Mary pass into the datasphere, requesting something from an opaque black box.

So we continue to build and accept new technologies into our lives with little or no understanding of how these devices have been programmed. We do not know how to program our computers. We spend much more time and energy trying to figure out how to program one another, instead. And this is potentially a grave mistake.


NICHOLAS CARR
Author, Does IT Matter?; The Big Switch

The digital computer, Alan Turing told us, is a universal machine. We are now learning that, because all types of information can be translated into binary code and computed, it is also a universal medium. Convenient, cheap, and ubiquitous, the great shared computer that is the Internet is rapidly absorbing all our other media. It's like a great sponge, sucking up books, newspapers, magazines, TV and radio shows, movies, letters, telephone calls, even face-to-face conversations. With Google Wave, the words typed by your disembodied correspondent appear on your screen as they're typed, in real time.

As Frank Schirrmacher eloquently and searchingly explains, this is the new environment in which our brains exist, and of course our brains are adapting to that environment — just as, earlier, they adapted to the environment of the alphabet and the environment of print. As the Net lavishes us with more data than our minds can handle, Schirrmacher suggests, we will experience a new kind of natural selection of information and ideas, even at the most intimate, everyday level: "what is important, what is not important, what is important to know?" We may not pause to ask those questions, but we are answering them all the time.

I expect, as well, that this kind of competition, playing out in overtaxed, multitasking, perpetually distracted brains, will alter the very forms of information, and of media, that come to dominate and shape culture. Thoughts and ideas will need to be compressed if they're to survive in the new environment. Ambiguity and complexity, expansiveness of argument and narrative, will be winnowed out. We may find ourselves in the age of intellectual bittiness, which would certainly suit the computers we rely on. The metaphor of brain-as-computer becomes a self-fulfilling prophecy: To keep up with our computers, we have to think like our computers.

"Importance is individualism," says Nick Bilton, reassuringly. We'll create and consume whatever information makes us happy, fulfills us, and leave the rest by the wayside. Maybe. Or maybe we'll school like fish in the Web's algorithmic currents, little Nemos, each of us convinced we're going our own way because, well, we never stop talking, never stop sharing the minutiae of our lives and thoughts. Look at me! Am I not an individual? Even if Bilton is correct, another question needs to be asked: does the individualism promoted by the Net's unique mode of information dispersal deepen and expand the self or leave it shallower and narrower? We've been online for twenty years. What have we accomplished, in artistic, literary, cultural terms? Yes, as Schirrmacher points out, we have "catharsis" — but to what end?

Resistance is not futile, says Jaron Lanier. That's certainly true for each of us as individuals. I'm not so sure it's true for all of us as a society. If we're turning into informavores, it's probably because we want to.


NICK BILTON
Adjunct Professor, NYU/ITP; Design Integration Editor, The New York Times


I am utterly perplexed by intelligent and innovative thinkers who believe a connected world is a negative one. How can we lambast new technology, transition and innovation? It's completely beyond my comprehension.

It is not our fear of information overload that stalls our egos; it's the fear that we might be missing something. Seeing the spread of social applications online over the past few years, I can definitively point to one clear post-internet generational divide.

The new generation, born connected, does not feel the need to consume all the information available at their fingertips. They consume what they want and then affect or change it, they add to it or negate it, they share it and then swiftly move along the path. They rely on their community, their swarm, to filter and share information and in turn they do the same; it's a communism of content. True ideology at its best. They, or should I say I, feel the same comfort from a pack of informavores rummaging together through the ever-growing pile of information that the analog generation still feels toward an edited newspaper or the neatly packaged one-hour nightly news show.

Frank Schirrmacher asks the question "what is important, what is not important, what is important to know?" The answer is clear, and for the first time in our existence the internet and technology will allow it: importance is individualism. What is important to me is not important to you, and vice versa. And individualism is the epitome of free will. Free will is not a prediction engine, it's not an algorithm on Google or Amazon; it's the ability to share your thoughts and your stories with whomever wants to consume them, and in turn for you to consume theirs. What is important is our ability to discuss and present our views and to listen to the thoughts of others.

Every moment of our day revolves around the idea of telling stories. So why should a select group of people in the world be the only ones with a soapbox or the keys to the printing press to tell their stories? Let everyone share their information, build their communities, and contribute to the conversation. I truly believe that most in society have only talked about Britney Spears and Ashton Kutcher because they were only spoken to in the past, not listened to, not allowed to be a part of the conversation. Of course they threw their hands in the air and walked away. Now they are finally coming back to the discussion.

As someone born on the cusp of the digital transition, I can see both sides of the argument but I can definitively assure you that tomorrow is much better than yesterday. I am always on, always connected, always augmenting every single moment of my analog life and yet I am still capable of thinking or contemplating any number of existential questions. My brain works a little differently and the next generation's brains will work a little differently still. We shouldn't assume this is a bad thing. I for one hold a tremendous amount of excitement and optimism about how we will create and consume in the future. It's just the natural evolution of storytelling and information.


JARON LANIER
Musician, Computer Scientist; Pioneer of Virtual Reality

It is urgent to find a way to express a softer, warmer form of digital modernity than the dominant one Schirrmacher correctly perceives and vividly portrays.  The Internet was made up by people and stuffed with information by people, and there is no more information in it than was put in it.  That information has no meaning, or existence as information in the vernacular sense, except as it can be understood by an individual someday.  If Free Will is an illusion, then the Internet is doubly an illusion.

To continue to perceive almost supernatural powers in the Internet (an ascendant perception, as Schirrmacher accurately reports) is to cede the future to reactive religious fanatics.  Here is why:

The ideas Schirrmacher distills include the notion that free will is an illusion, while the Internet is driven by powers that are beyond any of us; essentially that we don't have free will but the Internet does.  If the message of modernity is "people don't exist, but computers do," then expect modernity to be rejected by most people.  Those who currently like this formulation are the ones who think they will be the beneficiaries of it: the geeky, technical, educated elite.  But they are kidding themselves.

Partisan passions and the "open" anonymous vision of the Internet promoted by the Pirates are so complementary, it's as if they were invented for each other.  The Pirates will only thrive briefly before they have super-empowered more fanatical groups.  

If the new world brought about by digital technologies is to enhance Darwinian effects in human affairs, then digital culture will devour itself, becoming an ouroboros that will tighten into a black hole and evaporate. Unless, that is, the Pirates can become immortal through technology before it is too late, before their numbers are overtaken, for instance, by the high birth rates of retro religious fanatics everywhere.  This race for immortality is not so hidden in the literature of digital culture.  The digital culture expressed by the Pirates is simultaneously nihilist and maniacal/egocentric.

My one plea to Schirrmacher is to shed the tone of inevitability.  It is absolutely worth resisting the trend he identifies.


GEORGE DYSON
Science Historian; Author, Darwin Among the Machines

Nine years after his Wake Up Call for European Tech, issued just as the Informavores sat down to eat, Frank Schirrmacher is back, reminding us of the tendency to fall asleep after a heavy meal. All digital all the time may be too much of a good thing. Can we survive the deluge?

I see hope on the horizon. Analog computing! For real. The last we saw of analog computing, we were trying to get differential analyzers to solve problems that can be solved much more accurately, and much faster, digitally. Analog computing is as extinct as your grandfather's slide rule! Nonetheless, many things can be done better by analog computing than by digital computing, and analog is making a return.

Some of the most successful recent developments — Google, Facebook, Twitter, not to mention the Web as a whole — are effectively operating as large analog computers, although there remains a digital substrate underneath. They are solving difficult, ambiguous, real-world problems — Are you really my friend? What's important? What does your question mean? — through analog computation, and getting better and better at it, adaptation (and tolerance for noise and ambiguity) being one of analog computing's strong suits.

When you are an informavore drowning in digital data, analog looks good.


DANIEL KAHNEMAN
Eugene Higgins Professor of Psychology, Princeton; Recipient, 2002 Nobel Prize in Economic Sciences


Very interesting interview, which is itself a nice example of what Schirrmacher is talking about: it should be read very quickly, to get a vague sense of unease, of possibilities, of permeable boundaries between self and others, between one's thoughts and those you get from others. You do get something out of it, and may find yourself thinking slightly differently because of it.

The interview vividly expresses the sense many of us are getting that when we are bathed in information (it is not really snippets of information, we need the metaphor of living in a liquid that is constantly changing in flavor and feel) we no longer know precisely what we have learned, nor do we know where our thoughts come from, or indeed whether the thoughts are our own or absorbed from the bath. The link with Bargh is also interesting, because John pushes the idea that we are driven from the outside and controlled by a multitude of cues of which we are only vaguely aware — we are bathing in primes.

Will all this change what it is like to be human? Will it change what consciousness is like? There must be people out there who study teenagers who have lived in this environment all their lives, and they should be the ones to tell us. The only teenagers I know well are my grandchildren, and that is not enough of a sample. They use computers a lot, but it has not made them very different. Of course they read much less, and they have a sense of how knowledge is organized that I can only envy — I keep being frustrated by how much better young people are at the task of searching.

Schirrmacher feels that the loss of the notion of free will may be dangerous, especially in Germany — I have a vague sense of what he is saying — perhaps this is a return to the old idea that psychoanalysis was causal in loosening the hold of morality. There really is a lot of stuff there.


On ARE THE DISCIPLINARY BOUNDARIES PERMEABLE? (STUTTGARTER ZEITUNG)
By Gábor Paál

GÁBOR PAÁL
German Radio Journalist; Author; Founder of the Network on Science and the Media

Response to Michael Naumann's comment

When Hegel wrote about Realphilosophie he was not being historical — his examples came from astronomy and biology. But in any case, a revival of "Realphilosophie" does not at all mean postulating a revival of Hegel and his other ideas. It is not a matter of seeking proximity to any particular person but to a very specific concept, and of filling it, of course, with modern content.




TERCERA CULTURA — CHILE [Google Translation page]
A podcast popularizing contemporary Cognitive Science

Who Are We?

Third Culture was born as a podcast in August 2009. Our idea was to spread the extraordinary findings, illuminations and epiphanies that we had throughout this decade in our studies of the science of the mind.

Coming from the Faculty of Philosophy and Humanities at the University of Chile, we had the experience of being somewhat rare beasts: interested in science in a humanistic environment. We found, in the concept of the Third Culture (developed by C.P. Snow in the late fifties and promoted by John Brockman in the nineties), a space where we could move easily and, at the same time, share our experience with students and our academic colleagues. ...

...We believe we can build a community around the issues of the mind, not only among specialists in the six founding disciplines (going by the Sloan Foundation's hexagon from the seventies): Artificial Intelligence, Neuroscience, Philosophy, Psychology, Linguistics and Anthropology, but also among those who come from the humanities, which, as people like Jonah Lehrer or Ian Richardson have said, have been turning over the problem of the mind since time immemorial.

We know that to others this can be seen as a kind of intellectual "sensationalism," or syncretism, or even as accommodationism: we believe that this is one of the greatest dangers. We also know that the third culture can be seen as "selling out to the system" in the humanities, which are dominated by an epistemological pessimism that puts no trust in scientific research. Finally, we know that along that same line of reasoning, the third culture can be seen as an unconditional surrender to the dominant ideas of the traditional right, the market, and so on. To put it bluntly: we are people with leftist values, but we are not of the guerrilla left ... we are of the Darwinian left (... that is, at bottom, we are only interested in sex).

The page/blog terceracultura.cl is our third step in the dissemination of the Third Culture in Chile, and in this space we will link to programs, post more extensive blog entries, discuss recent articles, open the door to debate, and establish links elsewhere. We hope for as much contact as possible.

[...]

[ED. NOTE: A new podcast website from Chile on The Third Culture, with entries about Daniel Gilbert, Steven Pinker, Daniel Dennett, Leda Cosmides, John Tooby, Guns, Germs, and Steel, Darwin in Chile, among others. — JB]


Beyond Edge


Jerry Adler responds to Dinesh D'Souza's Life After Death: The Evidence, with a moving essay on the death of his son. Newsweek [...]

Alison Gopnik says babies can answer philosophical questions on Colbert [...]

Jerry Coyne on the debate that won't die [...]

Stewart Brand makes the case for nuclear power. Jim Witkin, New York Times [...]

Scott Atran: "A Memory of Lévi-Strauss" on CognitionandCulture.net [...]




