
BRUCE HOOD
Director of the Bristol Cognitive Development Centre in the Experimental Psychology Department at the University of Bristol; Author, Supersense

I CAN MAKE A DIFFERENCE BECAUSE OF THE INTERNET

Who has not Googled themselves? Most humans have a concept of self that is constructed in terms of how we think we are perceived by those around us, and the Internet has made that preoccupation trivially easy to indulge. Now anyone can assess their impact factor through a multitude of platforms, including Facebook, Twitter and, of course, blogging.

Last year, at the request of my publisher, I started a blog to comment on weird and bizarre examples of supernatural thinking from around the world. From the outset I thought that blogging was a self-indulgent activity, but I agreed to give it a whirl to help promote my book. In spite of my initial reluctance, I very soon became addicted to feedback. It was not enough to post entries for some unseen audience. I needed validation from visitors that my efforts and opinions were appreciated. Within weeks, I had become a numbers junkie looking for more and more hits.

However, the Internet has also made me aware of my own insignificance and power at the same time. Within the blogosphere, I am no longer an expert on any opinion, as mine is one that can be shared or rejected by a multitude of others. But insignificant individuals can make a significant difference when they coalesce around a cause. As this goes to press, a British company is under public scrutiny for allegedly selling bogus bomb-detecting dowsing rods to the Iraqi security forces. This has come about because of a blog campaign by like-minded skeptics who have used the Internet to draw attention to what they consider to be questionable business activity. Mounting such a campaign would have been very difficult and daunting in the pre-Internet days, and not something the ordinary man in the street would have taken on. In this way, the Internet can empower the individual through collective campaigns.

I can make a difference because of the Internet. I'll be checking back on Google to see if anyone shares my opinion.


LERA BORODITSKY
Assistant Professor of Psychology, Stanford University

HOW I THINK ABOUT HOW I THINK

Consider a much earlier piece of technology than the Internet: a fork. When I take a fork (or any tool) in my hand, the multi-modal neurons in my brain that track the position of my hand immediately expand their receptive fields. They start to keep track of a larger part of space, expanding their view to include perhaps that succulent morsel of lamb that is now within my fork's reach. My brain absorbs the tool in my hand into the very representation of my physical self; the fork is now, in an important neural sense, a part of my body. (In case absorbing a fork into your sense of self seems strange, it may help to note that this phenomenon was discovered by a former dentist who ingeniously trained rhesus monkeys to search for food with tools suspiciously resembling dental endoscopes.) If grabbing a humble fork can expand my neurons' receptive fields, imagine what happens when I grab a mouse and open a web browser. Should I be worried about the size of my receptive fields?

Indeed, research in the last decade has shown that our brains change, grow, and adapt dramatically as we engage with the world in new ways. London taxi drivers grow larger hippocampi (a part of the brain heavily involved in navigation) as they gain "the knowledge" maneuvering through the maze of London streets. Playing video games dramatically improves people's spatial attention and object-tracking abilities, giving a regular schmoe the attentional skills of a fighter pilot. At this rate, we'll be lucky if the list of basic drives controlled by the hypothalamus – the famous four Fs of fighting, fleeing, feeding, and how's your father – doesn't soon need to be augmented with a fifth, for Facebook. This, by the way, is the reason I give for not joining social networking sites – my hypothalamus has more important business to attend to, thanks!

To be honest, my favorite human technologies are the ones we no longer even notice as technologies: they just seem like natural extensions of our minds. Numbers are one such example: a human-invented tool that, once learned, has incredible productive power in the mind. Writing is another. It no longer seems magical in the literate world that one could communicate a complex set of thoughts silently across vast reaches of time and space using only a cocktail napkin and some strategically applied stains. Yet being able to write things down, draw diagrams, and otherwise externalize the contents of our minds into some stable format has drastically augmented our cognitive and communicative abilities. By far the most amazing technological marvels that humans ever created, and what I spend most of my time thinking about, are the languages we speak. Now there's an immensely complex tool that really changed things for us humans. You think keeping up a correspondence with friends was hard before email? Well, you should have tried it before language! Importantly, the particulars of the languages we speak have shaped not only how we communicate our thoughts, but the very nature of the thoughts we form to begin with.

There are of course facile or insipid ways of construing the nature of human thought such that "how I think" isn't and can't be changed by technology. For example, I could define the basic mechanisms of thought as "neurons fire, at different times some more than others, and that is how I think." Well alright, that is technically true, and the Internet is not changing that. But on any more interesting or useful construal of human thought, technology has been shaping us for as long as we've been making it.

More than shaping how I think, the Internet is even shaping how I think about how I think. Scholars interested in the nature of mind have long relied on technology as a source of metaphors for explaining how the mind works. First the mind was a clay tablet, then an abacus, a calculator, a telephone switchboard, a computer, a network. These days, new tools continue to provide convenient (perhaps in the 7-11 sense of convenient, as in nearby but ultimately unsatisfying) metaphors for explaining the mind. Consciousness, for example, is not unlike Twitter: millions of mundane messages bouncing around, all shouting over each other, with only a few rising as trending topics. Take that, Dan Dennett! Consciousness explained in 140 characters or less!


RALPH GIBSON
Art Photographer

"TIME" AND AGING

I believe that the history of time has been shaped by several enormous inventions. First was the watch, which unified man's measurement of time. It is interesting to note that China was the last country to join the rest of the world in embracing the clock. It was Chairman Mao who brought in this drastic change, among others.

The invention of photography created several concrete displacements of our perception of the past. The world was quick to accept the photograph as a forcible document containing absolute evidence. This concept endured until sometime in the 1950s when the photograph was no longer accepted in courts of law.

From my point of view, the next great watershed to influence our perception of time has been the arrival of the Internet. I know that it certainly speeds things up, etc., but beyond this obvious fact there seems to be much more to it as an experience. I believe that there is a metaphysical element that surely the mystics could define. But for me the most blatant phenomenon is that my life has compressed, to the extent that I am not only aging in the conventional sense but also not aging, because rather than losing information with the passing of "time" I am in fact accruing more and more information.

Being a photographer for over 50 years has instilled in me an innate suspicion of cyberspace, but this superstition/suspicion does not interfere with my use of the Internet as a system of communication and research. I remain indifferent to the entire event of place as it is experienced by young arrivals to the planet, who find the most concrete forms of reality floating upon the surface of their computer displays.

I am not a Luddite per se; in fact, I own four or five computers at all times, but I prefer to use the machine for accessing the Net and for book layout. The idea of an Internet without some form of computer device is, for the time being, out of reach. Thus the Internet and the computer are married in some ethereal place, as yet undefined.

As an amateur musician, I find the Internet linked in time with the nature of music itself. I imagine the sound compressed and sent through space, only to be uncompressed and sent back into space at a different waveform frequency… music… I can hear it now.


KARL SABBAGH
Writer and Television Producer; Author, The Riemann Hypothesis

When the British playwright Harold Pinter developed cancer of the oesophagus, his wife, Lady Antonia Fraser, discovered from the Internet that there was a 92% mortality rate. "If you have cancer, don't go on the Internet," she said in an interview published by The Sunday Times in January 2010.

This set me thinking about my own interactions with the Internet, and how they might differ fundamentally from using any other sources of information.

Lady Antonia could, I suppose, have said, "If you have cancer, don't look at the Merck Manual," or some other medical guide, but there must be more to it than that. It is, first of all, the effortlessness with which it can be used. I used to joke that if I had a query which could be answered by consulting a book on the shelves on the other side of my study or by using the Internet, it would be quicker and less energy-consuming to find the answer on the Internet. It's not even funny any more, because it's obviously the most efficient way to do things. I am one of the few people who seem to trust Wikipedia. Its science entries, in particular, are extremely thorough, reliable and well-sourced. People who trust books (two or more years out of date) rather than Wikipedia are like people who balk at buying on the Internet for security reasons but happily pay with a credit card in restaurants, where an unscrupulous waiter could keep the carbon copy of the slip and run up huge bills before they knew it.

Lady Antonia Fraser's remark was really a tribute to the reliability and comprehensiveness of the Internet. It wasn't so much that she came across a pessimistic forecast of Harold's prognosis, more that it was probably a reliable pessimistic forecast, based on up-to-date information. It doesn't of course mean that it was accurate. She may not have consulted all cancer sites, or it may be that no one really knows for sure what the prognosis was for oesophageal cancer. But she assumed — and I assume myself when using the Internet — that with a little skill and judgment you can get more reliable information there than anywhere else.

This, of course, has nothing to do with thinking. It could be that I would think the same if I'd been writing my books with a quill pen and had only the Bible, Shakespeare and Dr. Johnson's Dictionary to consult. But the Internet certainly constrains what I think about. It stops me thinking any more about that great idea for a book that I now find was published a few years ago by a small university press in Montana.

It also reinforces my belief in my own ideas and opinions because it is now much quicker to test them, particularly when they are new opinions. By permitting anyone to publish anything, the Internet allows me to read the whole range of views on a topic, and to infer from the language used the reasonableness or otherwise of those views. Of course, I was inclined to disbelieve in Intelligent Design before I had access to the wide range of wacky and hysterical Websites that promote it. But now I have no doubts at all that the theory is tosh (slang, chiefly Brit.: nonsense, rubbish — The Free Dictionary).

But this is still not to do with thinking. What do I do all day, sitting at my computer? I string words together, reread them, judge them, improve them if necessary and print them out or send them to people. And underlying this process is a judgement about what is interesting, novel or in need of explanation, and the juggling of words in my mind to express these concepts in a clear way. None of that, as far as I am aware, has changed because of the Internet.

But this is to deal with only one aspect of the Internet, its provision of factual content. There is also email and attachments and blogs and software downloads and YouTube and Facebook and Internet shopping and banking and weather forecasts and Google Maps and and and…. But before all this, I knew there were lots of people in the world, capable of using language and saying clever or stupid things. Now I have access to them in a way I didn't before, but again this is just information provision rather than a change in ways of thinking.

Perhaps the crucial factor is speed. If I was setting out to write a book, I would start with a broad outline and a chapter breakdown, and these would lead me to set a series of research tasks which could take months: look in this library, write to this expert, look for this book, find this document. Now the order of things has changed. While I was doing all the above, which could take weeks or months, my general ideas for the book would be evolving. My objectives might change, and my research tasks with them. I would do more 'broad brush' thinking. Now, when documents can be found and downloaded in seconds, library catalogues consulted from one's desk, experts emailed and a reply received within 24 hours, the idea is set in stone much earlier. But even here there is no significant difference in thinking. If, in the course of the research, some document reveals a different angle, the fact that this happens within hours or days rather than months can only be to the good. The broad brush thinking is now informed rather than uninformed.

I give up. The Internet hasn't changed how I think. It's only a tool. An electric drill wouldn't change how many holes I make in a piece of wood; it would only make the hole-drilling easier and quicker. A car doesn't change the nature and purpose of a journey I make to the nearest town; it only makes it quicker and leads to me making more journeys than if I walked.

But what about Lady Antonia Fraser? Is the truth-telling power of the Internet something to avoid? The fact is, the Internet reveals in its full horror the true nature of mankind — its obsessions, the triviality of its interests, its scorn for logic or rationality, its inhumanity, the power of capital, the intolerance of the other. But anyone who says this is news just doesn't get out enough. The Internet magnifies and specifies what we already know about mankind; if we don't know it, we're rather naïve. The only way my thinking would have been changed by this 'revelation' would have been if I believed, along with Dr Pangloss, that all is for the best in the best of all possible worlds. And I don't.


HU FANG
Writer, Co-founder of Vitamin Creative Space in Guangzhou and the shop in Beijing, China

NOTES FROM A FILM DIRECTOR

I am particularly fond of this story: seven men and seven women who do not know one another live together in a glass house for a month. Because their circumstances require that they sever all ties with their previous ways of life, they develop a brand-new dynamic amongst themselves, and as a result this sparks off the fundamental emotions of humankind — love, desire, passion and hatred.

During the first week, their caution with one another is evident. They make tentative attempts at communication, drawing on their past glories and social status to get into the good books of the others. However, all that happens within the glass house is as convincing as empty promises. Gradually, they realise that the sole elements of victory are their own beings and the purity and simplicity of words; it is these that are needed to reveal a "true self" to the other party.

Everything in this transparent and closed space is captured by the camera, and viewers from all over the country (including their own loved ones) are gathered around their television sets, watching their every move with intense interest and whipping out their cell phones to send text messages.

At times, the participants wonder if they should seek help from the director, admit to their personal weaknesses, and then withdraw from the competition. But the lure of millions of dollars in prize money is irresistible (everyone has valid reasons for why they ought to win). They are also constrained by their sense of personal pride, hence no one would allow himself or herself to give up that easily. Some of them endure sleepless nights, and their loved ones — following their struggles as observed by the camera — consequently suffer the same insomnia alongside them. How difficult it is to make the right decision!

As required, each of them has to say a few words via the camera to their loved ones each day; most of the time, these revolve around recollections of the past, realizations about life and confessions when their consciences are pricked. These in turn elicit widespread national tears. When the participants look right into the camera and speak to their loved ones with deep emotion, in actual fact they are gazing at the audience, confiding in them with great sentiment. Time and time again, this experience reiterates to them: what is important is not leaving good impressions on the opposite sex in the glass house, but rather winning the favor of the audience outside the glass house.

The participants' views are indistinct, and when projected beyond the glass house, are akin to messages sent from earth into the dark unknown that is outer space.

Finally, a pair amongst the participants kiss. Their profound love spurs on another pair, unwilling to be left behind, to embrace each other. This incredibly lucid and protracted feature drives their loved ones outside the glass house to resort to smashing up their television sets in a bid to break that endless kiss.

The fragments of the television set are symbolic of the shattering of the glass house. Yet the image of the kissing lovers remains deeply seared into the minds of that man or that woman; it has become an indelible memory in their lives.

In my youth, I dreamed of becoming the director of that "tragicomic reality show". As the participants are wrapped up in their passionate embraces, I would have the shot cut to a series of personal, private spaces, to focus on the despair on the face of that man or woman sitting before the television.


JON KLEINBERG
Professor of Computer Science, Cornell University

THE HUMAN TEXTURE OF INFORMATION

When Rio de Janeiro was announced as the site of the 2016 Summer Olympics, I was on the phone with colleagues, talking about some ideas for how to track breaking news on the Internet. Curious to see how reactions to the announcement were playing out, we went onto the Web to take a look, pushing our way like tourists into the midst of a celebration that was already well underway. The sense that we were surrounded by crowds was not entirely in our imaginations: over a thousand tweets per minute about Rio were appearing on Twitter; Wikipedians were posting continuous updates to their "2016 Summer Olympics" page; and political blogs were filled with active conversations about the lobbying of world leaders on behalf of different cities.

This is the shape that current events take on-line, and there is something more going on here than simple volume. Until recently, information about an event like this would have been disseminated according to a top-down structure, consisting of an editorially assembled sampling of summaries of the official announcement, reports of selected reactions, and stories of crowds gathering at the scene. But now the information emerges bottom-up, converging in tiny pieces from all directions: the crowd itself speaks, in a million distinct voices — a deluge of different perspectives.

The Web hasn't always looked this way. When I first used an Internet search engine in the early 1990s, I imagined myself dipping into a vast, universal library, a museum vault filled with accumulated knowledge. The fact that I shared this museum vault with other visitors was something that I knew in principle, but could not directly perceive — we had the tools to engage with the information but not with one another, and so we all passed invisibly by each other.

When I go on-line today, all those rooms and hallways are teeming, and I can see it. What strikes me is the human texture of the information — the visible conversations, the spikes and bursts of text, the controlled graffiti of tagging and commenting. I've come to appreciate the way the event and the crowd in fact live in symbiosis, each dependent on the other — the people all talking at once about the event, but the event only fully comprehensible as the sum total of the human reaction to it. The construction feels literary in its complexity — a scene as though described by an omniscient narrator, jumping between different points of view, except that here all these voices belong to real, living beings, and there's no master narrative coordinating them. The cacophony might make sense, and it might not.

But the complexity does not just arise from all the human voices — it is accentuated by the fact that the online world is one where human beings and computational creations commingle. You bump into these computational artifacts like strange characters in a Carrollian Wonderland. There is the giant creature who has memorized everything ever written, and will repeat excerpts back to you (mainly out of context) in response to your questions. There are the diaphanous forms, barely visible at the right-hand edge of your field of vision, who listen mutely as you cancel meetings and talk about staying home in bed, and then mysteriously begin slipping you ads for cough medicine and pain relievers. And even more exotic characters are on the way; a whole industry works tirelessly to develop them.

The ads for cough medicine are important, and not just because they're part of what pays for the whole operation. They should continuously remind you that you're part of the giant crowd as well, that everything you do is feeding into a global conversation that is not only visible but recorded. I try to reflect on what behavioral targeting algorithms must think of me — what the mosaic of my actions must look like when everything is taken into account, and which pieces of that mosaic would have been better left off the table.

The complexity of the online world means that when I use the Internet today, even for the most mundane of purposes, I find myself drawing on skills that I first learned in doing research — evaluating many different observations and interpretations of the same events; asking how people's underlying perspectives, tools, and ways of behaving have served to shape their interpretations; and reflecting on my own decisions as part of this process. Think about the cognitive demands this activity involves — once the domain of scholarship, it is now something that the Internet requires from us on a daily basis. It suggests that in addition to "computer literacy," an old pursuit where we teach novices how to use computing technology in a purely operational sense, we need to be conveying the much more complex skill of "information literacy" at very young ages: how to reason about the swirl of perspectives you find when you consume information on-line, how to understand and harness the computational forces that shape this information, and how to reason about the subtle consequences of your own actions on the Internet.

Finally, the Internet has changed how I think professionally, as a computer scientist. In the thirteen years since I finished graduate school, the Internet has steadily and incontrovertibly advanced the argument that computer science is not just about technology but about human beings as well — about the power of human beings to collectively create knowledge and engage in self-expression on a global scale. This has been a thrilling development, and one that points to a new phase in our understanding of what people and technology can accomplish together, and about the world we've grown to jointly inhabit.


ALISON GOPNIK
Psychologist, UC Berkeley; Author, The Philosophical Baby

THE STRANGERS IN THE CRIB

My thinking has certainly been transformed in alarming ways by a relatively recent information technology, but it's not the Internet. I often sit for hours in the grip of this compelling medium, motionless and oblivious, instead of interacting with the people around me. As I walk through the streets I compulsively check out even trivial messages — movie ads, street signs — and I pay more attention to descriptions of the world — museum captions, menus — than to the world itself. I've become incapable of using attention and memory in ways that previous generations took for granted. Yes, I know reading has given me a powerful new source of information. But is it worth the isolation, the damage to dialogue and memorization that Socrates foresaw? Studies show, in fact, that I've become involuntarily compelled to read: I literally can't keep myself from decoding letters. Reading has even reshaped my brain; cortical areas that once were devoted to vision and speech have been hijacked by print. Instead of learning through practice and apprenticeship, I've become dependent on lectures and textbooks. And look at the toll of dyslexia and attention disorders and learning disabilities, all signs that our brains were just not designed to deal with such a profoundly unnatural technology.

Like many others I feel that the Internet has made my experience more fragmented, splintered and discontinuous. But I'd argue that's not because of the Internet itself but because I have mastered the Internet as an adult. Why don't we feel the same way about reading and schooling that we feel about the Web? These changes in the way we get information have had a pervasive and transformative effect on human cognition and thought, and universal literacy and education have only been around for a hundred years or so.

It's because human change takes place across generations, rather than within a single life. This is built into the very nature of the developing mind and brain. All the authors of these essays learned how to use the Web with brains that were fully developed long before we sent our first e-mail, and all of us learned to read with the open and flexible brains we had when we were children. As a result, no one living now will experience the digital world in the spontaneous and unselfconscious way that the children of 2010 will experience it, or in the spontaneous and unselfconscious way we experience print.

There is a profound difference between the way children and adults learn. Young brains are capable of much more extensive change — more rewiring — than the brains of adults. This difference between old brains and young ones is the engine of technological and cultural innovation. Human adults, more than any other animal, reshape the world around them. But adults innovate slowly, intentionally, and consciously. The changes that take place within an adult life, like the development of the Internet, are disruptive, attention-getting, disturbing or exciting. But those changes become second nature to the next generation of children. Those young brains painlessly absorb the world their parents created, and that world takes on a glow of timelessness and eternity, even if it was only created the day before you were born.

My experience of the Web feels fragmented, discontinuous, effortful (and interesting!) because, for adults, learning a new technology depends on conscious, attentive, intentional processing. In adults, this kind of conscious attention is a very limited resource. This is true even at the neural level. When we pay attention to something, the prefrontal cortex, the part of our brain responsible for conscious goal-directed planning, controls the release of cholinergic transmitters, chemicals that help us learn, to certain very specific parts of the brain. So as we wrestle with a new technology, we adults can only change our minds a little bit at a time.

Attention and learning work very differently in young brains. Young animals have much more widespread cholinergic transmission than adults do, and their ability to learn doesn't depend on planned, deliberate attention. Young brains are designed to learn from everything new, surprising or information-rich, even when it isn't particularly relevant or useful.

So children who grow up with the Web will master it in a way that will feel as whole and natural as reading feels to us. But that doesn't mean that their experience and attention won't be changed by the Internet, any more than my print-soaked twentieth-century life was the same as the life of a barely literate nineteenth-century farmer.

The special attentional strategies that we require for literacy and schooling may feel natural since they are so pervasive, and since we learned them at such an early age. But at different times and places, different ways of deploying attention have been equally valuable and felt equally natural. Children in Mayan Indian cultures, for example, are taught to distribute their attention to several events simultaneously, just as print and school teach us to focus on just one thing at a time. I'll never be able to deploy the broad yet vigilant attention of a hunter-gatherer, though, luckily, a childhood full of practice caregiving let me master the equally ancient art of attending to work and babies at the same time.

Perhaps our digital grandchildren will view a master reader with the same nostalgic awe that we now accord to a master hunter or an even more masterly mother of six. The skills of the hyper-literate twentieth century may well disappear, or at least become highly specialized enthusiasms, like the once universal skills of hunting, poetry and dance. It is sad that after the intimacy of infancy our children inevitably end up being somewhat weird and incomprehensible visitors from the technological future. But the hopeful thought is that my grandchildren will not have the fragmented, distracted, alienated digital experience that I do. For them the Internet will feel as fundamental, as rooted and as timeless as a battered Penguin paperback, that apex of the literate civilization of the last century, feels for me.


JESSE BERING
Psychologist; Director, Institute of Cognition and Culture, Queen's University Belfast; Columnist, Scientific American ("Bering in Mind"); Author, Under God's Skin

A RETURN TO THE SCARLET-LETTER SAVANNAH

Only ten thousand years ago, our Homo sapiens ancestors were still living in close-knit societies about the size of a large lecture hall in a state university. What today might be seen as an embarrassing faux pas back then could have been the end of the line for you. At least, it could have been the end of the line for your reproductive success, since an irreversibly spoiled reputation in such a small group could have meant a surefire death for your genes.

Just imagine the very worst thing you've ever done: the most vile, scandalous and vulgar. Now imagine all the details of this incident tattooed on your forehead. This scenario is much like what our ancestors would have encountered if their impulsive, hedonistic and self-centered drives weren't kept in check by their more recently evolved prudent inhibitions. And this was especially the case, of course, under conditions in which others were watching them, perhaps without them realizing. If their ancient, selfish drives overpowered them, our ancestors couldn't simply up sticks and move to a new town. Rather, since they were more or less completely dependent on those with whom they shared a few hundred kilometers, cutting off all connections wasn't a very viable option. And effectively hiding their identities behind a mantle of anonymity wasn't really doable either, since they couldn't exactly be just a nameless face. The closest our ancestors had to anonymity was the cover of night. Thus, in the ancestral past, being good, being moral, by short-circuiting our species' evolved selfish desires was even more a matter of life and death than it is today. It was a scarlet-letter Savannah.

Yet, curiously, for all its technological sophistication and seeming advances, the Internet has heralded something of a return to this scarlet-letter Savannah environment, and in many ways has brought our species back to its original social roots.

After a long historical period during which people may have been able to emigrate to new social groups and to "start over" if they spoiled their reputations, the present media age more accurately reflects the conditions faced by our ancestors. With newspapers, telephones, cameras, television and especially the Internet at our disposal, personal details about medical problems, spending activities, criminal and financial history and divorce records (to name just a few tidbits potentially costly to our reputations) are not only permanently archived, but can be distributed in microseconds to, literally, millions of other people. With the Internet being an active microcosm of human sociality, the old adage "wherever you go, there you are" takes on new meaning in light of the evolution of information technology. From background checks to matchmaking services, to anonymous Website browsing to piracy and identity theft, from "Googling" others (and ourselves) to flaming bad professors (e.g., www.ratemyprofessor.com) and stingy customers (e.g., www.bitterwaitress.com), the Internet is simply ancient social psychology meeting new information technology.


JARON LANIER
Musician, Computer Scientist; Pioneer of Virtual Reality; Author, You Are Not A Gadget: A Manifesto

THE FLAWS OF THE LATEST POP VERSION OF THE INTERNET HAVE MADE ME MORE OF A BIOLOGICAL REALIST, AND IN PARTICULAR HAVE MADE ME MORE SENSITIVE TO NEOTENY

The Internet as it evolved up to about the turn of the century was a great relief and comfort to me, and influenced my thinking positively in a multitude of ways. There were the long-anticipated quotidian delights of speedy information access and transfer, but also the far more important optimism born from seeing so many people decide to create Web pages and become expressive, proving that the late 20th century's passive society on the couch in front of the TV was only a passing bad dream.

In the last decade, the Internet has taken on unpleasant qualities, and has become gripped by reality-denying ideology.

The current mainstream, dominant culture of the Internet is the descendant of what used to be the radical culture of the early Internet. The ideas are unfortunately motivated to a significant degree by a denial of the biological nature of personhood. The new true believers attempt to conceive of themselves as becoming ever more like abstract immortal information machines, instead of messy, mortal, embodied creatures. This is nothing but yet another approach to an ancient folly; the psychological denial of ageing and dying. To be a biological realist today is to hold a minority opinion during an age of profound, overbearing, technologically-enriched groupthink.

When I was in my twenties, my friends and I were motivated by the eternal frustration of young people that they are not immediately all made rulers of the world. It used to seem supremely annoying to my musician friends, for instance, that the biggest stars, like Michael Jackson, would get millions of dollars in advance for an album, while an obscure, minor artist like me would only get a $100K advance to make one (and this was in early-1990s dollars).

So what to do? Kill the whole damned system! Make music free to share, and demand that everyone build reputation on a genuine all-to-all network instead of a broadcast network, so that it would be fair. Then we'd all go out and perform to make money, and the best musician would win.

The lecture circuit was particularly good to me as a live performer. My lecture career was probably one of the first of its kind that was driven mostly by my online presence. (In the old days, my crappy Web site got enough traffic to merit coverage as an important Web site by the mainstream media like the New York Times.) It seemed as though money was available on tap.

Seemed like a sweet way to run a culture back then, but in the bigger picture, it's been a disaster. Only a tiny, token number of musicians, if any, do as well within the new online utopia as even I used to do in the old world, and I wasn't particularly successful. Every musician I have been able to communicate with about their true situation, including a lot of extremely famous ones, has suffered after the vandalism of my generation, and the reason isn't abstract but because of biology.

What we denied was that we were human and mortal, that we might someday have wanted children, even though it seemed inconceivable at the time. In the human species, neoteny, the extremely slow fading of our juvenile characteristics, has made child rearing into an extreme, draining long-term commitment.

That is the reality. We were all pissed at our own parents for not coming through in some way or other, but evolution has extended the demands of human parenting to the point that it is impossible for parents to come through well enough, ever. Every child must be disappointed to some degree because of neoteny, but economic and social systems can be designed to minimize the frustration. Unfortunately the Internet, as it has come to be, maximizes it.

The way that neoteny relates to the degradation of the Internet is that as a parent, you really can't go running around to play gigs live all the time. The only way for a creative person to live with what we can call dignity is to have some system of intellectual property to provide sustenance while you're out of your mind with fatigue after a rough night with a sick kid.

Or, spouses might be called upon to give up their own aspirations for a career, but there was this other movement called Feminism happening at the same time that made that arrangement less common.

Or, there might be a greater degree of socialism to buffer biological challenges, but there was an intense libertarian tilt coincident with the rise of the Internet in the USA. All the options have been ruled out, and the result is a disjunction between true adulthood and the creative life.

The Internet, in its current fashionable role as an aggregator of people through social networking software, only values humans in real time and in a specific physical place, usually away from their children. The human expressions that used to occupy the golden pyramidion of Maslow's pyramid are treated as worthless in themselves.

But dignity is the opposite of real time. Dignity means, in part, that you don't have to wonder if you'll successfully sing for your supper for every meal. Dignity ought to be something one can earn. I have focused on parenting here, since it is what I am experiencing now, but the principle becomes even more important as people become ill, and then even more as people age. So, for these reasons and many others, the current fashionable design of the Internet, dominated by so-called social networking designs, has an anti-human quality. But very few people I know share my current perspective.

Dignity might also mean being able to resist the near-consensus of your peer group.


KEITH DEVLIN
Executive Director, H-STAR Institute, Stanford University; Author, The Unfinished Game: Pascal, Fermat, and the Seventeenth-Century Letter that Made the World Modern

"IT ALL DEPENDS ON WHAT YOU MEAN BY"

I just googled the exact phrase "It all depends on what you mean by", and our favorite research tool returned 920,000 hits. As a result, my originally intended opening sentence is no longer, "As a mathematician, I always approach a question by first asking exactly what it means, both as a whole and all its constituent terms."

Google tells me that it is not just mathematicians that ask the "meaning" question. To be sure, in many cases (some famous, even infamous) the question seems to be used as a political, legal, or social get-out-of-jail card. Though others use it for more honorable purposes, I suspect that only mathematicians are quite literally unable to do anything until they have answered the question to their satisfaction. Indeed, much of the history of mathematics amounts to successive re-clarification and re-specification of terms.

In the case of this year's Edge question, the key phrase is surely "the way you think," and the key word therein is "think."

No one can contribute to an online discussion forum like this without thereby demonstrating that the Internet has changed and continues to change the way we work.

The Internet also changes the way we make decisions. I now choose my flights on the basis of a lot more information than any one air carrier would like me to have (except perhaps for Southwest, who currently benefit from the Internet decision process), and I select hotels based on reviews by other customers, which I temper by a judgment based (somewhat dubiously, I admit) on their use of language as to whether they are sufficiently "like me" for their views to be relevant to me.

But is that really a change in the way I think? I don't think so. In fact, we Edge contributors are probably a highly atypical social grouping to answer this question, since we have all been trained over many years to think in certain analytic ways. In particular, we habitually begin by gathering information, questioning that information and our assumptions, looking at (some) alternatives, and basing our conclusions on the evidence before us.

We are also used to having our conclusions held up to public scrutiny by our peers. Which of course is why it is rare (though intriguingly, and I think all to the good, not totally impossible) to find trained scientists who believe in Biblical Creationism or who doubt that Global Warming is a real and dangerous phenomenon.

When I reflect on how I go about my intellectual work these days, the Internet has changed it dramatically, but what has changed is the execution process (and hence, on some occasions, the conclusions I reach or the way I present them), not the underlying thinking process.

I would hope for Humanity's future that the same is true for all my fellow highly trained specialists. The scientific method for reaching conclusions has served us well for many generations, leading to a length and quality of life for most of us that was beyond the imagination of our ancestors. If that way of thinking were to be replaced by the blind "wisdom of the crowd" approach the Internet offers, then we are likely in for real trouble. For wisdom of the crowd, like its best-known exemplar, Google search, gives you the mostly-best answer most of the time.

As a result of those two "mostly"s, wisdom of the crowd without questioning, though fine for booking flights or selecting hotels, can be potentially dangerous, even when restricted to experts. To give one example, not many decades ago, wisdom of the crowd among the scientific community told us that Plate Tectonics was nonsense; now it is the accepted theory.

The good thing about the analytic method, of course, is that once there was sufficient evidence in support of Plate Tectonics, the entire scientific community switched from virtual dismissal to total acceptance.

That example alone explains why I think it is good that a few well-informed (this condition is important) individuals question both global warming and evolution by natural selection. Our conclusions need to be constantly questioned. I remain open to having my mind changed on either. But to make that change, I require convincing evidence rather than blind faith or discomfort with the conclusions, evidence that is so far totally lacking. In the meantime, I will continue to accept both theories.

The real "Edge question" for me, is one that is only implied by the question as stated: Does the Internet change the way of thinking for those people born in the Internet age — the so-called Digital Natives? Only time can really answer that.

Living organisms adapt and the brain is a highly plastic organ, so it strikes me as not impossible that the answer to this modified question may be yes. On the other hand, recent research by my Stanford colleague Cliff Nass (and others) suggests that there are limitations to the degree to which the digital environment can change our thinking.

An even more intriguing question is whether the Internet is leading society as a whole (at least those who are on the Net) to constitute an emergent global mode of thinking. By most practical definitions of "thinking" I can come up with, distinguishing it from emotions and self-reflective consciousness, the answer seems to be "yes". And that development will surely change our future in ways we can only begin to imagine.


DANIEL HAUN
Director, the Research Group for Comparative Cognitive Anthropology, the Max Planck Institute for Evolutionary Anthropology

REPETITION, NOT TRUTH

I was born in 1977, or 15 b.I. if you like. That is, if you take the 1992 version of the Internet to be the real thing. Anyway, I don't really remember being without it. When I first looked up, emerging out of the dark, quickly forgotten days of a sinister puberty, it was already there. Waiting for me. So it seems to me it hasn't changed the way I think. Not in a before-after fashion, anyway. But even if you are reading these lines through grey, long, uncontrollable eyebrow hair, let me reassure you, it hasn't changed the way you think either. Of course it has changed the content of your thinking. Not just through the formidable availability of the information you seek, but most importantly through the information you don't. But from what little I understand about human thought, I don't think the Internet has changed the way you think. Its architecture has not changed yours.

Let me try and give you an example of the way people think. The way you think. I have already told you three times that the Internet hasn't changed the way you think (4 and counting) and every time you are reading it, my statement becomes more believable to you. Psychologists have reported the human tendency to mistake repetition for truth for more than sixty years. This is called the "illusion of truth effect". You believe to be true what you hear often. The same applies to whatever comes to mind first or most easily.

People, including you, believe the examples they can think of right away to be the most representative, and therefore indicative of the truth. This is called the "availability heuristic". Let me give you a famous example. In English, what's the relative proportion of words that start with the letter K versus words that have K in the third position? The reason most people believe the former to be more common than the latter is that they can easily remember a lot of words that start with a K, but few that have a K in the third position. The truth, in fact, is that there are three times more words with K in the third position than in the first. Now if you don't believe people really do this, maybe because you don't, you just proved my point. Availability creates the illusion of truth. Repetition creates the illusion of truth. I would repeat that, but you get my point.
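If you'd rather not take my word for the letter-K claim, it is easy to check against a word list with a few lines of code. Here is a minimal Python sketch; the path /usr/share/dict/words is an assumption (substitute any word list you have), and the exact ratio you get will depend on the list you use.

```python
# Count words with K as the first letter versus K as the third letter.
# The path below is an assumption; substitute any word list you have.
def k_position_counts(path="/usr/share/dict/words"):
    first = third = 0
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if len(word) < 3:
                continue  # skip words too short to have a third letter
            if word[0] == "k":
                first += 1
            if word[2] == "k":
                third += 1
    return first, third

first, third = k_position_counts()
print(f"K first: {first}, K third: {third}, third/first: {third / first:.2f}")
```

The reason the question feels easy to answer from memory is that memory retrieves words by their first letter far more readily than by their third; a count settles what intuition cannot.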

Let's reconsider the Internet. How do you find the truth on the Internet? You use a search engine. Search engines evidently have very complicated ways to determine which pages will be most relevant to your personal quest for the truth. But in a nutshell, a page's relevance is determined by how many other relevant pages link to it. Repetition, not truth. Your search engine will then present a set of ranked pages to you, determining availability. Repetition determines availability, and both together the illusion of truth. Hence, the Internet does just what you would do. It isn't changing the structure of your thinking, because it resembles it. It isn't changing the structure of your thinking, because it resembles it. 
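That link-counting recipe is, at its core, the idea behind PageRank: a page's score is fed, recursively, by the scores of the pages that link to it. Here is a minimal sketch of that recursion; the toy graph, damping factor and iteration count are my own illustrative assumptions, not a description of how any production search engine works.

```python
# A toy sketch of link-based ranking: each page's score is built up
# from shares of the scores of the pages linking to it.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # a page with no links spreads its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# "c" ends up ranked highest simply because more pages link to it.
toy_web = {"a": ["c"], "b": ["c"], "c": ["a"]}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Notice what the algorithm never consults: whether anything on page "c" is true. Repetition of links, not truth, is what the score measures, which is exactly the point.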

