How Has The Internet Changed The Way You Think?
JUDITH RICH HARRIS
Independent Investigator and Theoretician; Author, No Two Alike: Human Nature and Human Individuality
THE JOY OF JUST-ENOUGHNESS
The Internet dispenses information the way a ketchup bottle dispenses ketchup. At first there was too little; now there is too much.
In between, there was a halcyon interval of just-enoughness. For me, it lasted about ten years.
They were the best years of my life.
Author, Archimedes to Hawking
THE RISE OF INTERNET PROSTHETIC BRAINS AND SOLITON PERSONHOOD
With increasing frequency, people around the globe seek advice and social support from other individuals connected via the Internet. Our minds arise not only from our own brains but from Internet prosthetic brains (IPBs) — those clusters of people with whom we share information and advice through electronic networks. The simple notion of you and me is changing. For example, I rely on others to help me reason beyond the limits of my own intuition and abilities. Many of my decisions in life are shaped by my IPBs around the globe, and these decisions range from advice on software and computer problems to health issues and emotional concerns. Thus, when asked to make a decision, who is the me who is actually making that decision?
The IPBs generated by social network connectivity can be more important than the communities dependent on geographic locality. Through the IPBs, we exchange parts of minds with one another. By the information we post on the Web and the interactions we have, we become IPBs for others. In some ways, when we die physically, a part of us survives as an IPB in the memories and thoughts of others, but also as trails we leave on the Internet. Individuals who participate in social groups, blogs, and Twitter, and who deposit their writings on the Web leave behind particles of themselves. Before the Internet, most of us rarely left marks on the world, except on our immediate family or a few friends. Before the Internet, within four generations even your immediate family knew nothing of you. In the "old days," your great-grandchildren might have carried some vestigial memory of you, but that faded like a burning ember when they died — and more often than not you were extinguished and forgotten. I know nothing about my great-grandparents.
However, in the Internet Age, the "complete extinguishing" never really happens, especially for prominent or prolific users. For example, the number of Internet searches for something you wrote may asymptotically approach zero over the decades, but it will never quite reach zero. Given the ubiquity of the Internet, its databases, and search engines, someone a hundred years from now may smile on something you wrote or wonder about who you were. You may become part of this future person's own IPB as he navigates through life. In the future, simulacra of you, derived in part from your Internet activities, will be able to converse with future generations.
Moreover, studies show that individuals within your social network have a profound influence on your personal health and happiness, for example, through your contacts on the Internet (whom you usually know) and their friends (whom you may not know). Habits and ideas spread through a vast Web of interconnectivity, like a virus. Behaviors can sometimes skip links — spreading to a friend of a friend without affecting the person who connects them. In summary, in the age of the Internet, the concept of you and personhood is more diffuse than ever before.
Because your interests, decision-making capabilities, habits, and even health are so intertwined with others, your personhood is better defined as a pseudo-personhood that is composed of yourself and the assembly of your IPBs out to at least three degrees of network separation. When we die, the Web of interconnectivity becomes torn, but one's pseudo-personhood, in some sense, continues to spread, like a soliton wave on a shoreless sea of Internet connections.
When Marc Chagall was asked to explain why he became a painter, he said that a painting was like a window through which he "could have taken flight toward another world." Chagall explored the boundaries between the real and unreal. "Our whole inner world is reality," he once wrote, "perhaps more real still than the apparent world."
As the notion of IPBs and soliton personhood expands, this kind of boundary will become even more blurred. The IPBs become of Chagallian importance and encourage the use of new windows on the world. They foster a different kind of immortality, form of being, and flight.
Post-doctoral fellow, Mind/Brain/Behavior Interfaculty Initiative, Harvard University
THE NEW BALANCE: MORE PROCESSING, LESS MEMORIZATION
The Internet changes the way I behave, and possibly the way I think, by reducing the processing costs of information retrieval. I focus more on knowing how to obtain and use information online (a processing solution) and less on memorizing it in advance (a memory solution).
This tradeoff between processing and memory reminds me of one of my father's favorite stories, perhaps apocryphal, about studying the periodic table of the elements in his high school chemistry class. On their test, the students were given a blank table and asked to fill in names and atomic weights. All the students agonized over this assignment, except for one. He simply wrote, "The periodic table can be found inside the back cover of our textbook, including the full name and atomic weight of each element."
What the smart-aleck ninth-grader probably didn't realize was that he manipulated one of the most basic tradeoffs that governs the performance of brains, computers, and other computational systems. The teacher reckoned that the most efficient way to solve chemistry problems was a memory-intensive solution, holding facts about elements in a brain. The student reckoned that it was more efficient to solve chemistry problems with a process-intensive solution, retrieving facts about elements from books.
In a world where chemistry books are hard to obtain (i.e., processing is expensive), the teacher has the right solution. In a world where chemistry books are easy to obtain (i.e., processing is cheap), the student has the right solution. A few decades ago, you would walk to the library for encyclopedias, books, and maps. Today, I access them from my pocket. This fact is easy to recite, but it's important to emphasize just how different the costs of processing are in these two cases. Suppose it takes about 20 minutes to walk to the library, and about 5 seconds to pull out an iPhone and open up the Web browser. The processing demands on me are 1/240th as great as they were for my father. By analogy, my computer has a 2.4-gigahertz processor. A processor 1/240th as powerful operates at 10 megahertz — just a touch faster than the original Macintosh, released in 1984. Computers today operate very differently because of their vastly increased processing power. It would be surprising if I didn't, too.
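The tradeoff the student exploited is, in computing terms, the classic space-time tradeoff. A minimal sketch in Python (the names and the toy "fact" are hypothetical, not from the essay) contrasts the teacher's memory-intensive strategy, which pays all retrieval costs up front, with the student's strategy of retrieving an answer only when a question demands it:

```python
import functools
import time

def fetch_fact(n):
    """Stand-in for an expensive retrieval -- the 'walk to the library'."""
    time.sleep(0.001)  # simulated retrieval cost
    return n * n       # pretend this is an atomic weight, say

# Teacher's approach (memory solution): pay every retrieval cost in
# advance and hold all 100 answers in memory.
memorized = {n: fetch_fact(n) for n in range(100)}

# Student's approach (processing solution): retrieve an answer only when
# it is actually needed, caching it for next time.
@functools.lru_cache(maxsize=None)
def look_up(n):
    return fetch_fact(n)

# Both strategies give identical answers; which is cheaper depends on how
# expensive retrieval is and how many of the 100 facts you ever need.
assert memorized[7] == look_up(7) == 49
```

When retrieval is slow (the walk to the library), precomputing wins; when retrieval is nearly free (the iPhone in the pocket), on-demand lookup wins.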
How has the Internet changed my behavior? When I walk out the door with my suitcase, I usually don't know what airline I'm flying on, what hotel I'll be staying in, how to get to it, where or when my first meeting will be, where a nearby restaurant is for dinner, and so on. A few years ago, I would have spent a few moments committing those details to memory. Now, I spend a few moments finding the "app for that".
After I see a good talk, I forget many of the details — but I remember to email the author for the slides. When I find a good bottle of wine, I take a picture of the label. I don't have to skim an interesting-looking paper as thoroughly before I file it, as long as I plug a few good keywords into my reference manager. I look up recipes after I arrive at the supermarket. And, when a friend cooks a good meal, I'm more interested to learn what website it came from than how it was spiced. I don't know most of the APA rules for style and citation, but my computer does. For any particular "computation" I perform, I don't need the same depth of knowledge because I have access to profoundly more efficient processes of information retrieval.
So, the Internet clearly changes the way I behave. It must be changing the way I think at some level, insofar as my behavior is a product of my thoughts. It probably is not changing the basic kinds of mental processes that I can perform, but it might be changing their relative weighting. We psychologists love to impress undergraduates with the fact that taxi drivers have unusually large hippocampi. But today, taxi drivers have GPS systems. This makes it relatively less important for them to memorize locations, and relatively more important for them to quickly read maps. It is a reasonable guess that GPS changes the way that taxi drivers' brains weight memory versus processing; it seems like a reasonable guess that the Internet changes the way that my brain does, too.
Often, the transformational role of the Internet is described in terms of memory; that is, in terms of the information that the Internet stores. It is easy to be awed by the sheer magnitude of the data available on Wikipedia, Google Earth, or Project Gutenberg. But what makes these Websites transformative to me is not the data. Encyclopedias, maps, and books all existed long before their titles were dressed up in dots and slashes. What makes them transformative is the availability: the new processes by which that information can be accessed.
Evolutionary Psychologist, University of New Mexico; Author, Spent: Sex, Evolution, and Consumer Behavior
MY JUDGMENT ENHANCER
The Internet changes every aspect of thinking for the often-online human: perception, categorization, attention, memory, spatial navigation, language, imagination, creativity, problem-solving, Theory of Mind, judgment, and decision-making. These are the key research areas in cognitive psychology, and constitute most of what the human brain does. BBC News and The Economist Website extend my perception, becoming my sixth sense for world events. Gmail structures my attention through my responses to incoming messages: delete, respond, or star for response later? Wikipedia is my extended memory. An online calendar changes how I plan my life. Google Maps changes how I navigate through my city and world. Facebook expands my Theory of Mind, helping me better understand the beliefs and desires of others.
But for me, the most revolutionary change is in my judgment and decision-making — the ways I evaluate and choose among good or bad options. I've learned that I can offload much of my judgment onto the large samples of peer ratings available on the Internet. These, in aggregate, are almost always more accurate than my individual judgment. To decide which Blu-ray discs to put in my Netflix queue, I look at the average movie ratings on Netflix, IMDb, and Metacritic. These reflect successively higher levels of expertise among the raters — movie renters on Netflix, film enthusiasts on IMDb, and film critics on Metacritic. Any film with high ratings across all three sites is almost always exciting, beautiful, and thoughtful.
My fallible, quirky, moody judgments are hugely enhanced by checking average peer ratings: book and music ratings on Amazon, used-car ratings on Edmunds, foreign hotel ratings on TripAdvisor, and citations to scientific papers on Google Scholar. We can finally harness the Law of Large Numbers to improve our decision-making: the larger the sample of peer ratings, the more accurate the average. As ratings accumulate, margins of error shrink, confidence intervals get tighter, and estimates improve. Ordinary consumers have access to better product-rating data than market researchers could hope to collect.
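The statistical claim here, that larger samples of ratings yield tighter margins of error, is easy to see in a toy simulation. The 1-10 scale, the noise level, and the function name below are illustrative assumptions, not data from any rating site:

```python
import random
import statistics

def rating_estimate(true_quality, n, seed=0):
    """Average of n noisy 1-10 ratings scattered around true_quality,
    with a 95% margin of error that shrinks roughly as 1/sqrt(n)."""
    rng = random.Random(seed)
    ratings = [min(10.0, max(1.0, rng.gauss(true_quality, 2.0)))
               for _ in range(n)]
    mean = statistics.mean(ratings)
    margin = 1.96 * statistics.stdev(ratings) / n ** 0.5
    return mean, margin

# The more raters, the closer the average sits to the true quality.
for n in (10, 100, 10_000):
    mean, margin = rating_estimate(7.0, n)
    print(f"{n:6d} ratings: average {mean:.2f}, margin about {margin:.2f}")
```

Ten ratings leave a margin of well over a point on a ten-point scale; ten thousand shrink it to a few hundredths, which is why aggregate ratings so often beat one person's quirky judgment.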
Online peer ratings empower us to be evidence-based about almost all of our decisions. For most goods and services, and indeed most domains of life, they offer the consumer a kind of informal meta-analysis — an aggregation of data across all the analyses already performed by other like-minded consumers. Judgment becomes socially distributed and statistical rather than individual and anecdotal.
Rational-choice economists might argue that sales figures are a better indication than online ratings of real consumer preferences, insofar as people vote with their dollars to reveal their preferences. This ignores the problem of buyer's remorse: consumers buy many things that they find disappointing. Their post-purchase product ratings mean much more than their pre-purchase judgments. Consumer Reports data on car owner satisfaction ('Would you buy your car again?') are much more informative than new-car sales figures. Metacritic ratings of the Twilight movies are more informative about quality than first-weekend box office sales. Informed peer ratings are much more useful guides to sensible consumer choices than popularity counts, sales volumes, market share, or brand salience.
You might think that post-purchase ratings would be biased by rationalization — I bought product X, so it must be good, or I'd look like a fool. No doubt that happens when we talk with friends and neighbors, but the anonymity of most online ratings reduces the embarrassment effect of admitting one's poor judgments and wrong decisions.
Of course, peer ratings of any product, like votes for a politician, can be biased by stupidity, ignorance, fashion cycles, mob effects, lobbying, marketing, and vested interests. The average online consumer's IQ is only a little above 100 now, and their average education is just a couple of years of college. Runaway popularity can be mistaken for lasting quality. Clever ads, celebrity endorsements, and brand reputations can bias the judgment of even the most independent-minded consumers. Rating sites can be gamed and manipulated by retailers. Nonetheless, online peer ratings remain more useful than any other consumer-empowerment movement in the last century.
To use peer ratings effectively, we have to let go of our intellectual and aesthetic pretensions. We have to recognize that some of our consumer judgments served mainly as conspicuous displays of our own intelligence, openness, taste, or wealth, and are not really the best way to choose the best option. We have to learn some humility. My best recent movie-viewing experiences have all come from valuing the Metacritic ratings over my own assumptions, prejudices, and pre-judgments. In the process, I've learned a new-found respect for the collective wisdom of our species. This recognition that my own thinking is not so different from, or better than, everyone else's, is one of the Internet's great moral lessons. Online peer ratings reinforce egalitarianism, mutual respect, and social capital. Against the hucksterism of marketing and lobbying, they knit humanity together into collective decision-making systems of formidable power and intelligence.
Curator, TED conferences, TED Talks
THE REDISCOVERY OF FIRE
Amidst the apocalyptic wailing over the Internet-inflicted demise of print, one counter-trend deserves a hearing. The Web has allowed the re-invention of the spoken word. Thanks to a massive expansion of low-cost bandwidth, the cost of online video distribution has fallen almost to zero. As a result, recorded talks and lectures are taking on new forms, and spreading across the Web like wildfire.
They are tapping into something primal and powerful.
Before Gutenberg, we had a different technology for communicating ideas and information. It was called talking. Human-to-human speech is powerful. It evolved over millions of years, and there's a lot more happening than just the words passing from brain to brain. There's modulation, tone, emphasis, passion. And the listener isn't just listening. She's watching. Subconsciously she notes the widening of the speaker's eyes, the movement of the hands, the swaying of the body, the responses of other listeners. It all registers and makes a difference to the way the receiving brain categorizes and prioritizes the incoming information. Because it increases the motivation to understand, a speaker's lasting impact on the intellectual world of the listener may be far greater than that of the same words in print.
Read a Martin Luther King speech, and you may nod your head in agreement. But then track down a video of the man in action delivering those same words in front of an energized crowd. It's a wholly different experience. You feel the force of the words. Their intent seems clearer, more convincing. You end up motivated, inspired. And so throughout history, when people have wanted to persuade, they have gathered a crowd together and made their case, often with startling effect.
If non-verbal communication has a far bigger impact than verbal, how did books catch on? Simple. They offered scale. It might be harder to explain and inspire via the printed page, but if you could, tens of thousands could benefit. And so we ended up with a mass-communication culture where for a while books and other printed media were the stars. And surprisingly, although radio and television could have reopened the door to spoken persuasion, they largely ignored the opportunity. In the increasingly frenetic battle for attention — and constrained by economic models that required mass audiences — victory went to entertainment, news, gossip, drama and sport. "Talking heads" were regarded as bad television, and little effort went into figuring out how to present them in an interesting way.
Meanwhile in the academic world, the emphasis was on papers, research... and somehow teaching schedules settled on painfully long lectures as the default unit of verbal communication. Man in coat behind lectern reading notes while his audience snoozed. All the intellectual brilliance in the world matters not a whit if the receiving brains can't register it as interesting.
Our ancestors would have been appalled. They knew better. Picture a star-lit night outside a village in one of the ancient cradles of civilization. The people gather. The fire is lit. The drums beat. The dancers sway. A respected elder hushes the crowd. His face lit by flickering flames, he begins telling a story, his voice softly rising as the drama builds. The meaning of the story becomes apparent. The gathered crowd roar their approval. They have understood something new. And more than that, they have felt it. They will act on it.
This is a scene that has played out countless times in our evolutionary history. It's not unreasonable to think that our brains are fine-tuned to respond to evocative speech delivered in a powerful theatrical setting by a talented speaker.
And now, the Web is making it possible for such speakers to do what print authors have been doing for centuries: reach a mass audience. What is more, the online explosion in serious talks could rectify the Web-inflicted damage to book authors' bank balances and thereby make it possible to continue making a living as a contributor to the world's intellectual commons. For one thing, when a talk goes viral it boosts the author's book sales and generates new connections, contracts and consultancies. Significantly, it also creates demand for the author's paid speaking appearances. Those $20k speaker fees soon add up. (An under-reported impact of the increase in our time online is a growing craving for live experience. You can see it in the music industry, where all the revenue is moving away from album sales toward live performances. It's easy to imagine a musician of the future making all their music free digitally, but creating unforgettable live experiences for their fans at $100 a ticket. The same may be starting to happen for book authors.)
Beyond that, there are numerous brilliant thinkers, researchers and inventors who would never contemplate writing a book. They too now have the opportunity to become one of the world's teachers. Their efforts, conveyed vividly from their own mouths, will bring knowledge, understanding, passion and inspiration to millions.
When Marshall McLuhan said "the medium is the message" he meant, among other things, that every new medium spawns its own unexpected units of communication. In addition to the Web-page, the blog and the tweet, we are witnessing the rise of riveting online talks, long enough to inform and explain, short enough for mass impact.
The Web has allowed us to rediscover fire.
Archaeologist, Journalist; Author, Artifacts
I saw in the new decade wrapped against the English Channel chill under one of the few surviving Timeball Towers in the world. It was hardly a Times Square ball-drop, but my personal nod to a piece of 18th-century tech that was a part of communications history and, ergo, a link to the Internet. For years this slim landmark signalled navigators off the White Cliffs of Dover to set their chronometers to Greenwich Mean Time. It was a Twitter ball with just one message to relay.
History is my way in this year. I am answering this year's Question against the deadline, as the answer slips as defiantly as time. The Internet has not only changed the way I think, but prompted me to think about those changes, over time, weighted by the unevenness of technology take-up and accessibility to the Net.
I encountered the Web as a researcher at Oxford in the mid-1990s. I learned later that I was at Tim Berners-Lee's former college, but I was pretty blasé about being easily online. I saw the Internet as more of a resource for messaging, a faster route than the bike-delivered pigeon post. I didn't see it as a tool for digging and remained resolutely buried in books. But when I visited non-academic friends and asked if I could check emails on their dial-ups, I began to equate the Net with privilege, via phone-bill anxiety. As they hovered nervously, I dived in and out again. The Internet was not a joy, but a catch-up mechanism. And for a while, I couldn't think about it any other way.
In 2000, something happened. I found myself drawn to write a book about Silicon Valley. Moving frequently between the UK and America's East and West Coasts, I began to think about the implications of the Internet and, moreover, about how not being able to get online was starting to affect me. What was I missing, intellectually and culturally, by being sometimes out of the game? I began to appreciate a new hunger for a technology that was still forming. I knew all that information was out there, and I couldn't realise its potential. Sometimes I believed ignorance was bliss. Travelling around America by bus and train for several months was a revelation. At every stop I tried to get online, which usually meant I waited in line. I relished my login gifts: a precious 30 minutes at New York Public Library, a whole hour in small towns in the Midwest, a grabbed few minutes in a university department before giving a lecture somewhere.
Then — joy! — luxuriating in the always-on technology at my friends' homes in the Bay Area, where even the kitchens had laptops panting to 'go search'. But as I made those flights east, the differential was widening. I lost hours trawling the streets of European cities for an Internet cafe, only to feel it was merely a brushed kiss from a stranger; there'd always be someone else in line. I had the taste and knew tech was building on tech out there in the ether. I was like some Woody Allen character, gazing out of an empty carriage window into a train full of revellers. Being barred from the Web felt like a personal blow; I'd lost the key to the library.
In 2004, I moved to Rome just as the tsunami was showing how the Internet could be mobilised for the good. I made my first ever post. I began my own blog, charting Rome's art and culture for Stanford's metamedia lab. The Pope was declining, and by March 2005, St. Peter's piazza was mushrooming with satellite dishes. In the Sistine Chapel, God and Adam were connecting on Michelangelo's ceiling; outside, fingers were twitching on laptops and cellphones for one of the Internet's seminal news moments. But I heard the news the old-fashioned way. Walking with a bag of warm pizza, I heard a sudden churning of bells when it was not the marking of the hour. As I ran with the thousands to St. Peter's, I recall feeling moved by these parallel communications, where people could still be summoned by the bells. A few weeks later, watching wide-screen TV in a Roman cafe, white smoke rose from the Vatican chimney. The ash drifted over the Vatican's ancient walls, morphing into a messaging cacophony of Italian cellphones and clattering keyboards in heaving Internet cafes.
Science Writer; Consultant; Lecturer, Copenhagen; Author, The Generous Man
DARE, CARE AND SHARE
The more you give, the more you get. The more you share, the more they care. The more you dare, the more is there for you. Dare, care and share.
The Internet has become the engine of the gift economy and cooperation. The simple insight that there is so much more knowledge, data and wisdom out there than I can ever attend to in a lifetime shows me that life is not about collecting information into a depot of books, theorems, rote memories or titles. Life is about sharing with others what you have. Use it, share it, pick it when you need it. There is plenty out there.
In ecology, the waste of one organism is the food of another. Plants produce oxygen as a waste product — animals need it to live. We produce carbon dioxide as waste — and the plants enjoy it. To live is to be able to share your waste.
Human civilization seems to have been forgetting that through centuries of building and isolating waste depots and of exploiting limited resources. Now we are starting to learn that it is all about flows. Matter, energy, information, social links. They all flow through us. We share them with each other and with all the other inhabitants of this planet. The climate problem shows us what happens if we ignore that renewable flows are the real stuff, while depots and fortresses are illusions in the long run.
The Internet makes us think in the right way: Pass it on, let it go, let it flow. Thinking is renewed. Now we only need to change the way we act.
Psychologist; Author, Consciousness: An Introduction
A THIRD REPLICATOR
The way "I" think? I'm not sure that I know any more who or what is doing the thinking. That's the question the Internet is forcing me to ask.
When I was just a human being, writing books and research papers, or appearing on radio and television, I could happily imagine that "I" wrote my books. I didn't need to question who or what was doing the thinking or having the new ideas. In those days body, brain and knowledge were all bound up together in one place. To use an old metaphor, hardware, software and data were all bound up in one entity; it was reasonable to call it "Me".
The Internet has changed all that. It has changed both the nature of selves and the nature of thinking. "I" am no longer just the imagined inner conscious self who inhabits this body, but the smiling face on my Website and the fictional character other people write about in cyberspace. If someone asks "Who is Sue Blackmore?" this body will have less say in the answer than the questioner's search engine.
The change to thinking itself began gradually. Humans have long outsourced their knowledge to paper and books. So in the old days I would sit at my desk with my typewriter and look up things I needed to know in books in my own, or the university library. Then I got a word processor. This new hardware shifted a little of the work but all the creative thinking still went on inside my head, taking in countless old memes and bringing them together to make new ones, selecting among the results and writing just a few of them down.
Then came the Internet. This meant I could communicate with more people, which meant more mixing of ideas, but it did not change the process fundamentally. The real change was the advent of the World Wide Web. Suddenly — and in retrospect it really does seem to have been sudden — masses of information were available right there on my desk. Almost overnight I stopped using the university library. Indeed, I haven't physically been there for years now.
The Web needed search engines and these changed the world amazingly quickly. By sifting through mountains of data and coming up with relevant items, they took over a large part of what used to be human thinking.
I like to see all this in evolutionary terms. The creativity of an evolutionary process depends on the three processes of copying, varying and selecting information. First we had genes — replicators that banded together to create organisms. Then we had memes — replicators that worked together to create human minds. Now we have a third replicator and a new process of creative evolution. All those computers, programs, servers, cables and other essentials of the Internet might once have seemed to be hardly more than an extension of books, typewriters and telephones, but we should no longer see them that way.
Books, typewriters and telephones store information or pass it on, but they do not select the information they copy. They can vary the information by poor quality copying but they cannot put together old memes to make new ones. The Internet, or parts of the Internet, can.
Out there in cyberspace are search engines and kinds of software that copy, vary and select information, concocting new combinations and passing them around the globe in microseconds, making the results available to us all. This is truly a new evolutionary process; one that deals in ideas; one that creates images and original texts. Thinking has escaped from the human scale.
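The copy-vary-select loop described above is also the skeleton of any evolutionary algorithm. A deliberately tiny sketch (the target string and mutation scheme are illustrative inventions, not anything the essay specifies) shows how blind copying with variation and selection can accumulate information that no single step contains:

```python
import random

rng = random.Random(1)
ALPHABET = "abcdefghijklmnopqrstuvwxyz "
TARGET = "a third replicator"

def fitness(s):
    """Selection criterion: how many characters match the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    """Vary: return a copy of the string with one character changed."""
    i = rng.randrange(len(s))
    return s[:i] + rng.choice(ALPHABET) + s[i + 1:]

# Start from pure noise, then copy, vary, and select, repeatedly.
current = "".join(rng.choice(ALPHABET) for _ in TARGET)
generations = 0
while current != TARGET:
    candidate = mutate(current)                  # copy + vary
    if fitness(candidate) >= fitness(current):   # select
        current = candidate
    generations += 1
```

No single mutation "knows" the target, yet the loop reliably converges in a few thousand generations; in Blackmore's terms, the Internet runs vastly richer versions of this same loop over human-generated information.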
These days I still sit at my desk, but I am not just a human being thinking and writing down my thoughts. The keyboard I type on is recognisably like my old typewriter, but the process I am engaged in is nothing like it was before. Now, as I write, I jump quickly and often to things other people have written. I call up pages of information selected by software I do not understand and incorporate these into the text I am working on. This new text may go straight onto my Website or a blog and from there may, or may not, be picked up by other sites and copied on again. Even books partake of this extraordinary creative process, with Google scanning and propagating pages to students, other writers, and bloggers. No one can possibly know where all the copies and fragments of copies have gone, how many times they have been copied or by what process they were selected. Ever more of the copying, varying and selecting goes on outside of human brains and outside of human control.
Is the Internet itself thinking? I would say yes, or if not it is on the verge of doing so. The digital information it passes around is a third replicator; a kind of information that is copied, varied and selected by the massive machinery of the Internet and the Web.
So how has the Internet changed the way I think? The words I am writing now are far less "mine" than they were before. Indeed they have been created as much by John Brockman, the Edge community, and the entire Internet as by little me. I did not so much write them, as they used me to get themselves written.
So the answer is not that the Internet is changing the way I think; it is changing the nature of thinking itself.
Doris Duke Professor of Conservation Ecology; Author, The World According to Pimm: a Scientist Audits the Earth
THINKING WITH OTHERS SO CLOSE THAT YOU CAN SMELL THEIR SPIT
Once upon a time, we had the same world we do now. We knew little about its problems.
Wise men and women pontificated about their complete worlds, worlds that, for some, stretched only to the limits of their city centres or, sometimes, only to the grounds of their colleges. This allowed them clever conceits about what was really important in life, art, science and the rest of it.
Lesser minds would come to pay homage and, let's be honest, use the famous library since that was the only way of knowing what was known and who knew it. The centres ruled and they knew it.
It's late in the evening, when I see the light on in the lab and stop by to see who else is working late. There's a conversation going on over Skype. It's totally incomprehensible. Even its sounds aren't familiar. There's no Rosetta Stone© for the language my two students are learning from their correspondent who sits in a café in a wretched oil town on the edge of the rainforest in Ecuador. It's only spoken by a few hundred Indians. All but their children were born as nomads in a forest that has the luck to be sitting on billions of barrels of oil. I didn't say "good luck."
In a few months, we'll be in that forest. My students will improve their language skills with the Indian women, helping them prepare chicha, by chewing manioc, spitting it into the bowl and chewing another mouthful.
With the Internet, what happens there is exactly as close as anything else I want to understand or communicate, give or take the slow phone line or cell phone reception. When an oil company pushes a road far closer to a reserve than it promised, we'll know about it immediately. When some settlers try to clear forest, we'll know just as quickly if they kill Indians, and just as quickly when the Indians kill them with their spears. So will everyone else.
The Internet is instant news from remote places with photos to prove it. What we now think about instantly is suddenly much larger, more frightening, and far more challenging than it once was.
The Internet has vastly more coverage of everything, immediate, future, and past. So when we want to know who has signed which oil exploration leases to which tracts of remote forest, the data are not in Duke's library (or anyone else's), but I can get them online from the Websites of local newspapers. And I can do that in the forest clearing, surrounded by those whose futures have been signed away. Knowledge is now everywhere. You can find it from everywhere too.
The Internet has vastly increased the size of the problem set about humanity's future. Some problems now look really puny. They probably always were.
Who does the thinking has changed too. When knowledge is everywhere, so are the thinkers.
SEAN CARROLL
Theoretical Physicist, Caltech; Author, From Eternity to Here: The Quest for the Ultimate Theory of Time
CALLING YOU ON YOUR CRAP
I wanted to write that the Internet keeps people honest. The image of thousands of readers bursting into laughter gave me pause.
So let me put it this way: the Internet helps enable honesty. Many of us basically want to be honest, but we're fighting all sorts of other impulses — the desire to appear clever or knowledgeable, to support a point we're trying to make, to feel the satisfaction of a rant well-ranted. In everyday conversation, when we know something specific about the expertise and inclinations of our audience, these impulses may tempt us into laziness: pushing a point too hard, claiming as fact some anecdote whose veracity isn't completely reliable. We're only human.
Nothing highlights our natural tendencies to exaggerate and overclaim quite like a widely distributed, highly interconnected communication network with nearly instantaneous feedback. There is no shortage of overblown and untrue claims on the Internet, as anyone who has actually looked at it will attest. But for those of us who would really like to be as honest and as accurate as is reasonably possible, the Internet is an invaluable corrective.
All else being equal, it is a virtue to know true things. But there is also the virtue of assigning accurate degrees of confidence to the things we think we know. There are some things I have studied personally and in depth, such that I have acquired some expertise; there are other things that I've read somewhere, or heard from a friend, and sound pretty reasonable. And there are still other things that wouldn't sound reasonable at all to an objective observer, but which line up with other cherished beliefs I already have. Distinguishing between these different categories is a major part of being intellectually honest.
Engaging with ideas online — stating what I believe, arguing in favor of it to the best of my ability, and stretching my mind by reading things outside my comfort zone — is immensely helpful in separating well-established facts from wishful thinking. The thing about the Internet is: people will call you on your crap. Even if I don't know exactly what I'm talking about, somebody out there does. On discussion boards, in blog comment threads, on Websites from colleagues or from students on another continent, if I say something that manages to be interesting but wrong, chances are someone will set me straight. Not that everyone necessarily listens. It's my responsibility to be open enough to listen to the critiques and improve my position; but that's always been my job. The Internet merely helps us along.
The distinction is not only between the Internet and sitting around a table having a bull session with your friends; it applies to conventional print media as well, from books to newspapers and magazines. Sure, someone can write a book review or pen a strident letter to the editor. But timescales matter. If I put up a blog post in the morning and get several comments before lunchtime along the lines of "That's about as wrong as anything I've seen you ever write" and "Yeah, what were you thinking?", complete with links to sources that set me straight, it's difficult to simply pretend I don't notice.
I once heard, as an example of how online communication was degrading our discourse by drowning us in lies and misinformation, the example of the crazy claim that Stephen Hawking wouldn't have been cared for under the UK's National Health Service — which, of course, is exactly who did care for him, thus offering an unusually juicy self-refutation. But bringing up this example as a criticism of the Internet is equally self-refuting. The initial lie didn't appear online — it was in a good old-fashioned newspaper. Twenty years ago, that's as far as it would have circulated, after making a brief impression in the minds of its readers. But today, countless online sources leapt to make fun of the ridiculous lengths to which opponents of health-care reform were willing to go. Perhaps next time the editorial writers will be more careful in their choice of colorful counterfactuals.
All of which is incredibly small potatoes, of course. The Internet in its current configuration is only a hint at what we will have a hundred years from now — feel free to visualize your own favorite chip-in-your-head scenario. Cutting down on the noise will ultimately be just as great a challenge as connecting to the signal. But even now, the Internet is a great help to those of us who prefer to be kept honest — it's just up to us to take advantage.