The true glory of Wikipedia continues to lie in the obscure, the arcane, and the ephemeral. Nowhere else will you find such lovingly detailed descriptions of TV shows, video games, cartoons, obsolete software languages, Canadian train stations, and the workings of machines that exist only in science fiction. Whatever else it may be, Wikipedia is a monument to the obsessive-compulsive fact-mongering of the adolescent male. Never has sexual sublimation been quite so wordy.
One of my favorite examples is Wikipedia’s wonderfully panoramic coverage of the popular sixties sitcom Gilligan's Island. Not only is there an entry for the show itself, but there are separate articles for each of the seven quirky castaways—Gilligan, the Skipper, the Professor, Mary Ann, Ginger, Thurston Howell III, and Eunice "Lovey" Howell—as well as the actors who played the roles, the ill-fated SS Minnow, and even the subsequent TV movies that were based on the show, including 1981’s The Harlem Globetrotters on Gilligan's Island. Best of all is the annotated list of all 98 episodes in the series, which includes a color-coded guide to "visitors, animals, dreams, and bamboo inventions."
It goes deeper than Wikipedia, though. Gilligan's Island has been a great motivator of user-generated content across the breadth of the web. Check out this YouTube take on the eternal question "Mary Ann or Ginger?":
In fact, if I were called in to rename Web 2.0, I think I'd call it Gilligan's Web, if only to underscore the symbiosis between the pop-culture artifacts of the mass media and so much of the user-generated content found online.
So imagine my bewilderment as I listened to Clay Shirky argue, in his speech "Gin, Television, and Cognitive Surplus," that Gilligan's Island and Web 2.0 are actually opposing forces in the grand sweep of human history. Whoa. Is Professor Shirky surfing a different web than the rest of us?
To Shirky, the TV sitcom, as exemplified by Gilligan's Island, was "the critical technology for the 20th century." Why? Because it sucked up all the spare time that people suddenly had on their hands in the decades after the Second World War. The sitcom "essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat." I'm not exactly sure what Shirky means when he speaks of society overheating, but, anyway, it wasn't until the arrival of the World Wide Web and its "architecture of participation" that we suddenly gained the capacity to do something productive with our "cognitive surplus," like edit Wikipedia articles or play the character of an elf in a World of Warcraft clan. Writes Shirky:
Shirky's calculus seems to go something like this:
But that's not quite fair, because Shirky is making a larger argument about society and its development. He's got bigger fish to fry than Gilligan and his mates. The journalist and blogger Scott Rosenberg does a nice job of summing up Shirky's argument:
What Shirky is doing here, in essence, is repackaging the liberation mythology that has long characterized the more utopian writings about the Web. That mythology draws a sharp distinction between our lives before the coming of the Web (BW) and our lives after the Web's blessed birth (AW). In the dark BW years, we were passive couch potatoes who were, in Shirky's words, "forced into the channel of media the way it was because it was the only option." We were driftwood, going with whatever flow "the media" imposed on us. We were all trapped in Shirky's musty cellar.
The Web, the myth continues, emancipated us. We no longer were forced into the channel of passive consumption. We could "participate." We could "share." We could "produce." When we turned our necks from the TV screen to the computer screen, we were liberated:
I think we'd all agree that the Web is changing the structure of media, and that's going to have many important ramifications. Some will be good, and some will be bad, and the way they will all shake out remains unknown. But what about Shirky's idea that in the BW years we were unable to do anything "interesting" with our "cognitive surplus"—that the "only option" was watching TV? That, frankly, is bull. It may well be that Clay Shirky spent all his time pre-1990 watching sitcoms in his cellar (though I very much doubt it) but I was also alive in those benighted years, and I seem to remember a whole lot more going on.
Did my friends and I watch Gilligan's Island? You bet we did—and thoroughly enjoyed it (though with a bit more ironic distance than Shirky allows). Watching sitcoms and the other junk served up by the boob tube was certainly part of our lives. But it was not the center of our lives. Most of the people I knew were doing a whole lot of "participating," "producing," and "sharing," and, to boot, they were doing it not only in the symbolic sphere of the media but in the actual physical world as well. They were making 8-millimeter films, playing drums and guitars and saxophones in bands, composing songs, writing poems and stories, painting pictures, making woodblock prints, taking and developing photographs, drawing comics, souping up cars, constructing elaborate model railroads, reading great books and watching great movies and discussing them passionately well into the night, volunteering in political campaigns, protesting for various causes, and on and on and on.
People were, in other words, every bit as capable of living rich, multidimensional, interesting, creative, and "participative" lives before the web came along as they are today—and a lot of people did live such lives. And they often lived them even while spending considerable portions of their time watching TV or drinking gin or sitting in a lotus position intentionally frittering away their "cognitive surplus." (There’s a creepy kind of neo-Puritanism at work in Shirky’s calculations of how productively we’re "deploying" our "cognitive surplus," but that’s a different story.)
It's worth remembering that Gilligan's Island originally ran on television from late 1964 to late 1967, a period noteworthy not for its social passivity but for its social activism. These were years not only of great cultural and artistic exploration and inventiveness (it was the first great age of the garage band, for one thing) but also of widespread protest, when people organized into very large—and very real—groups within the civil rights movement, the antiwar movement, the feminist movement, the folk movement, the psychedelic movement, and all sorts of other movements. People weren't in their basements; they were in the streets.
If everyone was so enervated by Gilligan's Island, how exactly do you explain 1968? The answer is: you don't, and you can't.
Indeed, once you begin contrasting 1968 with 2008, you might even find yourself thinking that, on balance, the Web is not an engine for social activism but an engine for social passivity. You might even suggest that the Web funnels our urges for "participation" and "sharing" into politically and commercially acceptable channels—that it turns us into play-actors, make-believe elves in make-believe clans.
To use a computer-science analogy rather than an economic one, what Shirky is talking about is what I call the "awesome power of spare cycles"—the human potential that isn't tapped by our jobs, which for most of us is a lot of it. People wonder how Wikipedia magically arose from nothing, and how 50 million bloggers suddenly appeared, almost all of them writing for free. Who knew there was so much untapped energy all around us, just waiting for a catalyst to become productive? But of course there was. People are bored, and they'd rather not be. The guy playing Solitaire on his laptop at the airport? Spare cycles. Multiply that by a million.
Edge has latterly published two provocative pieces, Jon Haidt's essay on why people vote Republican and Clay Shirky's ruminations and calculations on the cognitive surplus we have at our disposal. To a historian, these pieces dovetail and underscore a fundamental landslip that's taking place around us. ... no Edge visitor should miss either. Roughly speaking, we are discovering that words don't matter.
Or they don't matter as much as we thought. ...
Shirky's piece gives more context for our transition away from words that matter. I don't mean that we don't speak and write, or that words aren't highly functional tools, but the exact framing of sentences and the precise structure of the verbal argument are less and less important. Bullet points on a PowerPoint slide get the conversation going, and the group working together gets to the result that matters. The "writer" is less important than he has been since, oh, Herodotus. (Example? Obama's speech on race earlier this summer. Good work, well-written, seen by almost no one, read by a few, and then blown off the screens by his preacher's TV appearances. Net result, the image and the illogic prevail.)
Shirky is one of many voices confirming that this fading of the power of the specific written word is not all bad news and even has good news to it, but the old classics professor in me at least needs to slow down long enough to observe that the humanistic culture of the orator from Demosthenes to Martin Luther King Jr. is decisively gone. We don't fully understand what's replacing it, but it's happening all around us—you might even call it a third culture...
When Clay Shirky says "here comes everybody" and foresees a rapidly-exponentiating realm for assertive human creativity, I am a fellow-traveler—although with some worries and dour reservations.
Twenty years ago I wrote about a near future when online communications, agile vision and instant knowledge would unleash individual self-expression in a profusion of hobbies, avocations, side-vocations and ad-hoc interest groups, shattering rigid categories and guild boundaries of the past. The coming of an "Age of Amateurs" seemed obvious then, for a number of reasons.
And yes, I appreciate Clay Shirky's historical narrative about "getting accustomed to surplus." These revolutions go way back and often require adjustment. I loved his reference to everybody getting stoned in the Age of Gin, and comparing this (simplistically but amusingly) to the way folks became couch potatoes in the era of TV. And yet, Nicholas Carr is right to take umbrage, pointing out that those decades contained plenty of people bent on being more than mere passive content-consumers. The great work of improving society took gumption. Indeed, the personal computer arose out of hobbyists who saw, in the CRT screen, something potentially far greater than a mere glass teat.
Still, I balked when Carr brought up 1968 as a year of shining involvement. I shuddered. But more on that, anon.
As usual, I find myself pointing out the obvious—that things have been a lot more complex than either Shirky or Carr would have us perceive. Both of them are very right and both tragically wrong. For example, even in the 1960s, Marshall McLuhan sensed somehow—without actually foreseeing the Internet—that new media would foster a more active way of viewing the world. He had a vague notion of what he wanted, what was needed, but only vaporous hopes for how it might come about.
Recall how some folks fantasized such a role for Public Access cable TV? Again, desire far exceeded reification. And yet, desire sometimes refuses to be thwarted! Think about the lowly VCR—a Rube Goldberg contraption of such astounding complexity that it should never have worked, let alone been mass-produced so cheaply and reliably that, soon, nobody even bothered to repair them. Long before the arrival of digital media, people somehow got what they wanted most, an ability to control what they would watch—the purest case of mass desire overcoming the limitations of practical technology.
Key point: Yes, new tools can propel new ways of thinking. But just as often, vision precedes the tools. A theme that I'll return to.
Getting back to the notion of specialization, I've asserted that the one and only truly monotonic trend of the 20th Century was the professionalization of everything—continuing down a road that began in early farming towns of the Zagros mountains. When agriculture provided a predictable excess of food production, a top layer of specialists could be supported. At first, specialist thieves and bullies. Then—after some adjustment—a layer of specialists who gave value back with literacy (expanded memory) and the perspective from atop a ziggurat (expanded vision). These vision/knowledge revolutions have happened many times since, and Shirky is right that adjustment is never easy.
And, yes, he can sense that the most recent shift is new and different from all of those that came before. After six thousand years, that trend toward ever-greater specialization has reached both its culmination (in a society filled with highly-trained college graduates) and its ultimate limit.
Indeed, given the range of proliferating problems that lie ahead, only such a civilization will have the agility to respond quickly to rapidly varying demands for attention and expertise and critical exchanges of accountability. So, yes, so far I agree with Shirky. But where we part company is over how natural or easy the next step on this path will be, or whether all that eager "involvement" will actually accomplish much.
In fact, I am far less satisfied than he is with the enabling systems that exist, or seem to be on the drawing boards. A world filled with assertive amateurs will be better than one of bland consumer-drones, sure. But it will still fall far short of its potential, if those amateurs are effectively lobotomized by software and interfaces and tools that limit what they can ponder, communicate or achieve.
Indeed, there are some failure modes—e.g. the creation of a myriad super-empowered angry young fanatics—that are likely to be fostered by a primitive fiesta of self-expression. We already see a grand vista, not of discourse but of miniature Nuremberg rallies, with millions coalescing to heil their group totems. And when this goes sour, there will be only two possible solutions. Either a retreat into hierarchical control, or a true continuation down the path of empowered citizenship, to a world where reciprocal accountability and mass/individualist creativity take us to another level.
Clay Shirky's essay prompts a question: "If those past 'revolutions' were so chaotic and painful, why are you so blithe about the present one going well?" I look at all the crude socialnet sites, at Second Life, at the blogosphere, and perceive something halfway between his wondrous, self-organizing realm of free citizenship and the cesspool of rancid opinion perceived by Nicholas Carr and the cybergrouches. A lot of good has come out of the new trends... the web and wikis and blogosphere have been (variously) useful and empowering, and a lot more potential is there. But overall, if this is all we can hope for—a Force 5 gale of raw opinion—then the grouches win on points.
Tell me about the sites where really bad assertions go to die—the way phlogiston and witch-burnings died—a well-deserved death that ought to follow the most noxious assertions across our culture, so that truly disproved nonsense can actually go away, making way for new ideas. If you dismiss this as impossible, then I think your hopes for the web are far too timid, since the allegory should be a vivid human mind—and complex human beings, sane ones, can actually drop a bad idea, from time to time.
Show me the synchronous virtual realms where people communicate in units larger than a cutoff sentence. Yes, there are asynchronous realms, like this one, where bright adults do express ideas more complex than a sentence. Terrific. But does anything actually happen? Show me the software that helps really smart mobs to coalesce. To those who say such things already exist, I have to reply "Guys, your standards and expectations are really low! And unworthy of your dreams."
Recall Nicholas Carr's evocation of that dire year, 1968, one that was more exhausting than any decade. A majority of Americans did sit at home, across that awful, compact epoch, suckling their boob tubes and nursing resentment toward those who had chosen to get involved. Shirky is right that the post-Web world would have overcome some of that passivity and provided more varieties of involvement. Still, Carr is also right to suggest plus ça change...
To me, the allegory of that year is far more disturbing. My father was twenty feet from RFK when he was shot. I saw the roiling maelstrom of sanctimony and delusion that drenched all sides, in an era when people thought that they were fantastically well-informed by new media and when oversimplifications made caricatures of every good intention. And I see a chilling reflection of today.