

ECONOMICS IS NOT NATURAL SCIENCE [8.11.09]
By Douglas Rushkoff

An Edge Original Essay

DOUGLAS RUSHKOFF is a media analyst, documentary filmmaker, and author. His latest book is Life Inc.: How the World Became a Corporation and How to Take It Back.

Douglas Rushkoff's Edge Bio Page

The Reality Club: George Dyson




On "Economics Is Not a Natural Science"

George Dyson:
...How to best transcend the current economic mess? Put Jeff Bezos, Pierre Omidyar, Elon Musk, Tim O'Reilly, Larry Page, Sergey Brin, Nathan Myhrvold, and Danny Hillis in a room somewhere and don't let them out until they have framed a new, massively-distributed financial system, founded on sound, open, peer-to-peer principles, from the start. And don't call it a bank. Launch a new financial medium that is as open, scale-free, universally accessible, self-improving, and non-proprietary as the Internet, and leave the 13th century behind.

[...]


ECONOMICS IS NOT NATURAL SCIENCE

The marketplace in which most commerce takes place today is not a pre-existing condition of the universe. It's not nature. It's a game, with very particular rules, set in motion by real people with real purposes. That's why it's so amazing to me that scientists, and people calling themselves scientists, would propose to study the market as if it were some natural system — like the weather, or a coral reef.

It's not. It's a product not of nature but of engineering. And to treat the market as nature, as some product of purely evolutionary forces, is to deny ourselves access to its ongoing redesign. It's as if we woke up in a world where just one operating system was running on all our computers and, worse, we didn't realize that any other operating system ever did or could ever exist. We would simply accept Windows as a given circumstance, and look for ways to adjust our society to its needs rather than the other way around.

It is up to our most rigorous thinkers and writers not to base their work on widely accepted but largely artificial constructs. It is their job to differentiate between the map and the territory — to recognize when a series of false assumptions is corrupting their observations and conclusions. As the great interest in the arguments of Richard Dawkins, Daniel Dennett, Sam Harris, and Christopher Hitchens shows us, there is a growing acceptance of and hunger for thinkers who dare to challenge the widespread belief in creation mythologies. That it has become easier to challenge the supremacy of God than to question the supremacy of the market testifies to the way any group can fall victim to a creation myth — especially when its members are rewarded for doing so.

Too many technologists, scientists, writers, and theorists accept the underlying premise of our corporate-driven marketplace as a precondition of the universe or, worse, as the ultimate beneficiary of their findings. If a "free" economy of the sort depicted by Chris Anderson or Clay Shirky is really on its way, then books themselves are soon to be little more than loss leaders for high-priced corporate lecturing. In such a scheme, how could professional writers and theorists possibly escape biasing their works towards the needs of the corporate lecture market? It's as if the value of a theory or perspective rests solely in its applicability to the business sector.

Whether it's being done in honest ignorance, blind obedience, or cynical exploitation of the market, the result is the same: our ability to envision new solutions to the latest challenges is stunted by a dependence on market-driven and market-compatible answers. Instead, we are encouraged to apply the rules of genetics, neuroscience, or systems theory to the economy, and to do so in a dangerously determinist fashion.

In their ongoing effort to define and defend the functioning of the market through science and systems theory, some of today's brightest thinkers have, perhaps inadvertently, promoted a mythology about commerce, culture, and competition. And it is a mythology as false, dangerous, and ultimately deadly as any religion.

The trend began on the pages of the digital business magazine, Wired, which served to reframe new tech innovations and science discoveries in terms friendly to disoriented speculators. Wired would not fundamentally challenge the market; it would provide bankers and investors with a map to the new territory, including the consultants they'd need to maintain their authority over the economy.

The first and probably most influential among them was Peter Schwartz, who, in 1997, with Peter Leyden, forecast a "long boom" of at least 25 years of prosperity and environmental health fueled by digital technology and, most importantly, the maintenance of open markets. Kevin Kelly foresaw the way digital abundance would challenge scarce markets, and offered clear rules through which the largest companies could still thrive on the phenomenon.

Stewart Brand joined Schwartz and others in cofounding GBN, a futurist consulting firm whose very name, Global Business Network, seemed to cast the emergence of a web economy in a new light. What did it mean that everyone from William Gibson to Brian Eno to Marvin Minsky would now be consulting to the biggest corporations on earth? Would they even be able to control their own messages? Brand did famously say in 1984 that "information wants to be free." But, much less publicized and remembered, he did so only after explaining that "information wants to be expensive, because it's so valuable." Would his and others' work now be parsed for the tidbits most effective at promoting a skewed vision of the new economy? Would the counterculture be able to use its newfound access to the board rooms of the Fortune 500 to hack the business landscape, or had they simply surrendered to the eventual absorption of everything and everyone into an eternal primacy of corporate capitalism? The "scenario plans" that resulted from this work, through which corporations could envision continued domination of their industries, appeared to indicate the latter.

Chris Anderson has analyzed where all this is going, and — rather than offering up a vision of a post-scarcity economy — advised companies to simply leverage the abundant to sell whatever they can keep scarce. Likewise, Tim O'Reilly and John Battelle's new, highly dimensional conception of the net — Web Squared — ultimately offers itself up as a template through which companies can make money by controlling the indexes people use to navigate information space.

Both science and technology are challenging long-held assumptions about top-down control, competition, and scarcity. But our leading thinkers are less likely to provide us with genuinely revolutionary axioms for a more highly evolved marketplace than with reactionary responses to the networks, technologies, and discoveries that threaten to expose the marketplace for the arbitrarily designed poker game it is. These are not new rules for a new economy, but new rules for propping up old economic interests in the face of massive decentralization.

While we can find evidence of the corporate marketplace biasing the application of any field of inquiry, it is our limited economic perspective that prevents us from supporting work that serves values external to the market. This is why it is particularly treacherous to limit economic thought to the game as it is currently played, and to present these arguments with near-scientific certainty.

The sense of inevitability and pre-destiny shaping these narratives, as well as their ultimate obedience to market dogma, is most dangerous, however, for the way it trickles down to writers and theorists less directly or consciously concerned with market forces. It fosters, both directly and by example, a willingness to apply genetics, neuroscience, or systems theory to the economy, and to do so in a decidedly determinist and often sloppy fashion. Then, the pull of the market itself does the rest of the work, tilting the ideas of many of today's best minds toward the agenda of the highest bidder.

So Steven Johnson ends up leaning, perhaps more than he should, on the corporate-friendly evidence that commercial TV and video games are actually healthy. (Think of how many corporations would hire a speaker who argued that everything bad — like marketing and media — is actually bad for you.) Likewise, Malcolm Gladwell finds himself repeatedly using recent discoveries from neuroscience to argue that higher human cognition is more than trumped by reptilian impulse; we may as well be guided by advertising professionals, since we're just acting mindlessly in response to crude stimuli, anyway. Everything becomes about business — and that's more than okay.

This widespread acceptance of the current economic order as a fact of nature ends up compromising the impact of new findings, and changing the public's relationship to the science going on around them. These authors do not chronicle (or celebrate) the full frontal assault that new technologies and scientific discoveries pose to, say, the monopolization of value creation or the centralization of currency. Instead, they sell corporations a new, science-based algorithm for strategic investing on the new landscape. Higher sales reports and lecture fees serve as positive reinforcement for authors to incorporate the market's bias even more enthusiastically the next time out. Write books that business likes, and you do better business. The cycle is self-perpetuating. But just because it pays the mortgage doesn't make it true.

In fact, thanks to their blind acceptance of a particular theory of the market, most of these concepts end up failing to accurately predict the future. Instead of 25 years of prosperity and eco-health, we got the dotcom bust and global warming. Immersion in media is not really good for us. People are capable of responding to a more complex call to action than the over-simplified and emotional rants of right-wing ideologues. The decentralizing effect of new media has been met by an overwhelming concentration of corporate conglomeration.

These theories fail not because the math or science underlying them is false, but rather because it is being inappropriately applied. Yet too many theorists keep buying into them, desperate for some logical flourish through which the premise of scarcity can somehow be made to fit, and business audiences won over. In the process, they ignore the genuinely relevant question: whether the economic model, the game rules set in place half a millennium ago by kings with armies, can continue to hold back the genuine market activity of people enabled by computers.

People are beginning to create and exchange value again, and they are coming to realize the market they have taken for granted is not a condition of nature. This is the threat — and no amount of theoretical recontextualization is going to change that — or successfully prevent it.

Making Markets: From Abundance To Artificial Scarcity

The economy in which we operate is not a natural system, but a set of rules developed in the Late Middle Ages in order to prevent the unchecked rise of a merchant class that was creating and exchanging value with impunity. This was what we might today call a peer-to-peer economy, and it did not depend on central employers or even central currency.

People brought grain in from the fields, had it weighed at a grain store, and left with a receipt — usually stamped into a thin piece of foil. The foil could be torn into smaller pieces and used as currency in town. Each piece represented a specific amount of grain. The money was quite literally earned into existence — and the total amount in circulation reflected the abundance of the crop.

Now the interesting thing about this money is that it lost value over time. The grain store had to be paid, and some of the grain was lost to rats and spoilage. So each year, the grain store would reissue the money for any grain that hadn't actually been claimed. This meant that the money was biased towards transactions — towards circulation, rather than hoarding. People wanted to spend it. And the more money circulates (to a point), the better and more bountiful the economy. Preventative maintenance on machinery, and research and development on new windmills and water wheels, were at a high.
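
To make the bias concrete, here is a minimal sketch in Python of how such a demurrage currency penalizes hoarding; the ten-percent annual reissue fee is a hypothetical figure chosen only for illustration, not a historical rate:

    # Toy model of a demurrage (grain-receipt) currency.
    # The reissue fee is hypothetical; the point is the direction of the bias.

    DEMURRAGE = 0.10  # assume the grain store keeps 10% at each annual reissue

    def hold_for_years(amount, years, demurrage=DEMURRAGE):
        """Value left after simply hoarding receipts through annual reissues."""
        for _ in range(years):
            amount *= (1 - demurrage)
        return amount

    print(hold_for_years(100, 5))  # ~59.0: hoarding loses about 41% in five years

Anyone holding receipts loses value every year, so the rational move is to spend or invest them; circulation, not accumulation, becomes the winning strategy.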

Many towns became so prosperous that they invested in long-term projects, like cathedrals. The "Age of Cathedrals" of this pre-Renaissance period was not funded by the Vatican, but by the bottom-up activity of vibrant local economies. The work week got shorter, people got taller, and life expectancy increased. (Were the Late Middle Ages perfect? No — not by any means. I am not in any way calling for a return to the Middle Ages. But an honest appraisal of the economic mechanisms in place before our own is required if we are ever going to contend with the biases of the system we are currently mistaking for the way it has always and must always be.)

Feudal lords, early kings, and the aristocracy were not participating in this wealth creation. Their families hadn't created value in centuries, and they needed a mechanism through which to maintain their own stature in the face of a rising middle class. The two ideas they came up with are still with us today in essentially the same form, and have become so embedded in commerce that we mistake them for pre-existing laws of economic activity.

The first innovation was to centralize currency. What better way for the already rich to maintain their wealth than to make money scarce? Monarchs forcibly made abundant local currencies illegal, and required people to exchange value through artificially scarce central currencies instead. Not only was centrally issued money easier to tax, but it gave central banks an easy way to extract value through debasement (removing gold content). The bias of scarce currency, however, was towards hoarding. Those with access to the treasury could accrue wealth by lending or investing passively in value creation by others. Prosperity on the periphery quickly diminished as value was drawn toward the center. Within a few decades of the establishment of central currency in France came local poverty, an end to subsistence farming, and the plague. (The economy we now celebrate as the happy result of these Renaissance innovations only took effect after Europe had lost half of its population.)

As it's currently practiced, the issuance of currency — a public utility, really — is still controlled in much the same manner by central banks. They issue the currency in the form of a loan to a bank, which in turn loans it to a business. Each borrower must pay back more than he has acquired, necessitating competition — and more borrowing. An economy with a strictly enforced central currency must expand at the rate of debt; it is no longer ruled principally by the laws of supply and demand, but by the debt structures of its lenders and borrowers. Those who can't grow organically must acquire businesses in order to grow artificially. Even though nearly 80% of mergers and acquisitions fail to create value for either party, the rules of a debt-based economy — and the shareholders it was developed to favor — insist on growth at the expense of long-term value.
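
The arithmetic behind this growth requirement can be sketched in a few lines of Python; the principal and interest rate below are hypothetical figures for illustration only:

    # Toy illustration: when all money enters circulation as interest-bearing
    # loans, total debt always exceeds the money supply, so new borrowing
    # (i.e., growth) is required just to keep repayment possible.

    principal = 1_000_000  # money lent into existence (hypothetical)
    rate = 0.05            # assumed annual interest rate

    money_supply = principal
    debt_owed = principal * (1 + rate)
    shortfall = debt_owed - money_supply

    print(f"Owed: {debt_owed:,.0f}; in circulation: {money_supply:,.0f}")
    print(f"Shortfall to be covered by new loans: {shortfall:,.0f}")

The 50,000 shortfall can only be repaid with money that does not yet exist, so someone else must borrow it into existence; the economy as a whole is forced to expand at least at the rate of interest on its debt.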

The second great innovation was the chartered monopoly, through which kings could grant exclusive control over a sector or region to a favored company in return for an investment in the enterprise. This gave rise to monopoly markets, such as the British East India Company's exclusive right to trade in the American Colonies. Colonists who grew cotton were not permitted to sell it to other people or, worse, to fabricate clothes. These activities would have generated value from the bottom up, in a way that could not have been extracted by a central authority. Instead, colonists were required to sell cotton to the Company at fixed prices; the Company shipped it back to England, where it was fabricated into clothes by another chartered monopoly, and then shipped back to America for sale to the colonists. It was not more efficient; it was simply more extractive.

The resulting economy encouraged — and often forced — people to accept employment from chartered corporations rather than create value for themselves. When natives of the Indies began making rope to sell to the Dutch East India Company, the Company sought and won laws making rope fabrication in the Indies illegal for anyone except the Company itself. Former rope-makers had to close their workshops, and work instead for lower wages as employees of the Company.

We ended up with an economy based in scarcity and competition rather than abundance and collaboration; an economy that requires growth and eschews sustainable business models. It may or may not better reflect the laws of nature — and that is a conversation we really should have — but it is certainly not the result of an entirely natural set of principles in action. It is a system designed by certain people at a certain moment in history, with very specific interests.

Like artists of the Renaissance, who were required to find patrons to support their work, most scientists, mathematicians, theorists, and technologists today must find support from either the public or private sectors to carry on their work. This support is not won by calling attention to the Monopoly board most of us mistake for the real economy. It is won by applying insights to the techniques through which their patrons can better play the game.

This has biased their observations and their conclusions. Like John Nash, who carried out game theory experiments for RAND in the 1950s, these business consultants see competition and self-interest where none exist, and reject all evidence to the contrary. Although Nash later recanted his conclusions, he and his colleagues couldn't believe that their subjects would choose a collaborative course of action when presented with the "prisoner's dilemma," and simply ignored their initial results.
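
For readers unfamiliar with the game, here is a short Python sketch of the prisoner's dilemma; the payoff values are the standard textbook ones, not the figures used in the RAND experiments:

    # The prisoner's dilemma payoff matrix, with standard textbook values.
    # Each entry maps (row player's move, column player's move) to their payoffs.
    PAYOFFS = {
        ("cooperate", "cooperate"): (3, 3),
        ("cooperate", "defect"):    (0, 5),
        ("defect",    "cooperate"): (5, 0),
        ("defect",    "defect"):    (1, 1),
    }

    # Game theory's prediction: each player defects, since defection pays more
    # no matter what the other player does...
    print(PAYOFFS[("defect", "defect")])        # (1, 1)

    # ...yet mutual cooperation leaves both players better off, which is what
    # the RAND subjects kept choosing in repeated play.
    print(PAYOFFS[("cooperate", "cooperate")])  # (3, 3)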

Likewise, the proponents of today's digital libertarianism exploit any evidence they can find of evolutionary principles that reflect the fundamental competitiveness of human beings and other life forms, while ignoring the much more rigorously gathered evidence of cooperation as a primary human social skill. The late archeologist Glynn Isaac, for one, demonstrated how food sharing, labor distribution, social networking and other collaborative activities are what gave our evolutionary forefathers the ability to survive. Harvard biologist Ian Gilby's research on hunting among bats and chimps demonstrates advanced forms of cooperation, collective action, and sharing of meat disproportional to the risks taken to kill it.

Instead, it is more popular to focus on the self-interested battle for survival of the fittest. Whether or not he intends his work to be used this way, Steven Pinker's arguments about decreasing violence among humans over time are employed by others as evidence of the free market's peaceful influence on civilization. Ray Kurzweil relegates the entire human race to a subordinate role in the much more significant evolution of machines — a dehumanizing stance that dovetails all too well with an industrial marketplace in which most human beings are now relegated to the reactive role of consumers.

In Chris Anderson's vision of the coming "Petabyte Age," no human scientists are even required. That's because the structures that emerge from multi-dimensional data sets will be self-organizing and self-apparent. The emergent properties of natural systems and artificial markets are treated interchangeably. Like Adam Smith's "invisible hand," or Austrian economist Friedrich Hayek's notion of "catallaxy," markets are predestined to reach equilibrium by their very nature. Just like any other complex, natural system.

In short, these economic theories are selecting examples from nature to confirm the properties of a wholly designed marketplace: self-interested actors, inevitable equilibrium, a scarcity of resources, competition for survival. In doing so, they confirm — or at the very least, reinforce — the false idea that the laws of an artificially scarce fiscal scheme are a species' inheritance rather than a social construction enforced with gunpowder. At the very least, the language of science confers undeserved authority on these blindly accepted economic assumptions.

The Net Effect

Worst of all, when a potentially destabilizing and decentralizing medium such as the Internet comes along, this half-true and half-hearted style of inquiry follows the story only until a means to arrest its development is discovered and new strategies may be offered.

The open source ethos, through which anyone who understands the code can effectively redesign a program to his own liking, is repackaged by Jeff Howe as "crowdsourcing," through which corporations can once again harness the tremendous potential of real people acting in concert, for free. Viral media is reinvented by Malcolm Gladwell as "social contagion," or by Tim Draper as "viral marketing" — techniques through which mass marketers can once again define human choice as a series of consumer decisions.

The decentralizing bias of new media is thus accepted and interpolated only until the market's intellectual guard can devise a new countermeasure for their patrons to employ on behalf of preserving business as usual.

Meanwhile, the same corporate libertarian think tanks using Richard Dawkins' theories of evolution to falsely justify the chaotic logic of capitalism through their white papers also advise politicians how to exploit the beliefs of fundamentalist Christian creationists in order to garner public support for self-sufficiency as a state of personal grace, and to galvanize suspicion of a welfare state. This is cynical at best.

It doesn't take a genius or a scientist to understand how the rules of the economic game as it is currently played reflect neither human values nor the laws of physics. The market cannot expand infinitely like the redshifts in Hubble's universe. How many other species attempt to store up enough fat during their productive years so that they can simply "retire" on their hoarded resources? How could a metric like the GNP accurately reflect the health of the real economy when toxic spills and disease epidemics alike actually count as short-term booms?
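
The GNP point is simple arithmetic, sketched here in Python with hypothetical figures: because the metric counts all spending as output, the cleanup of a disaster registers as growth:

    # Toy GDP/GNP accounting with hypothetical figures.
    output_before_spill = 100.0
    cleanup_spending = 5.0  # remediation crews, medical bills, legal fees

    # The metric simply adds the cleanup spending to measured output...
    output_after_spill = output_before_spill + cleanup_spending
    print(output_after_spill)  # 105.0: the disaster registers as a 5% boom

No deduction is made for the ruined land, water, or health, so the indicator moves in the opposite direction from the real economy it claims to measure.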

The Internet may be very much like a rhizome, but it is still energized by a currency that is anything but a neutral player. Most Internet business enthusiasts applaud Google's efforts to build open systems the same way their predecessors applauded the World Bank's gift of open markets to developing nations around the world — utterly unaware of (or unwilling to look at) what exactly we are opening our world to.

The net (whether we're talking Web 2.0, Wikipedia, social networks or laptops) offers people the opportunity to build economies based on different rules — commerce that exists outside the economic map we have mistaken for the territory of human interaction.

We can start up and even scale companies with little or no money, making the banks and investment capital on which business once depended obsolete. That's the real reason for the so-called economic crisis: there is less of a market for the debt on which the top-heavy game is based. We can develop local and complementary currencies, barter networks, and other exchange systems independently of a central bank, and carry out secure transactions with our cell phones.
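
As one illustration of what such an alternative might look like, here is a minimal Python sketch of a mutual-credit ledger of the kind used by complementary currencies and barter networks; the names and figures are hypothetical:

    # Minimal mutual-credit ledger: money is created at the moment of each
    # trade, with no central issuer, no scarce stock, and no interest.
    balances = {}  # every member starts at zero

    def trade(buyer, seller, amount):
        """Debit the buyer and credit the seller by the same amount."""
        balances[buyer] = balances.get(buyer, 0) - amount
        balances[seller] = balances.get(seller, 0) + amount

    trade("alice", "bob", 40)   # alice buys 40 units of bob's labor
    trade("bob", "carol", 25)   # bob buys 25 units of carol's produce

    print(balances)                # {'alice': -40, 'bob': 15, 'carol': 25}
    print(sum(balances.values()))  # always 0: credits and debits net out

Because the balances always sum to zero, no one has to borrow the medium of exchange into existence before trading; scarcity of the currency itself simply never arises.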

In doing so, we become capable of imagining a marketplace based in something other than scarcity — a requirement if we're ever going to find a way to employ an abundant energy supply. It's not that we don't have the technological means to source renewable energy; it's that we don't have a market concept capable of contending with abundance. As Buckminster Fuller would remind us: these are not problems of nature, they are problems of design.

If science can take on God, it should not fear the market. Both are, after all, creations of man.

We must stop perpetuating the fiction that existence itself is dictated by the immutable laws of economics. These so-called laws are, in actuality, the economic mechanisms of 13th Century monarchs. Some of us analyzing digital culture and its impact on business must reveal economics as the artificial construction it really is. Although it may be subjected to the scientific method and mathematical scrutiny, it is not a natural science; it is game theory, with a set of underlying assumptions that have little to do with anything resembling genetics, neurology, evolution, or natural systems.

The scientific tradition exposed the unpopular astronomical fact that the earth was not at the center of the universe. This stance challenged the social order, and its proponents were met with less than a welcoming reception. Today, science has a similar opportunity: to expose the fallacies underlying our economic model instead of producing short-term strategies for mitigating the effects of inventions and discoveries that threaten this inherited market hallucination.

The economic model has broken, for good. It's time to stop pretending it describes our world.



