THEORY OF GAMES AND ECONOMIC MISBEHAVIOR [7.26.09]
An Edge Original Essay
"I refuse to accept however, the stupidity of the Stock Exchange boys, as an explanation of the trend of stocks," wrote John von Neumann to Stanislaw Ulam, on December 9, 1939. "Those boys are stupid alright, but there must be an explanation of what happens, which makes no use of this fact." 1 This question led von Neumann (in collaboration with Oskar Morgenstern) to his monumental Theory of Games and Economic Behavior, demonstrating how a reliable economy can be constructed out of unreliable parts.
After threats of cancellation by Princeton University Press over the "mammoth" manuscript's escalating length, the book was finally published in September of 1944. "We accepted the manuscript without subsidy or any other stipulation in the fall of 1941, at which time the manuscript was believed to be virtually complete," complained Datus C. Smith in 1943. "At that time it had a manufacturing cost of $1,275... however, in the course of finishing [the last] chapter they uncovered a rich vein of new material, and the manuscript was not actually completed for something like eighteen months... and almost quadrupled in cost." 2
Theory of Games and Economic Behavior placed the foundations of economics, evolution, and intelligence on common mathematical ground. "Unifications of fields which were formerly divided and far apart," counseled von Neumann and Morgenstern in their introduction, "are rare and happen only after each field has been thoroughly explored." 3 The von Neumann and Morgenstern approach (extended by von Neumann's Probabilistic Logics and the Synthesis of Reliable Organisms From Unreliable Components) assumes that human unreliability and irrationality will, in the aggregate, be filtered out. In the real world, however, irrational behavior (including the "stupidity of the stock exchange boys") is not completely filtered out. A new generation of behavioral economists — and new modes of economic misbehavior — are reminding us of that.
Von Neumann and Morgenstern developed their New Testament from first principles and largely ignored prior art. Among the Old Testament prophets whose work preceded them were André-Marie Ampère (1775-1836) and Georges-Louis Leclerc comte de Buffon (1707-1788). Buffon was a celebrated naturalist whose evolutionary theories preceded both Charles and Erasmus Darwin, advancing ideas that were risky at the time. "Buffon managed, albeit in a somewhat scattered fashion," wrote Loren Eiseley, "at least to mention every significant ingredient which was to be incorporated into Darwin's great synthesis of 1859." 4 Buffon's son and Ampère's father both died under the guillotine in postrevolutionary France.
Buffon's Essai d'arithmétique morale, published in 1777, launched the fields of behavioral economics and evolutionary game theory by posing a simple question: why will someone purchase a lottery ticket when they are more likely to die in the next 24 hours than win? Buffon also introduced what, thanks to Stanislaw Ulam, we call the Monte Carlo method. First improvised as a way to numerically evaluate a physical system by invoking a statistical model, Monte Carlo approximations — given ever more powerful computers — are steadily becoming more faithful to the underlying nature of reality than the analytical methods they replaced.
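Buffon's needle problem, posed in the same Essai, remains the textbook illustration of the method: drop a needle of length l onto a floor ruled with parallel lines a distance d apart (with l ≤ d), and the needle crosses a line with probability 2l/πd, so π can be recovered from the observed crossing frequency. A minimal sketch in Python (the variable names and defaults are illustrative, not Buffon's):

```python
import math
import random

def estimate_pi(trials: int, needle: float = 1.0, spacing: float = 1.0,
                seed: int = 1777) -> float:
    """Estimate pi by simulating Buffon's needle drops.

    A needle of length `needle` <= `spacing` crosses a ruled line with
    probability 2*needle / (pi*spacing), so pi can be recovered from
    the observed crossing frequency.
    """
    rng = random.Random(seed)
    crossings = 0
    for _ in range(trials):
        # Distance from the needle's center to the nearest line, and
        # the needle's angle relative to the lines (by symmetry, only
        # half a line-spacing and a quarter turn need be sampled).
        center = rng.uniform(0, spacing / 2)
        angle = rng.uniform(0, math.pi / 2)
        if center <= (needle / 2) * math.sin(angle):
            crossings += 1
    return 2 * needle * trials / (spacing * crossings)

print(estimate_pi(1_000_000))  # converges toward 3.14159...
```

As von Neumann and Ulam found when they revived the idea at Los Alamos, the error of such an estimate shrinks only as 1/√N, which is exactly why ever more powerful computers make the approximations ever more faithful.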
André-Marie Ampère (who coined the term Cybernétique in reference to control theory) published his Considérations sur la théorie mathématique du jeu (On the mathematical theory of games) at the age of twenty-seven in 1802. Ampère began his study by crediting Buffon ("an author in whom even errors bear the imprint of genius") as the forefather of mathematical game theory, citing his Essai d'Arithmétique Morale. Ampère favored probability over strategy and saw games of chance as "certain ruin" to those who played indefinitely or indiscriminately against multiple opponents, "who must then be considered as a single opponent whose fortune is infinite." 5 He observed that a zero-sum game (where one player's loss equals the other players' gain) will always favor the wealthier player, who has the advantage of being able to remain longer in the game.
Von Neumann's initial contribution to the theory of games, extending the work of Émile Borel, was published in 1928. Where Ampère saw chance as holding the upper hand, von Neumann showed how losses at the hand of fate could be held to a minimum through his "minimax" theorem on the existence of good strategies, proving for a wide class of games that a determinable strategy exists that minimizes the expected loss to a player when the opponent tries to maximize that loss by playing as well as possible. Much of von Neumann and Morgenstern's 625-page treatise is devoted to showing how seemingly intractable situations can be rendered solvable through the assumption of coalitions among the players, and how non-zero-sum games can be reduced to zero-sum games by including a fictitious, impartial player (sometimes called Nature) in the game.
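For games that happen to have a saddle point in pure strategies, the minimax idea can be shown in a few lines: the best of the worst-case rows (the maximin) equals the worst of the best-case columns (the minimax), and that common value is the value of the game. A minimal sketch, using an invented payoff matrix:

```python
def maximin(payoff):
    """Row player's guaranteed floor: best row under worst-case column."""
    return max(min(row) for row in payoff)

def minimax(payoff):
    """Column player's guaranteed ceiling: best column under worst-case row."""
    return min(max(row[j] for row in payoff) for j in range(len(payoff[0])))

# Payoffs to the row player in a zero-sum game (invented example).
# The entry at row 0, column 1 is a saddle point: 2 is the smallest
# value in its row and the largest in its column.
game = [
    [4, 2, 3],
    [1, 0, 2],
    [5, 2, 6],
]

print(maximin(game), minimax(game))  # both equal 2: the value of the game
```

Von Neumann's 1928 theorem goes further: once mixed (randomized) strategies are allowed, the two quantities coincide for every finite zero-sum game, whether or not a pure saddle point exists.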
Game theory was applied to fields ranging from nuclear deterrence to evolutionary biology. "The initial reaction of the economists to this work was one of great reserve, but the military scientists were quick to sense its possibilities in their field," wrote J. D. Williams in The Compleat Strategyst, a RAND Corporation best-seller that made game theory accessible through examples drawn from everyday life. 6 The economists gradually followed. When John Nash was awarded a Nobel Prize for the Nash equilibrium in 1994, he became the seventh Nobel laureate in economics whose work had been derived from von Neumann's results. Von Neumann "darted briefly into our domain," commented mathematical economist Paul Samuelson, looking back after fifty years, "and it has never been the same since." 7
In 1945 the Review of Economic Studies published von Neumann's "Model of General Economic Equilibrium," a nine-page paper read to a Princeton mathematics seminar in 1932 and first published (in German) in 1937. Von Neumann elucidated the behavior of an economy where "goods are produced not only from 'natural factors of production,' but... from each other." In this autocatalytic economy, equilibrium and expansion coexist at the saddle-point between convex sets. "The connection with topology may be very surprising at first," von Neumann noted, "but the author thinks that it is natural in problems of this kind." 8
Some of the assumptions of von Neumann's "expanding economic model" — that "natural factors of production, including labour, can be expanded in unlimited quantities" and that "all income in excess of necessities of life will be reinvested" — appeared unrealistic at the time, less so now that Moore's Law is driving economic growth. Other assumptions, such as an invariant financial clock cycle, are conservative under the conditions now in play. The problem, as recent events have demonstrated, is that the surface of a bubble is also a convex set.
Von Neumann's collaboration with Oskar Morgenstern was followed by a similar collaboration with Stanislaw Ulam, aimed at developing a unified theory of self-reproduction among living and non-living (or yet-to-be-living) things. Von Neumann's death at age 54 (and the choice of a collaborator not as disciplined as Oskar Morgenstern) left a fragmentary, incomplete manuscript that was compiled posthumously and published as Theory of Self-Reproducing Automata, edited by Arthur W. Burks. 9
Theory of Games and Economic Behavior and Theory of Self-Reproducing Automata were launched on a collision course. The five kilobytes of random-access electrostatic memory that spawned von Neumann's original digital universe at a cost of roughly one hundred thousand 1947 dollars, costs one-hundredth of one cent today — and cycles a thousand times as fast. An increasingly permeable barrier separates living from non-living codes. Over billions of years, biological species learned to survive in a noisy, analog environment by passing themselves, once a generation, through a digital, error-correcting phase, the same way repeater stations are used to convey intelligible messages over submarine cables where noise is being introduced. A world that was digital once a generation is now all-digital, all the time.
"An organism (any reason to be afraid of this term yet?) is a universal automaton which produces other automata like it in space which is inert or only 'randomly activated' around it," explained Ulam, reporting on a conversation with von Neumann that took place on a bench in Central Park in early November 1952. The two mathematicians had probably retreated to that park bench to discuss the top secret Ivy Mike hydrogen bomb test conducted on November 1 at Eniwetok Atoll in the Pacific, yielding 10.4 megatons — almost one thousand Hiroshimas — and a fireball three miles across. The explosion, based on an insight of Ulam's that had been fleshed out with calculations organized by von Neumann, removed the entire island of Elugelab from the map. That part of the conversation went unrecorded, but when the subject turned to creation and evolution in a digital universe, Ulam preserved some sketchy notes.
"This 'universality' is probably necessary to organize or resist organization by other automata?" Ulam asked. He (and von Neumann) realized that any real evolution in a digital universe "would have to involve an enormous amount of probabilistic superstructure to the outlined theory. I think it should probably be omitted unless it involves the crux of the generation and evolution problem — which it might?" The unbounded (and increasingly probabilistic) digital universe that Ulam and von Neumann imagined as a mathematical abstraction now exists, growing by billions of transistors per second and with more secondary storage than anyone can count. We are surrounded by codes (some Turing-universal) that make copies of themselves, and by physical machines that spawn virtual machines that in turn spawn demand for more physical machines. Some digital sequences code for spreadsheets, some code for music, some code for operating systems, some code for sprawling, metazoan search engines, some code for proteins, some code for the gears used in numerically-controlled gear-cutting machines, and, increasingly, some code for DNA belonging to individuals who serve as custodians and creators of more code. "It is easier to write a new code than to understand an old one," von Neumann warned.
Evolution in the Ulam-von Neumann universe now drives evolution (and economics) in our universe, rather than the other way around. The current misbehavior of our economy, however much it reflects misbehavior by human individuals and institutions, is more a reflection of the behavior of self-reproducing machines and self-replicating codes. We measure our economy in money, not in things. In the age of self-reproducing automata, we can suffer a declining economy, and pandemic unemployment, while still producing as much stuff as people are able to consume. We are facing the first economic downturn to include free cell phones, more automobiles than we have room for (in many locations you can now rent a car for less than it costs to park one) and computers that cost less than a month's health insurance yet run at billions of cycles per second for years.
Why the growing imbalance between the cost of people and the cost of machines? What prices are going up the fastest? Health care — the cost of maintaining human beings. What prices are going down the fastest? The cost of information and machines. What, really, is health-care reform? Human beings are being cared for by a dysfunctional, antiquated system, and we hope that this can be reformed by adopting efficiencies from the domain of machines. Where will this lead? Are we using computers to sequence, store, and more faithfully replicate our own genetic code, or are computers optimizing our genetic code (and health) so that we can do a better job of replicating them?
Replication of information is generally a public good (however strongly pockets of resistance may disagree). When financial instruments become self-replicating, trouble often ensues. The derivatives now haunting us were produced, not from natural factors of production or other goods, but from other financial instruments. There are numerous precedents for this.
As early as the twelfth century it was realized that money, like information but unlike material objects, can be made to exist in more than one place at a single time. Early embodiments of this principle, preceding the Bank of England by more than five hundred years, were Exchequer tallies — notched wooden sticks issued as receipts for money deposited with the Exchequer for the use of the king. "As a financial instrument and evidence it was at once adaptable, light in weight and small in size, easy to understand and practically incapable of fraud," wrote Hilary Jenkinson in 1911. "By the middle of the twelfth century, there was a well-organized and well-understood system of tally cutting at the Exchequer... and the conventions remained unaltered and in continuous use from that time down to the nineteenth century." 10
A precise description was given by Alfred Smee, resident surgeon to the Bank of England and the son of the accountant general (as well as the inventor of electroplating, electrical facsimile transmission, an artificial muscle, and other prescient ideas). "Curiously enough, I have ascertained that no gentleman in the Bank of England recollects the mode of reading them," Smee reported in 1850. "The tally-sticks were made of hazel, willow, or alder wood, differing in length according to the sum required to be expressed upon them. They were roughly squared, and one end was pointed; and on two sides of that extremity, the proper notches, showing the sum for which the tally was a receipt, were cut across the wood." 11
On the other two sides of the tally were written, in ink and in duplicate, the name of the party paying the money, the account for which it was paid, and the date of payment. The tally was then split in two, with each half retaining the notched information as well as one copy of the inscription. "One piece was then given to the party who had paid the money, for which it was a sufficient discharge," Smee continues, "and the other was preserved in the Exchequer. Rude and simple as was this very ancient method of keeping accounts, it appears to have been completely effectual in preventing both fraud and forgery for a space of seven hundred years. No two sticks could be found so exactly similar, as to admit of being identically matched with each other, when split in the coarse manner of cutting tallies; and certainly no alteration of the particulars expressed by the notches and inscription could remain undiscovered when the two parts were again brought together. And, as if it had been further to prove the superiority of these instruments over writing, two attempts at forgery were reported to have been made on the Exchequer, soon after the disuse of the ancient wooden tallies in 1834." 12
Exchequer tallies were ordered replaced in 1782 by an "indented cheque receipt," but the Act of Parliament (23 Geo. 3, c. 82) thereby abolishing "several useless, expensive and unnecessary offices" was to take effect only on the death of the incumbent who, being "vigorous," continued to cut tallies until 1826. "After the further statute of 4 and 5 William IV the destruction of the official collection of old tallies was ordered," noted Hilary Jenkinson. "The imprudent zeal with which this order was carried out caused the fire which destroyed the Houses of Parliament in 1834." 13
The notches were of various sizes and shapes corresponding to the tallied amount: a 1.5-inch notch for £1000, a 1-inch notch for £100, a half-inch notch for £20, with smaller notches indicating pounds, shillings, and pence, down to a halfpenny, indicated by a pierced dot. The code was similar to bar-coding, or the notches still used to identify the emulsion speed of photographic film in the dark. And the self-authentication achieved by distributing the information across two halves of a unique piece of wood is analogous to the way large numbers, split into two prime factors, are used to authenticate digital financial instruments today. Money was being duplicated: the King gathered real gold and silver into the treasury through the Exchequer, yet the tally given in return allowed the holder to enter into trade, manufacturing, or other ventures, producing real wealth with nothing more than a wooden stick.
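The notch code amounts to a positional decomposition of the sum into denominations, which can be sketched as a greedy loop. The denominations below (£1000, £100, £20, £1) follow Smee's description; the function name is mine, and shillings, pence, and the halfpenny dot are omitted for brevity:

```python
def tally_notches(pounds: int) -> dict[int, int]:
    """Decompose a whole-pound amount into tally notch counts.

    Denominations follow Smee's description of Exchequer tallies
    (1000, 100, 20, and 1 pounds); shillings, pence, and the
    halfpenny dot are omitted for brevity.
    """
    denominations = [1000, 100, 20, 1]
    notches = {}
    for d in denominations:
        notches[d], pounds = divmod(pounds, d)
    return notches

print(tally_notches(2342))  # {1000: 2, 100: 3, 20: 2, 1: 2}
```

The security, though, came not from the code but from the physical split: like a modern key pair, each half of the stick was useless alone and matched only its unique counterpart.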
Until the Restoration tallies did not bear interest, but in 1660, on the accession of Charles II, interest-bearing tallies were introduced. They were accompanied by written orders of loan which, being made assignable by endorsement, became the first negotiable interest-bearing securities in the English-speaking world. Under pressure of spiraling government expenditures the order of loan was soon joined by an instrument called an order of the Exchequer, drawn not against actual holdings but against future revenue and sold at a discount to the private goldsmith bankers whose hard currency was needed to prop things up. In January 1672, unable to meet its obligations, Charles II declared a stop on the Exchequer. At the expense of the private bankers, this first experiment with derivative financial instruments came to an end.
Today's Exchequer, distributed across the global banking network, splits digital tallies by the millions in milliseconds: above human scale in magnitude and beyond human scale in time. High-speed trading programs not only have access to unlimited funding; by dividing time into ever-smaller increments they also, effectively, have access to unlimited time, and, in the words of Ampère, "must then be considered as a single opponent whose fortune is infinite." Can this be stopped?
Financial systems exhibit the Gödelian incompleteness characteristic of all sufficiently powerful formal systems: within the given system it is possible to construct statements (or financial instruments) whose value appears to be sound, but cannot be proved within the system itself. No financial system can ever be completely secure and closed. There is no limit to the level of concepts (including fraudulent ones) that an economy is able to comprehend. The system depends on trust.
"A Banke is a certain number of sufficient men of Credit and Estates joyned together in a stock, as it were for keeping several mens Cash in one Treasury, and letting out imaginary money at interest... and making payment thereof by Assignation, passing each mans Accompte from one to another, yet paying little money," wrote Francis Cradocke in 1660, in An Expedient For taking away all Impositions, and raising a Revenue without Taxes, By Erecting Bankes for the Encouragement of Trade. 14 Establishing a bank requires secure information storage to keep accounts, a license from the government (or an entity beyond government), a small amount of capital, and a large amount of trust.
"A Banker," explained Sir William Petty, co-founder of the Royal Society and author of Political Arithmetick, in 1682, "is honest only upon the Penalty of losing a beneficial Trade, founded upon a good Opinion of the World, which is called Credit." Credit, by definition, cannot easily be restored; its nature is to shift somewhere else. We should be less concerned with loss of money and more concerned with loss of trust. If we have to start over with more trust and less money, is this really so bad? "Is not a Country the Poorer for having less Money?" asked William Petty. "Not always," he answered, "For as the most thriving Men keep little or no Money by them, but turn and wind it into various Commodities to their great Profit, so may the whole Nation also." 15
Charles II had the right idea. He trusted (and endowed) the small group of oddballs who were forming the Royal Society, and put a stop on the Exchequer. If he had rescued the bankers, and ignored William Petty's band of Natural Philosophers, where would we be now?