| Home | About Edge| Features | Edge Editions | Press | Reality Club | Third Culture | Digerati | Edge:Feed | Edge Search |
John Barrow, Todd Siler, Peter Tallack, Brian Goodwin, Mary Catherine Bateson, Oliver Morton, Gino Segre, Colin Tudge, Murray Gell-Mann, and Patrick Bateson respond to the World Question Center, 1999.
To avoid incurring the wrath of some scholars, I wanted to add this parenthetical note (see the asterisk below) to my statement about language. I hope it clarifies my point a little, or at least focuses it.
My first candidate is "language"; specifically, our initial realization* of its creative potential, building on the intuitions of the ancient Greeks and Romans. Language is the life-force and body of communication. It comprises all forms of symbolic creations, expressions, and systems which we use to communicate: from the mathematical to the vernacular. Without language, every other invention and innovation may never have existed -- including humor!
My close-second candidate is E = mc². When we learn to tap the full meaning of that piece of symbolic language, we'll create more than a Nuclear Age. "Matter is frozen energy," Einstein said, relating the essence of his insight into the mass-energy relationship. Similarly, language is frozen meaning. When we discover how to unleash the enormous energy in meaning by continually transforming information (data, ideas, knowledge, experience) in new contexts, we'll make a quantum leap in applying the power of language to achieve our boldest dreams.
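To give a sense of scale for "frozen energy" (a standard back-of-the-envelope figure, added here for illustration and not part of the essay itself), even a single kilogram of matter corresponds to an enormous energy under Einstein's relation:

```latex
E = mc^2 = (1\,\mathrm{kg})\,(3.0\times 10^{8}\,\mathrm{m/s})^2 \approx 9\times 10^{16}\,\mathrm{J}
```

That is roughly the energy released by some twenty megatons of TNT.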
* Note: Some people may choose to date our first deep realization of language's potential around the late 1700s. That's when the first scientific study of the nature and origins of language began to unfold through the systematic, comparative studies of the German scholars Friedrich Schlegel, Jakob Grimm, and Franz Bopp. Others may focus on the work of Ferdinand de Saussure, whose general, descriptive method led to some basic laws that relate to all languages (about 3,000 or more now). My broad statement is meant to embrace the "makeup" of language: its symbolic nature, structures, semantics, and boundless usages. I'm not simply referring to the inventive act of classifying spoken and written languages into families, or categorizing the growth patterns of language, or charting the evolution of grammar.
From: Peter Tallack
The horse collar as the most important high-tech invention.
Developed around 1000 AD in northern Europe, it allowed the region to be farmed efficiently and so, it could be argued, was responsible for the rise of civilization there. It also gave its possessors great war-making potential: think of knights in armour, for example.
The most important invention in the past two thousand years is the printing press. When William Caxton published 'The Canterbury Tales' in the 15th century on his newly established printing press, he dramatically accelerated the separation of human culture from nature, eclipsing the direct experience of natural processes that continues in the oral tradition and replacing it with words on a page. This cut in two directions. (1) The power of nature diminished, so that science and technology could start the systematic program of gaining knowledge for control of nature, liberating people from drudgery and freeing the imagination. (2) At the same time, nature was degraded to a set of mechanisms that humans could manipulate for their own purposes, and the 'rape of nature' began in earnest. We are now reaping twin harvests: vastly expanded potential for written communication through the internet, as in this exchange of views at the Edge web site; and a vastly degraded planet that won't support us much longer, as things are going. Can we use one to save us from the other? We can now connect with each other as never before; but what about nature?
From: Mary Catherine Bateson
You asked me to comment on Gregory Bateson's "economic man" quote: "Of all our inventions, economic man is by far the dullest."
Gregory's candidate for dullest invention was "economic man." We can be grateful that in this case no one has cleaned up his gender-biased language, because the concept is not and never was a gender-neutral one. The dangerous idea that lies behind "economic man" is the idea that anyone can be entirely rational or entirely self-interested. One of the corollaries, generally unspoken in economics texts, was that such clarity could not be expected of women, who were liable to be distracted by such things as emotions or concern for others. Economic man belongs with a set of older ideas separating mind from body and emotion from thought, a whole family of bad inventions. Gregory argued, with Pascal, that the heart has its reasons which reason does not know, and that decisions are less likely to be destructive if made by whole persons.
From: Oliver Morton
I've been engrossed in a lot of other stuff and only just got to reading the inventions list. The following is probably too late but I thought I had to add it anyway.
I'm simply amazed by the lack of interest in genetic engineering. As far as I can see (and if I'm dramatically wrong, just disregard this), there are two votes for genetic sequencing, and of these Krauss's is a sort of afterthought. There is no nomination of genetic manipulation per se, though Shapiro talks about it in the context of sequencing. Searle and Blakemore also bring it up, in one case as a specific instrumentality, in the other as a parting shot.
Yet, unlike many of the nominees, directed genetic manipulation is an invention in the truest sense. It's a body of tools and techniques conceived for an express purpose; you can point to the people who invented it (their names are on the patents). In its basics it is likely to be enduring: while I find it hard and unsatisfactory to imagine daily oral contraceptives lasting a century, it seems quite likely that the basic systems used to replicate and edit DNA in laboratories will stay reasonably stable, while their implementation will doubtless increase enormously in efficiency. And it is likely to have an immense impact on timescales from centuries to millennia. While I can imagine futures in which no biological entities or environments are pervasively engineered, and no people either, they don't seem terribly likely, nor necessarily terribly desirable.
Obviously there are lots of ways genetic modification could do bad things to us, but that is surely one of the basic criteria for being important, and not a reason for shunning a technology or downplaying its significance. If we want to implement our discoveries about the biological world, directed genetic manipulation is one of the fundamental tools with which we will do it. (And for those of a more conceptual bent, it will undoubtedly reshape all our discourse about nature and our place in it, as it is already doing.)
I suppose if people apply a discount rate to future impact when assessing the importance of an invention, then the printing press's five centuries of influence win out over genetic modification's couple of decades. But this seems wrong on two counts: first, it privileges this moment above others; second, if the future is discounted, then all this stuff about computers (and, indeed, oral contraceptives) has to go on to the back burner. Both have had great impacts (though the pill is in many places used by only a minority of women, often a very small one, and may in some ways be less important than safe and legal abortion); but especially in the case of the computer (its networks, its cryptographic potential, and so on), it appears that the invention's true importance lies ahead. The same is just as true of the ability to redesign living beings.
I'm not saying that directed genetic manipulation is more important than democracy or the notion of equality or many of the other great things on the list. But it is more obviously what most people mean by an invention, and I really was surprised not to see it all over the place.
As ever, Oliver
From: Gino Segre
My choice for the greatest invention of the past 2,000 years is the lens. First of all, without lenses, you might not even be able to read this piece and, even worse, you might never have been able to read at all if your vision had not been corrected. I remember Teddy Roosevelt's description of getting his first pair of glasses and suddenly having the world come into focus. Seeing clearly is of course no small matter, but it seems limited to pick eyeglasses as the greatest invention of the last 2,000 years, so my vote is for lenses big and small, alone and combined. The lenses we use to read the Universe or the intricacies of life are variations of those we use to absorb the written word.
I am going to start, however, with plain old spectacles. We don't really know when they first started being used. They were not uncommon in fourteenth-century Italy, and by 1600 there were specialized artisans who carefully ground lenses, keeping their tricks secret. One of them, a Dutch spectacle maker named Lippershey, noticed that a combination of two lenses made distant objects bigger. He tried to use this to get rich. He didn't succeed, but several of his two-lens devices were made. By 1609 one of the devices reached a transplanted Florentine named Galileo Galilei, who was teaching at the University of Padova. He pointed his device, or telescope as it was later called, at the night sky and looked out. He took his telescope apart, rebuilt it, improved it and looked some more. What he saw changed our view of the world. The Sun rotated around its axis, Venus revolved around the Sun, the Moon had mountains and valleys, Jupiter had four moons and the Milky Way was made up of vast numbers of stars. It was crystal clear that the old Ptolemaic vision of the Universe was wrong. Copernicus and Kepler were right, the Earth was not the center of the Universe and there was no going back. We were launched on our exploration of outer space.
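For readers wondering how a "combination of two lenses made distant objects bigger": in the simple refracting telescope that grew out of Lippershey's device, the angular magnification is set by the ratio of the two focal lengths (a textbook relation, added here for illustration rather than taken from the essay):

```latex
M = \frac{f_{\text{objective}}}{f_{\text{eyepiece}}}
```

A long-focus objective paired with a short-focus eyepiece, say a 1 m objective and a 5 cm eyepiece, gives M = 20, about the power of Galileo's better instruments.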
It is a short journey from the telescope to the microscope. Not surprisingly, they were discovered at around the same time. After all, they are both just the simple piecing together of the right two lenses in the correct positions. Galileo used the telescope brilliantly, but he also peered through a microscope of sorts. He saw flies the size of sheep and spots of dirt that looked like rocks, but he did not know what to make of it. In 1665 Robert Hooke published a best-seller called Micrographia. The book had a series of beautiful plates in it, Hooke's renderings of what he had seen with his microscope. There was a fly's eye, mold on the leaf of a rose, a picture of a louse and so on. All very pretty, but it did not lead to anything. The microscope was a tool in search of a problem. The problem eventually did develop, and it was nothing less than understanding the origins of life and of disease. This first came into focus, no pun intended, when Anton van Leeuwenhoek in 1678 made a lens good enough to reach a magnifying power close to five hundred. At that point a whole rich substructure was revealed. A drop of pond water turned out to be filled with little "animalcules" swimming in it. Van Leeuwenhoek had discovered bacteria. It took another two hundred years to really understand what he had seen, but then it also took three hundred years to understand that the Milky Way was just one of many galaxies.
I have been saying the lens is the greatest invention of the past 2,000 years but an excellent lens had already been perfected over the course of millions of years by creatures so primitive they didn't even know how to make a fire. Despite this comparative ignorance, their lenses are as good as anything we can dream of making in the lab today. Of course I am talking about our own ancestors and the lens I am describing is our eye's lens. It was developed by that diabolically clever builder we call evolution. There are many places and ways to learn just what a good job evolution did, but my favorite is offered by Richard Feynman in a physics course he taught at Caltech. Given who Feynman was, none of the course is ordinary and some of it is extraordinary, the work of a true genius. He describes how light rays enter our eye and are immediately bent and focused toward the retina by a surface lens we call the cornea. After the first focusing the rays travel through a chamber filled with fluid and then meet the second focuser, known simply as the lens. This lens is exquisite, a thing of beauty. It is built up like an onion with transparent layers, slightly flatter toward the edges and with slowly varying bending power of light, all designed for optimal focusing. The curvature of the lens can be adjusted by muscles on the side and with a little luck the lens forms the perfect image on the best of all screens, the retina.
The retina is wired to the visual cortex in the brain and, voila, we see the picture. I have been implying that the brain and the retina are two separate things, but it may make more sense to talk of the retina as a piece of the brain because it does a lot of the information processing before sending on its results through the optic nerve to the cortex.
My answer for the greatest invention of the last 2,000 years is still the lens, but the greatest invention of all time is the brain, which, incidentally, has managed to figure out how to use the lens it is already hooked up to and the lens it has learned how to build in its never-ending attempt to understand the Universe.
I would very much like to add The Plough (or the digging stick; the principle is the same) to your list of "Inventions".
My thesis is that farming really began at least 40,000 years ago (not 10,000 years ago in the "Neolithic Revolution", as is generally supposed), but that for tens of thousands of years people 'merely' managed the environment in various ways, while at the same time getting a large proportion of their food by hunting and gathering. This management I have called "protofarming".
In fact you can do all of horticulture (which in effect means cultivation of individual plants, though the etymology means gardens) and pastoralism in protofarming mode. The economic switch came when people came to rely on what they cultivated.
Ploughing (or soil-breaking in general) leads in short to arable farming, which primarily means the mass growing of cereals. The Old Testament shows how people hated this (arable farmers have an extremely bad press in the OT) and indeed regarded the breaking of soil as blasphemous. Cain was the arable farmer, perceived as the murderer of the pastoral Abel, whose gift of corn was rejected by God. The "Neolithic Revolution" is not about the origin of farming; but it does reflect the birth of arable farming (i.e., agriculture in the strict etymological sense).
Ploughing has given us a world population of 6 billion, and transformed the world's landscape.
The plough is the most significant human invention of all.
(Together with the spear, of course, which enables human beings to kill at a distance!)
From: Murray Gell-Mann
I thought about your question and came up with an answer right away, but I am not sure if my answer is suitable. For one thing, I don't know if it really refers to an invention of the last two thousand years. Most likely there were many people who thought about it before the year 2 BCE, and we may well have documentary evidence of that, although there might easily have been discoverers who were afraid to discuss it publicly.
In any case, the most important invention I can think of is disbelief in the supernatural, the realization that we are part of a universe governed entirely by law and chance. (Of course, the fundamental role of chance was not fully appreciated before the discovery of quantum mechanics.)
The deism to which some of our U.S. founding fathers subscribed was not altogether different, in that it involved a supernatural being that set the orderly universe in motion and then left it alone. In its pure form, though, what I am discussing is the complete elimination of the supernatural from our world picture.
From: Patrick Bateson
As I sit at my computer writing this whimsy, I realise how much of my life is spent peering at its pale screen. So much of my working life has been transformed by the user-friendly software that is now available. As an inveterate reviser, when I write by hand, I start to change my prose almost immediately after I have written something. Large chunks are crossed out, word orders are changed, sentences rearranged, paragraphs moved about. Before long the manuscript looks like a bird's nest. Producing a tidy typewritten copy is not at all easy after so many afterthoughts. The editing facilities of modern word-processing packages are so straightforward that manuscript bird's nests are a part of my past. The new technologies have been truly liberating. So my first thought was that the invention of friendly word processors was my candidate for this symposium. But wait a minute.
A good principle used by historians of technology is to ask what had to be known in order for a particular development to have occurred. It is doubtful, for example, if desktop computers of the power and flexibility we now have would have been possible without the invention of the silicon chip. This approach to emerging technologies produces a fan of necessary developments or, more aptly, a root system branching outwards as the historian moves backwards in time. Some of these roots are undoubtedly more important than others, some certainly more enabling. Consider the computer on my desk again. It is inconceivable that such a machine would have been possible without electricity.
To be sure, Charles Babbage developed plans in the 1830s for what he called an analytical engine. His idea was that the machine would perform any arithmetical operation on the basis of instructions from punched cards, with a memory unit in which to store numbers, sequential control, and most of the other basic elements of the present-day computer. No engine was built to Babbage's specifications for another 150 years. Its mechanical components meant that it was bulky; the modern outgrowth of a Babbage machine would exclude both my desk and me from the room in which it sat, and it would do a fraction of what my liberating machine does. So, my candidate for the greatest invention of the last two thousand years is the harnessing of electricity.
The first device that could store large amounts of electric charge was the Leyden jar, invented in 1745 by Pieter van Musschenbroek, a Leyden physicist. The jar was partially filled with water and contained a thick wire capable of storing a substantial amount of charge. One end of this wire protruded through the cork sealing the jar and was connected to a device generating friction and static electricity. Soon after the invention, "electricians" were earning their living all over Europe killing animals with electric shock and devising other spectacles. In one demonstration in France, a wire made of iron connected a row of Carthusian monks; when a Leyden jar was discharged, the white-robed monks leapt simultaneously into the air. The frivolities led to thought. Thanks to Ben Franklin in the United States and Joseph Priestley in England, experiments and theorising proceeded apace and, by the mid-19th century, the study of electricity had become a precise, quantitative science which paved the way for the technologies we now all take for granted.
We need electricity for keeping us cool in summer and warm in winter - though our ancestors would have been flabbergasted by the profligate way in which we do so. We use electricity for cooking much of our food and for freezing what we intend to eat later. We depend on it for transport, for communication, for entertainment, for running lives that bear no relation to the rising and setting of the sun. Of the major human appetites, only sex, it seems, is likely to be served by a power cut.