George Dyson, Cliff Pickover, Joseph Traub, Jaron Lanier, Stewart Brand, William H. Calvin, Marvin Minsky, Charles Simonyi, Stewart Brand, Kevin Kelly, Lee Smolin, Philip W. Anderson, Marc D. Hauser, Jordan B. Pollack, Nicholas Humphrey, Steve Grand, W. Daniel Hillis, Marvin Minsky, John McCarthy, Jaron Lanier, John Baez, Terry Sejnowski. Response to the responders, from Freeman Dyson
Thanks to all nineteen of you who responded, some of you twice, with thoughtful objections and disagreements. As I said in my opening statement, it is much more fun to be contradicted than to be ignored. I learned a lot from your contradictions.
The following two statements are my attempt to condense a rich assortment of opinions into two sentences. (1) Any real computer operating in the real world is partly digital and partly analog, and any living organism is an even more inextricable mixture of digital and analog components. (2) The concepts of digital and analog were invented to describe idealized models of human-designed machines, and are far too narrow to encompass the subtleties of living creatures. In other words, when I asked whether life is analog or digital, (1) the answer is "both", and (2) I asked the wrong question. There seems to be a consensus among the respondents for both these statements. Beyond that, each of you had interesting things to say about details. I comment now on only three of your comments. Sorry, it would take too long to give equal time to all of you.
Lee Smolin gave the longest and most substantial response. He describes a third possible form of information processing which is neither analog (because it is based on discrete rather than continuous components) nor digital (because it cannot be simulated by a digital computer algorithm). His information storage is based on the topological structure of finite graphs in three-dimensional space. This illustrates the general statement that the categories of analog and digital are too narrow to cover the range of possible machines and organisms. It is possible that Smolin's topological information processing may actually exist, both in living cells and in the fine-structure of space-time.
Steve Grand reformulates my question in an interesting way. He asks whether the signals carrying information are similar or dissimilar to the objects that they represent. If the signals are similar we may call them analog, and if they are dissimilar we may call them digital. When the question is put in this way, it becomes clear that living creatures are usually neither analog nor digital, because life does not usually represent anything. Life is not a symbol for something else. Life just is. On the other hand, brains do represent external objects, and the question whether a brain uses analog or digital symbols is meaningful.
Two respondents, Joseph Traub and John Baez, criticize my mention of the Pour-El-Richards theorem on technical grounds. The theorem says that an analog computer is more powerful than any digital computer for performing certain abstractly defined tasks. Traub and Baez correctly point out that the theorem only applies to an ideal mathematical universe and not to the physical universe that we inhabit. I never said that the theorem applies directly to the real world. I only said that it makes the superiority of analog devices in a cold universe less surprising. I challenge any mathematicians among you to find out whether some version of the theorem might hold under conditions closer to the real world. John Baez has already answered this question in the negative for one particular set of assumptions.
In conclusion, I thank you all for raising a lot of new questions which I hope we will continue to think about. Science thrives on mysteries, and the nature of life remains as mysterious as ever. I hope and believe we will never run out of good questions.
Freeman Dyson raises several intriguing questions about life and computation. These questions are closely related to two different styles of error correction, which is needed to preserve information and prevent catastrophic failure.
Digital computer memories use error correction coding schemes, such as block codes, to achieve extremely low error rates; this allows logical calculations to be carried out to great depth. Cells use an error correction scheme to achieve DNA replication error rates of less than one base error in 100 million. Modern error correcting codes in communication are within a few percent of the Shannon limit.
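The block codes mentioned here can be sketched in a few lines. The following toy Hamming(7,4) code (an illustration only, not the actual codes used in computer memories or near-Shannon-limit modems) packs 4 data bits into 7 so that any single flipped bit can be located and repaired:

```python
# Hamming(7,4): three overlapping parity bits protect four data bits.
# Bit positions (1-based): p1, p2, d1, p3, d2, d3, d4.

def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # Recompute each parity check; the pattern of failures (the
    # syndrome) spells out the 1-based position of the bad bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1  # repair the flipped bit
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = encode(word)
code[4] ^= 1                  # flip one bit in transit
assert decode(code) == word   # the data survives the error
```

The parity checks overlap so that the pattern of failed checks names the bad bit; the same idea, scaled up enormously, underlies both memory error correction and DNA-proofreading analogies.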
Vertebrate brains depend on statistical redundancy for reliable operation, which has the advantage of robustness to errors and damage, but the logical depth of computation is limited. However, brains are not general purpose computing devices, but special purpose systems with adaptive abilities.
Redundancy is a better strategy when the signal to noise ratio becomes extremely low and the power available for coding and decoding becomes scarce. Although it might be possible to build a general purpose computer with this strategy, a special purpose system, along the lines of brains, might be best adapted to the conditions that Dyson has explored.
Freeman Dyson mentioned a theorem due to Pour-El and Richards, reading it as saying that "analog computers are more powerful than digital computers". I've worked a bit on this stuff and disagree with this assessment. As usual, the devil is in the details.
Pour-El and Richards' result goes roughly like this. They consider the "wave equation", but let me talk about Maxwell's equations in a vacuum, since that would work too, and it sounds nicer. They show there are solutions with the following property: at time zero, the electric and magnetic fields are computable functions, but at some later time, there is one point in space at which the electric or magnetic field is not computable. The trick is to set up a lot of waves coming in, which all crash together at a single point at some moment.
As for "computable": here they're using a more or less standard definition of "computable" functions of several real variables, taking real values. The idea behind this definition is that you can write a computer program that can compute f(x) to any given accuracy if you specify x to sufficiently high accuracy.
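This definition can be made concrete with a toy sketch (my choice of f(x) = x squared on [0, 10] is purely illustrative): a program computes f if, for any requested output accuracy, it can answer correctly after querying x to some sufficiently fine accuracy.

```python
from fractions import Fraction

# Sketch of a "computable" real function: we never see x exactly, only
# through an oracle that returns a rational within any requested
# tolerance. f(x) = x**2 on [0, 10] is computable because
# |x**2 - q**2| <= 21*|x - q| there (for tolerances below 1), so a
# fine enough query pins down f(x) to any accuracy eps.

def approx_square(oracle, eps):
    delta = Fraction(eps) / 21          # query tolerance that guarantees eps
    q = oracle(delta)
    return q * q

def make_oracle(x):
    def oracle(delta):
        n = 1
        while Fraction(1, 2**n) >= delta:
            n += 1
        return Fraction(round(x * 2**n), 2**n)  # rational within delta of x
    return oracle

x = Fraction(1, 3)
y = approx_square(make_oracle(x), Fraction(1, 1000))
assert abs(y - Fraction(1, 9)) < Fraction(1, 1000)
```

The uncomputable field values in the Pour-El-Richards construction are precisely ones for which no such finite query strategy can exist.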
Pour-El and Richards' result is interesting, but notice the catch: to use this setup in a practical device, you'd have to be able to measure the electric or magnetic field at a single point in spacetime. In practice we never do this: we measure smeared-out averages of the electric and magnetic fields, in a manner limited by the size of our probe.
In fact, it's crucial to Pour-El and Richards' argument that they are working with solutions where the electric and magnetic fields are continuous and have continuous first derivatives. Mathematicians know that these details can make or break an argument. If instead we work with solutions that merely have finite energy (a weaker condition), their argument no longer works, because finite-energy solutions need not have a well-defined value at a single point in spacetime: only smeared averages are well-defined!
And in fact, one can show that in the context of finite-energy solutions, if the electric and magnetic fields are computable at time zero, they will remain computable for all time. I believe this is more relevant to physics than the theorem Pour-El and Richards proved.
Of course, in this other theorem, we need a slightly different definition of "computable", since we're dealing with solutions where only the smeared-out averages of fields make sense, not their values at specific points. But I came up with this definition when I was an undergrad at a school near where Dyson hangs out. At the time I was interested in Schrodinger's equation rather than Maxwell's equation or the wave equation, and I proved that time evolution for Schrodinger's equation takes computable wavefunctions to computable wavefunctions. But later I realized that the same techniques work for these other equations. So these days I doubt that realistic analog computers can compute nonrecursive functions.
From: Jaron Lanier
I was trying to be subtle and non-confrontational in my initial post, but I have a sense that I was a bit too subtle. Why are we confusing our inventions with natural systems? Why is this confusion being asserted with such force and regularity? What is to be gained from it?
To be more explicit: "analog" and "digital" are two technologies of relatively recent invention. They both can be usefully understood through simple ideas about state and causality that can probably not be applied to all of nature at all scales. Each technology is made possible because we have learned to artificially construct material systems in which certain traits are well enough controlled that they conform to simple models with enough reliability for practical uses.
Digital computers can hypothetically be applied to a significantly general class of problems, but in practice are hard to scale. Analog circuits can hypothetically be applied to a significantly general class of problems, but in practice are hard to scale. Aspects of biological systems are usefully modeled by manageably small digital programs. An overlapping but different set of biological problems are usefully modeled by manageably small analog circuits.
Analog systems should not be confused with continuous mathematics, and neither should digital computers be confused with discrete mathematics, as Lee Smolin pointed out.
Freeman Dyson is fascinated by whether this or that configuration could be "alive" or "conscious". It's hard to respond to him on a purely technical level, because the terms aren't well enough defined. Taken on a philosophical or poetic level, he seems to ignore an obvious question of subjectivity and observation.
A digital interpretation of a computer is completely superfluous unless you're a person using the computer. A Macintosh will do the same thing whether it's modeled as an analog device, a digital device, a quantum phenomenon, or a thermodynamic event. For something to be digital, in the sense of an information bearing device, someone has to appreciate it as bearing its information which somehow ends up as subjective experience. I don't think the celebrated "observer" of quantum measurement is nearly as provocative as this "observer" of computers, who is so obvious as to be easily missed.
The mysteries of philosophy don't go away because we have the metaphor of the computer they only increase. I don't think these objections need to be part of a scientific discussion, but neither should the scientific community be broadcasting the "man is a computer" or "life is a computer" tropes.
Freeman Dyson includes
1. I want more than that life merely process information. Our form of life can process arbitrary digital information. It must be able to represent the same facts about the world that humans do and be able to make arbitrary computations (accepting limitations of speed, storage and time) that humans make.
2. Consider a physical digital circuit. It is in fact an analog circuit. What makes it workable for digital purposes is the nonlinear behavior that permits it to approximate discrete states and make arbitrary computations. Marvin Minsky's PhD thesis showed that processing information essentially requires only that the basic processing element have a negative part in its response curve, i.e. sometimes increasing inputs leads to reducing outputs. This negative part of the transfer functions is also what prevents small deviations from destroying the information.
3. That the analog system can compute some functions digital systems can't doesn't show that the analog system can preserve information, correct errors and process it logically.
4. Does the black cloud have these information storage and processing capabilities?
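Point 2 above can be illustrated with a toy model (the sigmoid and its gain are my own stand-ins, not a real circuit): an inverter whose response curve has a steep negative-slope region pulls noisy analog levels back toward clean 0s and 1s, so cascaded stages do not accumulate error.

```python
import math
import random

# A hypothetical inverter: output falls steeply as the input rises past
# the midpoint -- the "negative part" of the response curve.
def inverter(v, gain=10.0):
    return 1.0 / (1.0 + math.exp(gain * (v - 0.5)))

random.seed(0)
for bit in (0.0, 1.0):
    noisy = bit + random.uniform(-0.3, 0.3)   # analog disturbance on the wire
    restored = inverter(inverter(noisy))      # two inverters act as a buffer
    assert abs(restored - bit) < 0.05         # levels snap back toward 0 or 1
```

With a shallow response curve the restored value would instead drift toward the midpoint, which is exactly how small deviations would otherwise destroy the information.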
From: Marvin Minsky
Date: March 8, 2001
Danny Hillis wrote: "I see no reason to believe that the technical distinction pointed out by Pour-El/Richards is in any way relevant to the actual 'computation' of life."
Or to anything else, so far as I can see. As I recall, that theorem is based on assuming the existence of continuous real variables. That amounts to first throwing the baby into the bath, and then appearing to magically take it out again. Along with Feynman, I'm inclined to suspect, instead, that the universe has a finite information density.
Worse, it is hard to imagine any way either to prove this or to prove the contrary. So, it's probably best to just forget it.
Anyway, in regard to Stewart Brand's remark, surely a CD is just a small soft rock. A 650 megabyte hand-carved rock would consume a square kilometer or so. Besides, it is not really analog. It is digital with a local redundancy of the order of 10**20 in molecular atom placement. A CD has far less redundancy.
From: W. Daniel Hillis
Date: March 8, 2001
Freeman, I am not sure I understand your question. If you are asking a question about life-as-we-know-it, surely the answer is "both", since we know of many examples of both analog and digital information processing in biology.
Your own book Origins of Life makes this point very eloquently, so I wonder if you are asking whether life-in-principle is analog or digital? Surely the answer to this is "either".
In almost all circumstances digital life can simulate analog life to arbitrary precision, and vice versa. I see no reason to believe that the technical distinction pointed out by Pour-El/Richards is in any way relevant to the actual "computation" of life. In the special circumstance of extreme cooling, where both noise and signal are fading together, it may well be that only the analog implementation works.
But would this make life analog? I don't think so.
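The simulation claim above can be sketched with a made-up example: a digital, step-by-step integration of a continuous process (here exponential decay) gets as close to the continuous answer as we like by shrinking the step size.

```python
import math

# Forward-Euler simulation of the "analog" process dx/dt = -x from
# x(0) = 1 to t = 1. The digital answer approaches the continuous
# one, exp(-1), as the time step shrinks.
def simulate(dt, t_end=1.0, x=1.0):
    for _ in range(int(round(t_end / dt))):
        x += dt * (-x)
    return x

exact = math.exp(-1.0)
errors = [abs(simulate(dt) - exact) for dt in (0.1, 0.01, 0.001)]
assert errors[0] > errors[1] > errors[2]   # each refinement lands closer
assert errors[2] < 1e-3
```

The Pour-El-Richards construction matters only where this kind of refinement fails to converge, which, as Baez argues above, seems not to happen for physically measurable quantities.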
From: Steve Grand
Date: March 8, 2001
The world is divided into two camps: those that believe in dichotomies and, er, nobody else...
Perhaps at a theoretical level analogue and digital are two branches of a dichotomy, but at a practical level they never are (and certainly their etymologies do not make them into antonyms). For example, analogue computers represent physical quantities using "continuously varying" voltages. But an instantaneous voltage is quantised, since charge comes in units of a single electron, so in practice this "analogue" signal is really digital (albeit a very close approximation to continuous, thanks to Avogadro's Number). Likewise all practical digital signals are analogue, since the system takes a finite time to slew between any two states. Maybe this isn't true with quantum flips, but instead perhaps we are stuck with a finite period (of uncertain length) in which the states are superimposed: 0 and 1 simultaneously. A digital system has to make a deliberate decision on how to treat this finite transition; in practice we usually use hysteresis (e.g. a Schmitt trigger in electronics) to define that the intermediate state is simply the old state, until the signal passes some (presumably quantised) threshold. Fundamentally it's a mess: analogue and digital are not distinct and complementary, and they're more in the eye of the beholder/receiver than anything else.
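The hysteresis trick reads like this in code (the two thresholds are arbitrary illustration values): the output changes only when the input crosses the far threshold, so wobble in the intermediate region is ignored.

```python
# A software Schmitt trigger: two thresholds, and in between the
# output simply holds its previous state, as described above.
def schmitt(samples, lo=0.3, hi=0.7, state=0):
    out = []
    for v in samples:
        if state == 0 and v > hi:
            state = 1
        elif state == 1 and v < lo:
            state = 0
        out.append(state)   # between lo and hi: keep the old state
    return out

# A slowly rising voltage that wobbles around mid-scale flips the
# output exactly once, despite crossing 0.5 three times:
ramp = [0.0, 0.2, 0.45, 0.55, 0.48, 0.62, 0.75, 0.9]
assert schmitt(ramp) == [0, 0, 0, 0, 0, 0, 1, 1]
```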
When it comes to life, we see the analogue/digital distinction disappear completely, to be replaced by the spectrum of forms of encoding it really is. Take a nerve signal: at a theoretical level we can treat an action potential as a differentiated square wave, i.e. a spike of infinitesimal width and infinite height: the ultimate in digital. But in practice the cell membrane takes a finite time to transit between polarised and depolarised states and so forms a smooth (if sharp) curve; this is an analogue change (discrete at the quantum level). But then again, there are only two significant states, polarised and depolarised, so we're back to a digital system. And yet what really matters is the choice of encoding scheme, which for the majority of neurons seems to be frequency modulation, and so the true signal is analogue and can vary continuously. Mind you, two spike peaks can only vary in distance by a whole number of molecules, and so this analogue signal is really discrete... Argh!
Much the same mess and mix of mechanisms applies to DNA too. ACTG may be digital, but genes are also encoded by the folding structure of DNA, RNA and the relevant enzymes, so although the resultant protein's amino acid sequence is defined digitally, the particular physical form it takes on (proteins can often fold up in many different ways, with different properties) depends on much less clear-cut factors.
The idea of digital might seem like a neat trick to us humans, who've only just discovered it. But nature simply isn't impressed either way.
Perhaps a more important distinction is not between analogue and digital in the sense of "real number" versus "integer", but between "analogue" in its original sense as an explicit representation of another physical process, and "symbolic" (conventionally in its most elegant form of binary numbers), where the shape of the signal bears no similarity to the physical process it is emulating. But my feeling is that this distinction only really applies to mathematical models and computer simulations, which are pretending to be something else. Life just is, so the concept of a "representation" doesn't apply (outside of the question of whether and how brains make mental models of the world), and it's pointless asking whether the information is explicit or symbolic.
So "is life digital or analogue?" is surely therefore a non-question, since the word "is" implies that it's a practical question, not a theoretical one. "Could life be digital or analogue?" is a theoretical question (although it's a practical one for people like me, who work in Artificial Life). But the answer here is surely: it doesn't care. Whenever a network of cause and effect is capable of sustaining itself, it will. Perhaps it now boils down to how discrete such information flows need to be in order to be self-sustaining, and my hunch here is that highly discretised signals (like ACTG) are an advantage but not a necessity.
Er, maybe all I've done here is rephrase the question. Does that make me a philosopher?
From: Nicholas Humphrey
Date: March 7, 2001
If information in DNA (in genes) is digital, while information in culture (in memes) is analog, the answer has to be that life was digital, and will be analog.
From: Jordan B. Pollack
Date: March 7, 2001
Freeman's material definition eliminates the idea that software could be alive, and places no constraints on the direction of the process. I prefer the lower-level thermodynamic definition: a process far from equilibrium which dissipates energy and creates a local reversal of entropy. All the other elements of life-as-we-know-it, like information and replication, would fall right out, were we really to understand self-organization.
As a computer scientist, I think that whether life is analog or digital is the wrong question. It is like asking whether a chair must be made out of wood or plastic. "Chairness" is in the organization which enables a platform of a certain height to support a certain weight. Similarly, "life" is a measurement of the organization in a system. The yes-or-no answer to "is it alive?" is like the yes-or-no answer to "is it hot?"
So it doesn't matter whether a system is made from silicon, carbon, chips, polymers, potentiometers, relays and motors, Legos, or Tinkertoys, or even pure software. What matters is where the biologically complex organization will come from.
I think both (analog) complex dynamical systems and (digital) software engineering have been failures at even recognizing the scope of the problem of biological complexity. All the mathematical models of complex systems theory focus on quite low dimensional systems, like cellular automata, and mostly on convergence phenomena or pretty graphics. And AI-type programs like symbolic manipulation systems as well as trained neural networks, always reduce to small bits of organization: business logic plus a relational database or a polynomial with lots of parameters to set.
There seems to be a fundamental limit to the organization which is buildable by human teams. In fact, a dirty secret of the software engineering field is that 2 or 3 good programmers can build anything, and tools and fancy methodologies haven't changed the equation. Hundreds of other people are necessary just to keep businesses going and to tweak the code to market. Big computer programs are just "suites" of separate 1-million-line programs, requiring a human brain in the middle to select and apply different functions, usually via a menu system.
The amount of organization in a single autonomous biological cell dramatically exceeds the amount of organization of a modern computer program. The real question is how do we get self-organization of systems going to the point they achieve biological complexity, not whether they are digital or analog in nature.
From: Marc D. Hauser
Date: March 7, 2001
We now know a considerable amount about the biological basis of number representation (surely Stanislas Dehaene will weigh in here!).
What we have learned from studies of animals, human infants and adults, and patients with brain damage, is that all such creatures have, minimally, an analog magnitude system for computing the number of objects or events. This system, mediated by a mechanism that can either count or time, is approximate, falls under the constraints of the Weber-Fechner law, and thus can handle small or large numbers approximately. Some of us (e.g., me, Sue Carey, Liz Spelke) believe that this system is joined by another that can do small numbers (< 4) precisely.
The emergence, in human history, of a large precise number system was contingent upon the acquisition of language. Thus, no other animal and no child lacking language will acquire a large precise number system, one capable of true digital quantification. However, relevant to the digital-analog issue, it looks like much if not most of the natural world makes decisions in an analog fashion.
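The analog magnitude system described above can be caricatured in a few lines (the Weber fraction of 0.15 is an arbitrary illustration, not a measured value): represent each numerosity as a noisy quantity whose spread grows in proportion to its size, and discrimination then depends on the ratio of the two numbers rather than their difference.

```python
import random

# Scalar variability: the internal representation of n is a noisy
# value with standard deviation proportional to n (Weber's law).
random.seed(1)
W = 0.15                          # made-up Weber fraction

def perceive(n):
    return random.gauss(n, W * n)

def accuracy(a, b, trials=20000):
    # Fraction of trials on which the smaller set is correctly judged smaller.
    correct = sum(perceive(a) < perceive(b) for _ in range(trials))
    return correct / trials

same_ratio = abs(accuracy(4, 5) - accuracy(8, 10))   # 4:5 vs 8:10
assert same_ratio < 0.03          # about equally hard: only the ratio matters
assert accuracy(4, 8) > accuracy(4, 5)               # a bigger ratio is easier
```

So 4 vs 5 is about as hard as 8 vs 10, while 4 vs 8 is easy: the approximate, ratio-bound signature seen in animals and infants.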
From: Philip W. Anderson
Date: March 7, 2001
I have the impression that the best answer to any question in the form "is X Y or Z?" is almost always "neither!" (As, for instance, with "was Bohr or Einstein right?") To focus strictly on life as an information process is to miss the point widely. This is a mistake I myself was guilty of in my two-decade-old papers on the origin of life, buying too naively into the "RNA world" and the idea that self-replicating information was the essence of life.
In his most recent book, "Investigations", Stuart Kauffman defines a living organism as "an agent which can act on its own behalf". Perhaps this is too restrictive; perhaps one should add replication and say "and its descendants' behalf", but the crucial word here is "act". The information-handling capacity (sensing the gradient of nutrient for a motile bacterium, finding the direction from which the sunlight is coming) is merely a facilitation of the basic requirement, which is to go out and find a source of energy and to convert that energy into work. Work here is defined in its thermodynamic sense as energy without entropy, energy which can be used to drive the system as far out of equilibrium as may be necessary to achieve the basic goal of survival and reproduction.
To put this in Freeman's terms, something will have to maintain the black cloud or the silicon chip against, at the very least, the depredations of other black clouds or silicon chips wanting the same energy source, and Stu and I would argue that it would be this maintenance object which is actually alive. The information which the cloud contains represents a large departure from thermal equilibrium, and to maintain and use it, or to replicate it accurately, work will be necessary, and the schemes described do not tell me where the work is coming from.
From: Lee Smolin
Date: March 7, 2001
I am troubled by two assumptions that Freeman and others seem to be making. First that there are only two choices for how information can be coded, digital and analogue. Second, that nature is analogue, apart from some quantum phenomena. Both seem to be false, and this leads to a possible resolution of Freeman's quandary.
Consider the problem of classifying the embeddings of arbitrarily complicated graphs in three-dimensional space, up to topology. The problem is topological and combinatorial; no continuous variables are involved, but it is not known if there exists an algorithm which can solve it in finite time. It is plausible that the problem is unsolvable. There are other topological and combinatorial problems that are known to be non-computable, for example classifying four-manifolds or classifying finite groups in general. If the classification problem cannot be solved, then the information coded in the embedding of a graph is not digital, because there is no algorithm which can in a finite time reduce the topology of an arbitrarily complicated graph to a representation in terms of a finite string of ones and zeros. But no continuous quantities are involved, as all that is relevant is the topology of the embedding.
Furthermore, even if the problem is solvable in principle, it is likely that the time required to classify a graph will grow extremely fast as the complexity of the graph is increased. Thus, even a moderately complicated graph may be knotted in a way that no digital computer could classify in a physically relevant amount of time. So it is certainly the case that the information contained in such topological problems is not digital for all practical purposes.
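A crude toy count (not Smolin's actual embedding classification, which involves real topology) already shows the explosion: merely pairing up the loose ends of strands, ignoring all knotting, can be done in a double-factorial number of ways.

```python
# Number of ways to pair up n_ends strand-ends: the double factorial
# (n_ends - 1)!! = 1 * 3 * 5 * ... This ignores topology entirely,
# yet already grows so fast that exhaustive enumeration is hopeless.
def pairings(n_ends):
    count = 1
    for k in range(1, n_ends, 2):
        count *= k
    return count

assert pairings(4) == 3           # two chords can be drawn 3 ways
assert pairings(10) == 945
assert pairings(100) > 10**77     # astronomically large for only 50 strands
```

Deciding which of these, once embedded in three-dimensional space, are topologically equivalent is the genuinely hard (and possibly algorithmically unsolvable) part.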
Is this relevant for nature and for Freeman's query? Most likely yes.
One of the basic results of quantum gravity is that at the Planck scale the quantum states of geometry are coded in terms of combinatorics and topology. In fact the quantum states of a large class of quantum theories of gravity are in one to one correspondence with the embedding classes of certain graphs, called spin networks. If there is no finite procedure for classifying the embeddings of graphs in three space up to topology then there will be no digital representation of the information coded in quantum geometry, in spite of the complete absence of continuous variables. Even if the problem is solvable in principle, it is still almost certainly the case that no digital computer built from a subsystem of the universe will be able to classify the possible quantum states of the geometry of the universe in a number of time steps fewer than the age of the universe in Planck units.
Two other results of quantum gravity, the Bekenstein bound and the holographic principle, require that only finite dimensional state spaces are required to code the quantum information that can be extracted from any region of the world with a finite area boundary. So it seems likely that continuous variables play no role in nature, but at the same time, this does not mean nature is digital in the ordinary sense. The problem is that these finite dimensional state spaces have bases which are distinguished by solving the problem of classifying embeddings of graphs. So while the holographic principle says that no observer in the universe can access more than a finite amount of information, that information may be stored in a way that cannot be represented digitally by any computer that could be built inside the universe.
Now, coming to life, one can wonder whether cells may make use of combinatorially coded information that is not efficiently coded digitally. Possibilities for such codings are not hard to find. For example, the topologist Louis Kauffman has hypothesized that information about turning genes on and off may be partially coded in the knotting of DNA molecules. To support this hypothesis he points to the existence of enzymes which change the topology of the folding of the DNA molecules.
The difference between combinatorial and digital coding of information is that when information is coded digitally all the possible states of the memory are equally accessible. When information is coded combinatorially, say in the knotting of some graphs, this is not the case, the time required to store or retrieve information depends very strongly on the state in which the information is coded. But a cell is not a general purpose computer, and there is no need that all possible configurations of the molecules where information is stored be equally accessible. What is required is only access to those states that are relevant for the functioning of the cell, and then only at the time they are needed. (Similarly the protein folding problem does not have to have a solution for arbitrary amino acid sequences, only for the much smaller subset that are relevant biologically.) So I do wonder whether the digital metaphor may be blinding us to ways in which information could be stored in biological systems using the combinatorics and topology of molecules.
This seems relevant for Freeman's worry, for if information can be stored in the topology of a system, then the system can be cooled or expanded arbitrarily without degrading the information. At the very least, if the universe has non-trivial topology, on either the large or small scale, there are possibilities for storage of combinatorial information where the discreteness of the states are maintained for arbitrarily low energies. At worst life will be able to survive by coding itself into the quantum geometry of space itself.
From: Kevin Kelly
Date: March 6, 2001
The discussion so far has made me wonder if my computer is digital. I inspected my iMac to see if I could tell. There's a lot of stuff in there. The monitor part is definitely analog: all those rasters and light waves and pixels. There's a power supply, which I know is not digital but analog. There's the CD, which Minsky assures me is a soft analog rock. I see hardware; it is analog, yes? And circuit boards. I have not tried measuring them, but I suspect a lot of the electrical currents running through the printed wires in this device would have analog curves. Eventually we get to the CPUs. How much of what goes on in these little chips actually resembles digital processes? I have no idea, but my suspicions are high at this point. Even if all the activity in the chip were digital (and I doubt it), it's only a small part of the life of this machine.
Ah, but it is the most essential part, if not the ONLY essential part, one would say. Maybe. Reducing the activity of the entire computer to the abstraction of its CPU as a way to measure its digitalness seems almost tautological; it's like reducing life to its genes.
If only a part of my computer is digital, and maybe not as much as I first thought, then is this a perspective, as Jaron suggests, that can change depending on how one looks at it? Is the digital/analog question like particles and waves?
It may be that life is "both" not in the sense of being part analog and part digital, but that it is all digital and all analog at the same time.
As for what that means to the prospects of life eternal, I guess that means "both." Life will continue forever and will die out, too.
Stranger things have happened.
From: Charles Simonyi
Date: March 6, 2001
Letters carved in a rock are definitely digital (not binary, but digital).
Bison carved in the rock may be called analog, unless it is part of an alphabet, in which case it would be digital too. (Pictures of Egyptian deities on an inscription: definitely digital; that is why they were called the "Ennead", the group of nine (or eleven?).)
The information carried in digital form (Hammurabi, etc.) from ancient times was incredibly durable. Digital rocks.
I am not sure if life is analog or digital. Freeman seems to take it for granted that the universe is analog (with good reasons ; ). But maybe the issue is:
Are real numbers real?
(or was this one of the Y2000 questions already?)
Certainly quantum energy states are digital in spirit. In a popular column I said that they help gold to remember that it should shine and water that it is supposed to be a liquid. In an alternative analog world all of these interesting properties would be quickly forgotten as the electrons would wind down into the nucleus in a truly analog fashion.
Best wishes to all.
From: Stewart Brand
Date: March 6, 2001
My point on digital is nonprofound. Just that reading a CD in 30 years will be a chore, like reading an 8-inch floppy now. Rocks and paper stay readily readable; the eyeball platform is fairly constant over centuries, and alphabets seem to leave a lineage you can trace back for understanding old forms. Language is mutable though, sometimes.
From: Marvin Minsky
Date: March 6, 2001
Umm, surely a CD is just a small soft rock. Still, a CD can be made with a good deal of redundancy. A 650-megabyte hand-carved rock would consume a square kilometer or so. Besides, it is not really analog. It is digital with a local redundancy of the order of 10**20 in molecular atom placement.
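Minsky's square-kilometer figure holds up as rough arithmetic. Here is a back-of-envelope sketch; the 4 cm glyph size is my assumption, not his:

```python
# Back-of-envelope check of the claim that a 650 MB hand-carved rock
# would consume about a square kilometer. One carved character per byte;
# the 4 cm glyph side is an assumed figure for plausible hand carving.
MEGABYTE = 10**6            # decimal megabytes, as CD capacities are quoted
cd_bytes = 650 * MEGABYTE
glyph_side_m = 0.04         # assume each carved character is 4 cm square
area_m2 = cd_bytes * glyph_side_m**2
area_km2 = area_m2 / 10**6  # 1 km^2 = 10**6 m^2
print(f"{area_km2:.2f} km^2")  # prints "1.04 km^2"
```

With glyphs around 4 cm on a side the carving comes to just over one square kilometer, so "a square kilometer or so" is about right.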
By the way, I don't agree with Dyson's assessment of Pour-El et al's theorem. The assumptions are based on the existence of continuous real variables. That amounts to first throwing the baby into the bath, and then appearing to magically take it out.
I suspect, instead, that the universe has a finite information density. I also suspect that there's no way to prove either that or the contrary. So, on second thought, forget it.
From: William H. Calvin
I get involved all the time in neurophysiological cautions to Moravec's uploading, and to some extent the same cautions apply to the basics of life, not just higher intellectual functions. Organisms are always in the process of turning into something else, whether aging on the individual level or evolving on the species level. Yet they have to maintain their integrity in much the same way as any other bureaucracy; variations around the fringes are OK but there is a developmental core that has to remain very conservative.
So robustness is always a consideration: how well does the organism bounce back when perturbed? As we all know, digital media (including DNA) have the ability to avoid degrading copies and, if mutations occur in a master record, a certain resiliency about fixing it. I don't hear of analog examples of such fidelity and robustness.
When you compare digital storage of data (e.g. a CD) with analog storage (e.g. letters carved in rock), analog is proving to be far more durable over time. Mathematically it's not a profound difference, but practically
From: Jaron Lanier
Date: March 6, 2001
It seems to me that there's an epistemological difference between analog and digital systems. A digital system must adhere to an idea of what constitutes a bit, and an on and off state for a bit. This idea, which might be thought of as a "standard" for a given fundamental digital platform, is a way of interpreting digital information content in a fundamentally analog system. A digital system can only be defined or perceived through the use of this kind of extra layer of specification, or what might be called in another context an extra layer of "subjectivity". A digital interpretation's very essence is that it ignores a lot of information in the system; it is made artificially not only finite but small, and therefore easier to understand and predict for many purposes. Any digital system can be made to "die" by putting it in extreme conditions where either the bits dissipate or the difference between the on and off states is no longer present in a useful way.
Therefore, I think a better question to ask is, "What digital interpretations of living systems might prove to be useful?" There might be multiple useful ways in which the human brain can be thought of as a digital system. Each of these might have a different idea of the material instantiation of a bit and the states of a bit. For the same reasons that there is no perfectly reliable digital computer in the real world, there will be no perfectly applicable mapping of a digital interpretation onto a natural system such as a brain. But we should expect useful digital interpretations (indeed there is already a thriving community of computational neuroscientists), and I hope we will not be burdened by an a priori bias about whether a single mapping or many should be emphasized in the future. The possibility of such a bias is why it's worth pointing out the epistemological issue. (Also note that temporal and phase-sensitive phenomena in natural living systems can be particularly hard to interpret with sufficient resolution digitally, so it might take a while before we have powerful enough computers to usefully map onto some behaviors of analog brains.)
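Lanier's point, that "digital" is an interpretation imposed on an analog substrate, and that the system "dies" when the on/off distinction stops being usable, can be sketched in a few lines. The threshold and noise-margin values below are assumptions belonging to the interpretation, not to the signal:

```python
# A toy digital interpretation of analog samples. The "standard" here is
# the pair of thresholds: values at or below `low` read as 0, values at
# or above `high` read as 1, and anything in between is undecidable
# under this interpretation. The specific thresholds are toy choices.
def interpret_as_bits(samples, low=0.2, high=0.8):
    bits = []
    for v in samples:
        if v <= low:
            bits.append(0)
        elif v >= high:
            bits.append(1)
        else:
            bits.append(None)  # the interpretation fails on this sample
    return bits

clean = [0.05, 0.95, 0.10, 0.90]
degraded = [0.45, 0.55, 0.50, 0.60]      # extreme conditions: margin gone
print(interpret_as_bits(clean))          # [0, 1, 0, 1]
print(interpret_as_bits(degraded))       # [None, None, None, None]
```

The second print shows the "death" Lanier describes: the analog system is still there and still carries information, but under this particular digital standard it no longer encodes any bits.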
From: Joseph Traub
Date: March 6, 2001
The Pour-El and Richards result is for a worst-case setting. It's been shown that there's no difficulty "on the average". This is not unusual. For example, problems typically suffer the "curse of dimensionality" in the worst case. If one settles for a stochastic assurance (Monte Carlo, for example) the curse is vanquished. The bad result is just an artifact of insisting on certainty.
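Traub's contrast can be seen numerically. A tensor-product grid with m points per axis needs m**d evaluations, exponential in the dimension d, while a Monte Carlo estimate's error shrinks like n**-0.5 regardless of d. The integrand below (the mean coordinate over the unit cube, whose exact value is 0.5) is my own toy choice:

```python
# Monte Carlo integration of f(x) = mean(x) over the unit cube [0,1]^d.
# The exact integral is 0.5 in every dimension; the estimate stays close
# to it with the same 20,000 samples whether d is 2 or 200, while a grid
# method would need m**d points.
import random

def mc_integrate(d, n, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += sum(rng.random() for _ in range(d)) / d
    return total / n

for d in (2, 20, 200):
    print(d, abs(mc_integrate(d, 20_000) - 0.5) < 0.01)  # True for every d
```

The stochastic assurance is exactly what Traub describes: the error bound holds in expectation rather than with certainty, and that trade is what defeats the curse.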
From: Cliff Pickover
Date: March 6, 2001
As a general background to Edge readers, if they feel that only flesh and blood can support consciousness, then life would be very difficult in the Final Days when the universe expands and cools and contains neither water nor much energy. But to my way of thinking, there's no reason to exclude the possibility of non-organic sentient beings in the final diffuse universe. I call these beings Omega creatures. If our thoughts and consciousnesses do not depend on the actual substances in our brains but rather on the structures, patterns, and relationships between parts, then Omega beings could think. If you could make a copy of your brain with the same essential structure but using different materials, the copy would think it was you.
More specifically addressing Freeman Dyson's essay, Freeman writes "If we are partly analog, the downloading of a human consciousness into a digital computer may involve a certain loss of our finer feelings and qualities." I would enjoy hearing him expand on this subject. For example, if I were to digitize an analog LP record at very high resolution, would I really have to lose any of its "finer qualities"? Perhaps the *playback mechanism* for the LP record affects how it sounds, but it seems to me I have captured all of the LP record's qualities, and someday, given a good playback mechanism, it would sound just as beautiful. Perhaps we should even say that a digital capture of an LP record, even today, is indistinguishable in any "meaningful way" from the LP record in terms of the music's "finer qualities," our perception of the music, and the feelings the music evokes.
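Pickover's "very high resolution" intuition can be made concrete: digitizing means quantizing each sample, and the worst-case error per sample is half a quantization step, so it halves with every added bit. A minimal sketch; the 1 kHz test tone, 48 kHz rate, and bit depths are my own toy parameters:

```python
# Quantization error shrinks geometrically with bit depth. A signal in
# [-1, 1] quantized to `bits` bits has step 2 / 2**bits, and rounding to
# the nearest level gives worst-case error of half a step, 1 / 2**bits.
import math

def quantize(x, bits):
    step = 2.0 / 2 ** bits          # signal range assumed to be [-1, 1]
    return round(x / step) * step

tone = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(480)]
for bits in (3, 8, 16):
    err = max(abs(s - quantize(s, bits)) for s in tone)
    print(bits, err <= 1.0 / 2 ** bits)  # worst-case error <= step / 2
```

At 16 bits the worst-case error is already below 0.002% of full scale, which is the sense in which a sufficiently high-resolution capture loses nothing a playback mechanism could ever reveal.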
I would also enjoy hearing Freeman expand on his sentence, "If the system could live forever, the temperature would ultimately become much lower than the energy-gap, and the states above the gap would become inaccessible.... It would be... dead." However, if the amount of energy in the universe asymptotically approaches zero, it will never reach zero. Is it possible that the universe will never reach a condition in which ALL the states of the components of the living system become unreachable? Is it possible that at least some states would remain accessible via vacuum fluctuations, where the overall universe's energy is arbitrarily close to zero but locally there may be variations allowing transient existences not so different from what we all have now? Even if this state would correspond to a person in a coma who is not fully conscious, we would still call this person "alive." And with life, there is always a hope, even if very slim, of resurrection and rescue from beings in parallel universes or from other dimensions.
From: George Dyson
Date: March 6, 2001
When I heard that Freeman's Edge question was "Is life analog or digital?" I was intrigued, because, as Freeman so eloquently argued in *Origins of Life* (1986), the answer is "both".
Assuming Freeman's model whereby life-as-we-know-it originated as the result of a digital parasite incorporated into the analog metabolism of its original host, the time-without-end question is whether we can now slowly exterminate (or at least permanently archive) the parasite without killing the host. Seems to me the answer is yes.