Then I became
concerned for
a different reason, one that was pragmatic and immediate:
I became convinced
that the overuse
of the computational
metaphor was actually
harming the quality
of the present-day
design of computer
systems. One example of that is the belief that people and computers are similar, the artificial intelligence mindset, which has a tendency to create systems that are naively and overly automated.
An example of that is the Microsoft word processor that attempts to retype what you've just typed. The notion is that making computers into people is an agenda so important that even jumping the gun has to be for the greater good, even if it makes the current software stupid.
There's a third
reason to be suspicious
of the overuse
of computer metaphors,
and that is that
it leads us by
reflection to
have an overly
simplistic view
of computers.
The particular
simplification
of computers I'm
concerned with
is imagining that
Moore's Law applies
to software as
well as hardware.
More specifically, it is imagining that Moore's Law applies to things that have complicated interfaces with their surroundings, as opposed to things that have simple interfaces with their surroundings, which I think is the better distinction than hardware versus software.
Moore's Law is
truly an overwhelming
phenomenon; it
represents the
greatest triumph
of technology
ever, the fact
that we could
keep on this track
that was predicted
for all these
many years and
that we have machines
that are a million
times better than
they were at the
dawn of our work,
which was just
a half century
ago. And yet during
that same period
of time our software
has really not
kept pace. In
fact not only
could you argue
that software
has not improved
at the same rate
as hardware, you
could even argue
that it's often
been in retrograde.
It seems to me
that our software
architectures
have not even
been able to maintain
their initial
functionality
as they've scaled
with hardware,
so that in effect
we've had worse
and worse software.
Most people who
use personal computers
can experience
that effect directly,
and it's true
in most situations.
But I want to
emphasize that
the real distinction
that I see is
between systems
with simple interfaces
to their surroundings
and systems with
complex interfaces.
If you want to have a fancy user interface and the thing you run gets bigger, it just gets awful. Windows doesn't scale.
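To put a rough number on the hardware side of that gap, here is a hedged back-of-envelope figure; the doubling period is an assumption, usually quoted as somewhere between 18 and 24 months, and the original observation concerned transistor counts:

\[ 2^{\,30/1.5} = 2^{20} \approx 10^{6} \]

so a doubling every 18 months yields the "million times better" factor after about thirty years, and a far larger factor over the full half century.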
One question to
ask is, why does
software suck
so badly? There
are a number of
answers to that.
The first thing
I would say is
that I have absolutely
no doubt that
David Gelernter's
framework of streams
is fundamentally
and overwhelmingly
superior to the basis on which our current software is designed. The
next question
is, is that enough
to cause it to
come about? It
really becomes
a competition
between good taste
and good judgment
on the one hand,
and legacy and
corruption on
the other, which are effectively two words for the same thing. What
happens with software
systems is that
the legacy effects
end up being the
overwhelming determinants
of what can happen
next as the systems
scale.
For instance,
there is the idea
of the computer
file, which was
debated up until
the early 80s.
There was an active
contingent that
thought that the
idea of the file
wasn't a good
thing and we should
instead have a
massive distributed
database with
a micro-structure
of some sort.
The first (unreleased)
version of the
Macintosh did
not have files.
But Unix jumped
the fence from
the academic to
the business world
and it had files,
and Macintosh
ultimately came
out with files,
and the Microsoft
world had files,
and basically
everything has
files. At this
point, when we
teach undergraduates
computer science,
we do not talk
about the file
as an invention,
but speak of it
as if it were
a photon, because
in effect it is more likely to
still be around
in 50 years than
the photon.
I can imagine
physicists coming
up with some reasons
not to believe
in photons any
more, but I cannot
imagine any way
that we can tell
you not to believe
in files. We are
stuck with the
damn things. That
legacy effect
is truly astonishing,
the sort of non-linearity
of the costs of
undoing decisions
that have been
made. The degree to which the arrow of time is amplified, in all its brutalness, in software development is extraordinary,
and perhaps one
of the things
that really distinguishes
software from
other phenomena.
Back to the physics
for a second.
One of the most
remarkable and
startling insights
in 20th century
thought was Claude
Shannon's connection
of information
and thermodynamics.
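For concreteness, the connection can be stated in one line (these are the standard formulas, not something specific to this talk): the Shannon entropy of a source with symbol probabilities p_i and the Gibbs entropy of statistical mechanics have the same form, differing only in Boltzmann's constant and the base of the logarithm:

\[ H = -\sum_i p_i \log_2 p_i \qquad\text{versus}\qquad S = -k_B \sum_i p_i \ln p_i . \]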
Somehow for all
of these years
working with computers
I've been looking
at these things
and I've been
thinking, "Are
these bits the
same bits Shannon
was talking about,
or is there something
different?" I
still don't know
the answer, but
I'd like to share
my recent thoughts
because I think
this all ties
together. If you
wish to treat
the world as being
computational
and if you wish
to say that the
pair of sunglasses
I am wearing is
a computer that
has sunglass input and output, if you wish to think of things that way, then you would
have to say that
not all of the
bits that are
potentially measurable
are in practice
having an effect.
Most of them are
lost in statistical
effects, and the
situation has
to be rather special
for a particular
bit to matter.
In fact, bits
really do matter.
If somebody says
"I do" in the
right context
that means a lot,
whereas a similar
number of bits
of information
coming in another
context might
mean much less.
Various measurable
bits in the universe
have vastly different
potentials to
have a causal
impact. If you
could possibly
delineate all
the bits you would
probably see some
dramatic power
law where there
would be a small
number of bits
that had tremendously
greater potential
for having an
effect, and a
vast number that
had very small
potentials. It's
those bits that
have the potential
for great effect
that are probably
the ones that
computer scientists
are concerned
with, and Shannon's framework probably doesn't differentiate between those bits, as far as it goes.
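Here is a minimal sketch of the kind of distribution being speculated about, purely my own illustration and not anything from the talk: give each hypothetical "bit" a causal potential drawn from a heavy-tailed Pareto distribution (the distribution and its exponent are assumptions) and look at how much of the total the top one percent of bits holds.

    # Illustration only: a heavy-tailed (Pareto) model of per-bit "causal potential".
    # The distribution and its exponent are assumptions chosen for the sketch.
    import numpy as np

    rng = np.random.default_rng(0)
    n_bits = 1_000_000
    impacts = rng.pareto(a=1.2, size=n_bits)          # heavy-tailed draws

    impacts.sort()                                    # ascending order
    top_share = impacts[-n_bits // 100:].sum() / impacts.sum()
    print(f"top 1% of bits hold {top_share:.0%} of the total causal potential")

With a tail exponent near one, a small fraction of the bits ends up holding most of the total, which is the shape of the claim; the exact split depends entirely on the assumed exponent.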
Then the question
is how do we distinguish
between the bits;
what differentiates
one from the other,
how can we talk
about them? One
speculation is
that legacy effects
have something
to do with it.
If you have a
system with a
vast configuration
space, as is our
world, and you
have some process,
perhaps an evolutionary
process, that's
searching through
possible configurations,
rather than just
a meandering random
walk, perhaps
what we see in
nature is a series
of stair steps
where legacies
are created that
prohibit large
numbers of configurations
from ever being
searched again,
and that there's
a series of refinements.
Once DNA has won
out, variants
of DNA are very
unlikely to appear.
Once Windows has
appeared, it's
stuck around,
and so forth.
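As a toy illustration of that stair-step picture, again my own sketch under stated assumptions rather than a model from the talk: a random improvement search over bit strings in which, every so often, one bit is frozen as a legacy and never flipped again, so whole regions of the configuration space become permanently unreachable.

    # Toy sketch: random improvement search over bit strings in which bits are
    # periodically frozen as "legacies" and never revisited, so ever-larger
    # regions of the configuration space drop out of the search.
    import random

    random.seed(0)
    n = 16
    config = [random.randint(0, 1) for _ in range(n)]
    frozen = [False] * n                      # legacy flags

    def score(bits):                          # arbitrary fitness: count of 1s
        return sum(bits)

    for step in range(200):
        i = random.randrange(n)
        if frozen[i]:
            continue                          # a legacy choice is never revisited
        candidate = config[:]
        candidate[i] ^= 1                     # try flipping one unfrozen bit
        if score(candidate) >= score(config):
            config = candidate
        if step % 20 == 19:                   # every 20 steps, lock in another bit
            frozen[random.randrange(n)] = True

    reachable = 2 ** frozen.count(False)
    print(f"{frozen.count(True)} bits frozen; {reachable} of {2**n} configurations still reachable")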
Perhaps what happens is that the legacy effect, which arises from the non-linearity of the tremendous expense of reversing certain kinds of systems, acts like a lens: legacies that are created amplify certain bits to be more important. This
suggests that
legacies are similar
to semantics on
some fundamental
level. And it
suggests that
the legacy effect
might have something
to do with the
syntax/semantics
distinction, to
the degree that
might be meaningful.
And it's the first
glimmer of a definition
of semantics I've
ever had, because
I've always thought
the word didn't
mean a damn thing
except "what we
don't understand".
But I'm beginning
to think what
it might be is
the legacies that
we're stuck with.
To tie the circle
back to the "Rebooting
Civilization"
question, what I'm hoping might happen is that, as we start to gain a better understanding of how enormously difficult, slow, expensive, tedious and rare an event it is to program a very large computer well, and as we gain a sense and appreciation of that, I think we can shake off the sort of intoxication that overcomes us when we think about Moore's Law, and start to apply computational metaphors more soberly both to natural science and to metaphorical purposes for society and so forth.
A well-appreciated computer, one whose image included the difficulty of making large software well, could serve as a far more beneficial metaphor than the cartoon computer, which is based only on Moore's Law (all you have to do is make it fast and everything will suddenly work), and than the computers-will-become-smarter-than-us-if-you-just-wait-for-20-years sort of metaphor that has been prevalent lately.
The really good computer simulations that do exist in biology and in other areas of science (and I've been part of a few that count, particularly in surgical prediction and simulation, and in certain neuroscience simulations) have been enormously expensive. It
took 18 years
and 5,000 patients
to get the first
surgical simulation
to the point of
testable usability.
That is what software
is, that's what
computers are,
and we should
de-intoxicate
ourselves from
Moore's Law before
continuing with
the use of this
metaphor.