My body is an electronic virgin. I incorporate no silicon chips, no retinal or cochlear implants, no pacemaker. I don't even wear glasses (though I do wear clothes). But I am slowly becoming more and more a Cyborg. So are you. Pretty soon, and still without the need for wires, surgery or bodily alterations, we shall be kin to the Terminator, to Eve 8, to Cable...just fill in your favorite fictional Cyborg. Perhaps we already are. For we shall be Cyborgs not in the merely superficial sense of combining flesh and wires, but in the more profound sense of being human-technology symbionts: thinking and reasoning systems whose minds and selves are spread across biological brain and non-biological circuitry.

This may sound like futuristic mumbo-jumbo, and I happily confess that I wrote the preceding paragraph with an eye to catching your attention, even if only by the somewhat dangerous route of courting your immediate disapproval! But I do believe that it is the plain and literal truth. I believe, to be clear, that it is above all a scientific truth, a reflection of some deep and important facts about (a whiff of paradox here?) our special, and distinctively human nature. And certainly, I don't think this tendency towards cognitive hybridization is a modern development. Rather, it is an aspect of our humanity which is as basic and ancient as the use of speech, and which has been extending its territory ever since. 

We see some of the "cognitive fossil trail" of the Cyborg trait in the historical procession of potent Cognitive Technologies that begins with speech and counting, morphs first into written text and numerals, then into early printing (without moveable typefaces), on to the revolutions of moveable typefaces and the printing press, and most recently to the digital encodings that bring text, sound and image into a uniform and widely transmissible format. Such technologies, once up-and-running in the various appliances and institutions that surround us, do far more than merely allow for the external storage and transmission of ideas. They constitute, I want to say, a cascade of "mindware upgrades": cognitive upheavals in which the effective architecture of the human mind is altered and transformed.

What's more, the use, reach and transformative powers of these cognitive technologies are escalating. New waves of user-sensitive technology may soon bring this ancient process to a climax, as our minds and identities become ever more deeply enmeshed in a non-biological matrix of machines, tools, props, codes and semi-intelligent daily objects.

We humans have indeed always been adept at dovetailing our minds and skills to the shape of our current tools and aids. But when those tools and aids start dovetailing back — when our technologies actively, automatically, and continually tailor themselves to us, just as we do to them — then the line between tool and user becomes flimsy indeed. Such technologies will be less like tools and more like part of the mental apparatus of the person. They will remain tools in only the thin and ultimately paradoxical sense in which my own unconsciously operating neural structures (my hippocampus, my posterior parietal cortex) are tools.

I do not really "use" my brain. There is no user quite so ephemeral. Rather, the operation of the brain is part of what makes me who and what I am. So too with these new waves of sensitive, interactive technologies. As our worlds become smarter, and get to know us better and better, it becomes harder and harder to say where the world stops and the person begins.

What are these technologies? They are many, and various. They include potent, portable machinery linking the user to an increasingly responsive World Wide Web. But they include also, and perhaps ultimately more importantly, the gradual smartening-up and interconnection of the many everyday objects which populate our homes and offices.

My immediate goal, however, is not really to talk about new technology. Rather, it is to talk about us, about our sense of self, and about the nature of the human mind. The point is not to guess at what we might soon become, but to better appreciate what we already are: creatures whose minds are special precisely because they are tailor-made to mix and match neural, bodily and technological ploys.

Cognitive technologies are best understood as deep and integral parts of the problem-solving systems that constitute human intelligence. They are best seen as proper parts of the computational apparatus that constitutes our minds. If we do not always see this, or if the idea seems outlandish or absurd, that is because we are in the grip of a simple prejudice: the prejudice that whatever matters about mind must depend solely on what goes on inside the biological skin-bag, inside the ancient fortress of skin and skull. But this fortress has been built to be breached. It is a structure whose virtue lies in part in its capacity to delicately gear its activities to collaborate with external, non-biological sources of order so as (originally) to better solve the problems of survival and reproduction.

Thus consider a brief but representative example. Take the familiar process of writing an article for a newspaper, an academic paper, a chapter in a book. Confronted, at last, with the shiny finished product the good materialist may find herself congratulating her brain on its good work. But this is misleading. It is misleading not simply because (as usual) most of the ideas were not our own anyway, but because the structure, form and flow of the final product often depends heavily on the complex ways the brain cooperates with, and depends on, various special features of the media and technologies with which it continually interacts.

We tend to think of our biological brains as the point source of the whole final content. But if we look a little more closely what we may often find is that the biological brain participated in some potent and iterated loops through the cognitive technological environment.

We began, perhaps, by looking over some old notes, then turned to some original sources. As we read, our brain generated a few fragmentary, on-the-spot responses which were duly stored as marks on the page, or in the margins. The cycle repeats: we pause to loop back to the original plans and sketches, amending them in the same fragmentary, on-the-spot fashion. This whole process of critiquing, re-arranging, streamlining and linking is deeply informed by quite specific properties of the external media, which allow the sequence of simple reactions to become organized and grow (hopefully) into something like an argument. The brain's role is crucial and special. But it is not the whole story.

In fact, the true power and beauty of the brain's role is that it acts as a mediating factor in a variety of complex and iterated processes which continually loop between brain, body and technological environment. And it is this larger system which solves the problem. We thus confront the cognitive equivalent of Dawkins' vision of the extended phenotype. The intelligent process just is the spatially and temporally extended one which zigzags between brain, body and world.

One useful way to understand the cognitive role of many of our self-created cognitive technologies is as affording complementary operations to those that come most naturally to biological brains. Thus consider the connectionist image of biological brains as pattern-completing engines. Such devices are adept at linking patterns of current sensory input with associated information: you hear the first bars of the song and recall the rest, you see the rat's tail and conjure the image of the rat.
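The pattern-completion idea can be made concrete with a minimal Hopfield-style network, a standard connectionist model (my own illustrative sketch, not an example from the text): a pattern is stored by Hebbian learning, and when the network is later probed with a degraded cue it settles back onto the full stored pattern, much as a fragment of melody calls up the whole song.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: store bipolar (+1/-1) patterns in a weight matrix."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)       # strengthen links between co-active units
    np.fill_diagonal(W, 0)        # no self-connections
    return W / patterns.shape[0]

def complete(W, cue, steps=10):
    """Iteratively update all units until the network settles."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1     # break ties consistently
    return state

# Store one "memory" and probe it with a corrupted cue.
memory = np.array([[1, -1, 1, 1, -1, -1, 1, -1]])
W = train_hopfield(memory)
cue = memory[0].copy()
cue[:3] = -cue[:3]                # degrade the cue: flip three units
recalled = complete(W, cue)
print(np.array_equal(recalled, memory[0]))  # True: the full pattern is restored
```

The network is "good at Frisbee" in exactly the sense the text describes: it maps a partial input onto an associated whole in a single relaxation process, with no step-by-step rule-following involved.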

Computational engines of that broad class prove extremely good at tasks such as sensorimotor coordination, face recognition, voice recognition, etc. But they are not well-suited to deductive logic, planning, and the typical tasks of sequential reason. They are, roughly speaking, "Good at Frisbee, Bad at Logic": a cognitive profile that is at once familiar and alien. Familiar, because human intelligence clearly has something of that flavor. Yet alien, because we repeatedly transcend these limits, planning family vacations, running economies, solving complex sequential problems, etc., etc.

A powerful hypothesis, which I first encountered in work by David Rumelhart, Paul Smolensky, James McClelland and Geoffrey Hinton, is that we transcend these limits, in large part, by combining the internal operation of a connectionist, pattern-completing device with a variety of external operations and tools which serve to reduce various complex, sequential problems to an ordered set of simpler pattern-completing operations of the kind our brains are most comfortable with.

Thus, to borrow their illustration, we may tackle the problem of long multiplication — e.g. 667 × 999 — by using pen, paper and numerical symbols. We then engage in a process of external symbol manipulations and storage so as to reduce the complex problem to a sequence of simple pattern-completing steps that we already command, first multiplying 9 by 7 and storing the result on paper, then 9 by 6, and so on.
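The reduction at work here can be sketched explicitly (a minimal illustration of my own, not from the original text): each step in the loop is one memorized single-digit fact of the "9 times 7 is 63" variety, while a list stands in for the paper that stores partial results between steps.

```python
def long_multiply(a, b):
    """Long multiplication as a sequence of single-digit pattern-completions.

    Each inner step is a small memorized fact a pattern-completing brain
    handles easily; the `paper` list plays the role of the external medium
    holding partial products until the final addition.
    """
    paper = []                                    # external storage
    for shift, d_b in enumerate(reversed(str(b))):
        row, carry = [], 0
        for d_a in reversed(str(a)):
            prod = int(d_a) * int(d_b) + carry    # one single-digit fact
            row.append(prod % 10)                 # write the digit down
            carry = prod // 10                    # carry the rest forward
        if carry:
            row.append(carry)
        partial = int("".join(map(str, reversed(row))))
        paper.append(partial * 10 ** shift)       # shifted partial product
    return sum(paper)                             # final column-wise addition

print(long_multiply(667, 999))  # 666333
```

No single step exceeds what a "Good at Frisbee" engine can do; the sequential structure of the overall computation lives in the external loop of writing and re-reading, not inside any one step.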

The cognitive anthropologist Ed Hutchins, in his book Cognition in the Wild, depicts the general role of cognitive technologies in similar terms, suggesting that "[Such tools] permit the [users] to do the tasks that need to be done while doing the kinds of things people are good at: recognizing patterns, modeling simple dynamics of the world, and manipulating objects in the environment." This description nicely captures what is best about good examples of cognitive technology: recent word-processing packages, web browsers, mouse and icon systems, etc. (It also suggests, of course, what is wrong with many of our first attempts at creating such tools: the skills needed to use those environments (early VCRs, word-processors, etc.) were precisely those that biological brains find hardest to support, such as the recall and execution of long, essentially arbitrary, sequences of operations.)

The conjecture, then, is that one large jump or discontinuity in human cognitive evolution involves the distinctive way human brains repeatedly create and exploit various species of cognitive technology so as to expand and reshape the space of human reason. We, more than any other creature on the planet, deploy non-biological elements (instruments, media, notations) to complement (but not, typically, to replicate) our basic biological modes of processing, creating extended cognitive systems whose computational and problem-solving profiles are quite different from those of the naked brain. Human brains maintain an intricate cognitive dance with an ecologically novel, and immensely empowering, environment: the world of symbols, media, formalisms, texts, speech, instruments and culture. The computational circuitry of human cognition thus flows both within and beyond the head.

Such a point is not new, and has been well-made by a variety of theorists working in many different traditions. I believe, however, that the idea of human cognition as subsisting in a hybrid, extended architecture (one which includes aspects of the brain and of the cognitive technological envelope in which our brains develop and operate) remains vastly underappreciated. We simply cannot hope to understand what is special and distinctively powerful about human thought and reason by merely paying lip-service to the importance of this web of surrounding technologies.

Instead, we need to work together towards a much more detailed understanding of how our brains actively dovetail their problem-solving activities to a variety of non-biological resources, and of how the larger systems thus created operate, change, interact and evolve. In addition, it may soon be quite important (morally, socially, and politically) to publicly loosen the bonds between the very ideas of minds and persons and the image of the bounds, properties, locations and limitations of the basic biological organism.

A proper question to press, however, is this: since no other species on the planet builds as varied, complex and open-ended designer environments as we do (the claim, after all, is that this is why we are special), what is it that allowed this process to get off the ground in our species in such a spectacular way? And isn't that, whatever it is, what really matters? Otherwise put, even if it's the designer environments that make us so intelligent, isn't it some deep biological difference that lets us build-discover-use them in the first place?

This is a serious, important and largely unresolved question. Clearly, there must be some (but perhaps quite small) biological difference that lets us get our collective foot in the designer environment door — what can it be? One possible story locates the difference in a biological innovation for widespread cortical plasticity, combined perhaps with the extended period of protected learning called "childhood". Thus "neural constructivists" such as Steve Quartz and Terry Sejnowski depict neural (especially cortical) growth as experience-dependent, and as involving the actual construction of new neural circuitry (synapses, axons, dendrites) rather than just the fine-tuning of circuitry whose basic shape and form is already determined. One upshot is that the learning device itself changes as a result of organism-environment interactions — learning does not just alter the knowledge base for a fixed computational engine, it alters the internal computational architecture itself. The linguistic and technological environments in which human brains grow and develop are thus poised to function as the anchor points around which such flexible neural resources adapt and fit.

Perhaps, then, it is a mistake to posit a biologically fixed "human nature" with a simple "wraparound" of tools and culture. For the tools and culture are indeed as much determiners of our nature as products of it. Ours are (by nature) unusually plastic brains whose biologically proper functioning has always involved the recruitment and exploitation of non-biological props and scaffolds. More so than any other creature on the planet, we humans emerge as natural-born cyborgs, factory tweaked and primed so as to be ready to grow into extended cognitive and computational architectures: ones whose systemic boundaries far exceed those of skin and skull.

All this adds interesting complexity to those evolutionary psychological accounts which emphasize our ancestral environments. For we must now take into account an exceptionally plastic evolutionary overlay which yields a constantly moving target, an extended cognitive architecture whose constancy lies mainly in its continual openness to change. Even granting that the biological innovations which got this ball rolling may have consisted only in some small tweaks to an ancestral repertoire, the upshot of this subtle alteration is a sudden, massive leap in cognitive-architectural space. For our cognitive machinery is now intrinsically geared to transformation, technology-based expansion, and a snowballing and self-perpetuating process of computational and representational growth. The machinery of contemporary human reason thus turns out to be rooted in a biologically incremental progression while simultaneously existing on the far side of a precipitous cliff in cognitive-architectural space.

In sum, the project of understanding human thought and reason is easily and frequently misconstrued. It is misconstrued as the project of understanding what is special about the human brain. No doubt there is something special about our brains. But understanding our peculiar profiles as reasoners, thinkers and knowers of our worlds requires an even broader perspective: one that targets multiple brains and bodies operating in specially constructed environments replete with artifacts, external symbols, and all the variegated scaffoldings of science, art and culture.

Understanding what is distinctive about human reason thus involves understanding the complementary contributions of both biology and (broadly speaking) technology, as well as the dense, reciprocal patterns of causal and co-evolutionary influence that run between them. We cannot see ourselves aright until we see ourselves as nature's very own cyborgs: cognitive hybrids who repeatedly occupy regions of design space radically different from those of our biological forbears. The hard task, of course, is now to transform all this from (mere) impressionistic sketch into a balanced scientific account of the extended mind.