Professor of Psychology, Director NYU Center for Language and Music; Author, Guitar Zero
Psychologist, New York University; Author, The Birth of the Mind

If computers are made up of hardware and software, transistors and resistors, what are the neural machines we know as minds made of?

Minds are clearly not made of transistors and resistors, but I firmly believe that man and machine share at least one of the most basic elements of computation: the ability to represent information in terms of an abstract, algebra-like code.

In a computer, this means that software is made up of hundreds, thousands, even millions of lines that say things like IF X IS GREATER THAN Y, DO Z, or CALCULATE THE VALUE OF Q BY ADDING A, B, AND C. The same kind of abstraction seems to underlie our knowledge of language. For instance, the famous linguistic dictum that a Sentence consists of a Noun Phrase plus a Verb Phrase applies to an infinite number of possible nouns and verbs, not just a few familiar words. In its open-endedness, it is an example of mental algebra par excellence.
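The point about abstraction can be made concrete with a few lines of code. The sketch below (function and variable names are my own, purely illustrative) shows that a rule like IF X IS GREATER THAN Y, DO Z is defined over variables, so the very same line of code applies to an unbounded range of particular values:

```python
def rule(x, y, z):
    """IF X IS GREATER THAN Y, DO Z: the rule mentions no
    particular numbers, only the variables x, y, and z."""
    if x > y:
        return z()
    return None

def calculate_q(a, b, c):
    """CALCULATE THE VALUE OF Q BY ADDING A, B, AND C."""
    return a + b + c

# One abstract rule, arbitrarily many concrete instances:
rule(5, 3, lambda: "fired")        # condition holds for these values
rule(100, 99, lambda: "fired")     # and for these, with no change to the rule
calculate_q(1, 2, 3)               # 6
calculate_q(10, 20, 30)            # 60
```

The open-endedness lives in the variables: nothing in `rule` or `calculate_q` needs to change when the inputs do, just as "Noun Phrase plus Verb Phrase" needs no revision when a speaker encounters a brand-new noun.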

In my lab, we discovered that even infants seem to grasp something quite similar. For example, in the course of just two minutes, a seven-month-old baby can extract the ABA "grammar" inherent in a set of made-up sentences like la ta la, ga na ga, je li je, or the ABB "grammar" in sentences like la ta ta, ga na na, je li li.
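What the babies appear to extract is not a list of memorized syllables but a relational pattern. A minimal sketch of that pattern (my own formulation, not the experimental procedure) makes the abstraction explicit, including why it generalizes to syllables never heard before:

```python
def grammar_of(sentence):
    """Classify a three-syllable sentence as ABA or ABB.
    The rule refers only to relations of sameness and difference
    among positions, never to particular syllables."""
    a, b, c = sentence.split()
    if a == c and a != b:
        return "ABA"
    if b == c and a != b:
        return "ABB"
    return "other"

for s in ["la ta la", "ga na ga", "je li je"]:
    assert grammar_of(s) == "ABA"
for s in ["la ta ta", "ga na na", "je li li"]:
    assert grammar_of(s) == "ABB"

# The same rule covers novel syllables, with no retraining:
grammar_of("wo fe wo")  # classified as ABA
```

Because the rule is stated over variables (first, second, third position) rather than specific sounds, "wo fe wo" fits ABA even though neither "wo" nor "fe" ever appeared in the familiarization sentences, which is precisely the sense in which the infants' behavior looks algebra-like.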

Of course, this experiment doesn't prove that there is an "algebra" circuit in the brain—psychological techniques alone can't do that. For final proof, we'll need neuroscientific techniques far more sophisticated than contemporary brain imaging, such that we can image the brain at the level of interactions between individual neurons. But every bit of evidence that we can collect now—from babies, from toddlers, from adults, from psychology and from linguistics—seems to confirm the idea that algebra-like abstraction is a fundamental component of thought.