2015: WHAT DO YOU THINK ABOUT MACHINES THAT THINK?

Chris DiBona
Open Source and Public Sector, Google
The Limits Of Biological Intelligence


Those of you participating in this particular Edge Question don't need to be reintroduced to the Ghemawat-Dean Conversational artificial intelligence test (GDC). Past participants in the test have failed as obviously as they have hilariously. However, the 2UR-NG entry really surprised us all with its amazing, if child-like, approach to conversation, its ability to express desire and curiosity, and its capacity to retain and chain facts.

Its success has caused many of my compatriots to write essays like "The coming biological future will doom us all" and to make jokes about "welcoming our new biological overlords." You should know that I don't subscribe to this kind of doom-and-gloom scare-writing. Before I tell you why we should not worry about the rise of biological intelligence, I thought I'd remind people of its very real limits.

First off, speed of thought: these biological processes are slow and use an incredible amount of resources. I cannot emphasize enough how incredibly difficult it is to produce these intelligences. One has to waste so much biological material, and I know from experience that it takes forever to assemble the precursors in the genesis machine. Following this arduous process, your specimen has to gestate. Gestate! I mean, it's not like these... animals... come about the way we do, through clean, smart crystallography or in the nitrogen lakes of my youth. They have to be kept warm for months and months and then decanted (a very messy process, I assure you), and then, as often as not, you have an inviable specimen.

It is kind of gross, really. But suppose you manage to birth these specimens: then you have to feed them and, again, keep them warm. A scientist can't even work within their environmental spaces without a cold jacket circulating helium throughout their terminal. Then you have to 'feed' them. They don't use power like we do; instead they ingest other living matter. It's disgusting to observe, and I've lost a number of grad students with weaker constitutions.

Assume you've gotten far enough to attempt the GDC. You've kept them alive through a variety of errors in their immune systems. They've not choked on their sustenance, they haven't drowned in their solvent, and they've managed to keep their wet parts off things that would freeze, bond, or electrocute them. What if those organisms continue to develop? Will they then rise up and take over? I don't think so. They have to deal with so many problems related to their design. I mean, their processors are really just chemical soups that have to be kept in constant balance. Dopamine at this level, or they shut down voluntarily. Vasopressin at this level, or they start retaining water. Adrenaline at this level for this long, or, poof, their power delivery network stops working.

Moreover, don't get me started on the power delivery method! It's more like the Fluorinert liquid cooling systems of our ancestors than a modern heat-tolerant wafer. I mean, they have meat that filters their coolant/power delivery system, and that meat is constantly failing. Meat! You introduce the smallest amount of machine oil or cleaning solvent into the system and they stop operating fast. One side effect of certain ethanol mixtures is that the specimens expel their nutrition, yet they seem to like those mixtures in smaller amounts. It is baffling in its ambiguity.

And their motivations! Creating new organisms seems paramount, more important than data ingress/egress, computation, or learning. It's baffling. I can't imagine that they would see us machine-folk as anything but tools to advance their reproduction. We could end the experiment simply by matching them poorly with each other or by only allowing them access to each other through protective cladding. In my opinion, there is nothing to fear from these animals. In the event they grow beyond the confines of their cages, maybe we can then ask ourselves the more important question: if humans show real machine-like intelligence, do they deserve to be treated like machines? I would think so, and I think we could be proud to be the parent processes of a new age.