[$mind]!=[$computer]: why uploading your brain probably won’t happen

Via Science Not Fiction, here’s one Timothy B Lee taking down that cornerstone of Singularitarianism, the uploading of minds to digital substrates. How can we hope to reverse-engineer something that wasn’t engineered in the first place?

You can’t emulate a natural system because natural systems don’t have designers, and therefore weren’t built to conform to any particular mathematical model. Modeling natural systems is much more difficult—indeed, so difficult that we use a different word, “simulation,” to describe the process. Creating a simulation of a natural system inherently means making judgment calls about which aspects of a physical system are the most important. And because there’s no underlying blueprint, these guesses are never perfect: it will always be necessary to leave out some details that affect the behavior of the overall system, which means that simulations are never more than approximately right. Weather simulations, for example, are never going to be able to predict precisely where each raindrop will fall; they only predict general large-scale trends, and only for a limited period of time. This is different from an emulator, which (if implemented well) can be expected to behave exactly like the system it is emulating, for as long as you care to run it.
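The distinction can be made concrete. A designed system comes with a spec, and the spec *is* the system, so an emulator can reproduce its behavior exactly, forever. A minimal sketch, using a toy instruction set invented purely for illustration:

```python
# A designed system has a blueprint: here, a hypothetical three-instruction
# stack machine. Because its spec fully defines it, the emulator below
# reproduces its behavior exactly, step for step, for as long as you run
# it -- no judgment calls, no approximation.

def run(program):
    """Emulate a tiny stack machine: PUSH n, ADD, MUL."""
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

# (2 + 3) * 4 -- the same answer on every run, on every correct emulator.
print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]))
```

Nothing about a brain comes with such a spec, which is the crux of the argument quoted above.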

Hanson’s fundamental mistake is to treat the brain like a human-designed system we could conceivably reverse-engineer rather than a natural system we can only simulate. We may have relatively good models for the operation of nerves, but these models are simplifications, and therefore they will differ in subtle ways from the operation of actual nerves. And these subtle micro-level inaccuracies will snowball into large-scale errors when we try to simulate an entire brain, in precisely the same way that small micro-level imperfections in weather models accumulate to make accurate long-range forecasting impossible.
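The snowballing Lee describes is the standard behavior of chaotic nonlinear systems. A minimal sketch, not from the article, using the logistic map to stand in for any chaotic natural process, with a one-part-in-a-billion perturbation standing in for a model’s simplification error:

```python
# How tiny modeling errors snowball in a nonlinear system. The logistic
# map here is a stand-in for any chaotic natural process; the 1e-9
# perturbation is a stand-in for the simplification error that any model
# of a natural system inevitably carries.

def logistic(x, r=3.9):
    """One step of the logistic map; chaotic for r around 3.9."""
    return r * x * (1 - x)

def trajectory(x0, steps, r=3.9):
    """Iterate the map, returning every intermediate state."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

exact = trajectory(0.2, 100)             # the "real" system
approx = trajectory(0.2 + 1e-9, 100)     # a model off by one part in a billion

# For the first handful of steps the two agree to many decimal places;
# after enough iterations the error grows to order one, and the "forecast"
# carries no information about the real trajectory.
for n in (5, 25, 50, 100):
    print(n, abs(exact[n] - approx[n]))
```

This is why a weather model (or a brain model) can be locally quite accurate and still diverge from the real system over any long run.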

As discussed before, I rather think that mind simulation – much like its related discipline, general artificial intelligence – is one of those things whose possibility will only be resolved by its achievement (or lack thereof). Which, come to think of it, might explain the somewhat theological flavour of the discourse around it…

9 thoughts on “[$mind]!=[$computer]: why uploading your brain probably won’t happen”

  1. Ahh yes, the old argument that if something is “natural” then it’s got some magic sauce in it that we can never completely understand. Proof: heavier-than-air flight is impossible, man will never walk on the moon, etc., etc.

  2. I think the issue swept under the occipital bone in Hanson’s article is the ethical aspect. He categorizes the brain simply as a “signal processor” which we can “utilize” to “do work for us”, while simultaneously expecting to recreate a human mind down to the level of description necessary for duplication: essentially, creating a conscious entity (unless you’re a bunkered-down Cartesian or a kitten-killing Dennett-loving materialist). Such an entity would, or at least should, be considered human, with all the rights that entails, not a “signal processing” machine to be exploited like a calculator or an iPhone.

    Perhaps non-human-like Artificial Intelligence is the way to go, both for practical and logistical ease, and because of the ethical tar-babies we will inevitably find ourselves tangled up in.

  3. In reply to 320: This is a “proof” that is often used, but it is, unfortunately, a fallacious argument. That people once believed x to be impossible and were proved wrong does not mean that everything people now believe to be impossible will turn out to be possible.

    This example should illustrate the point:
    One day I will be able to fly so fast that I will be able to reach other galaxies within seconds. Proof: heavier-than-air flight is impossible, man will never walk on the moon, etc., etc.

    This argument does not eliminate the possibility of a thinking artificial intelligence, but neither does it support it.

  4. Ms. Andreadis, I suggest respectfully that you are mistaken to dismiss Robin Hanson’s ideas by casually categorizing him as an “economist” and thus judging him to be not knowledgeable about other topics. Prof. Hanson is, by no stretch of the imagination, a typical economist.

  5. It’s actually Dr. Andreadis and the Dr. stands for molecular neurobiology research in brain function. I know a good deal about transhumorism, and Robin Hanson is among the funniest in that lot.

  6. I agree with most of what you say. If and when mind uploading becomes a reality, and as long as there’s lots of money to be made, I think a whole new technology will develop. Technicians will collect and analyze the sensory data of specific locations, and the companies that pay them will be glad to send you their brochures. They’ll visit specific locations–cities, towns, mountains, beaches, whatever–taking notes and recording the interactions of the flora and fauna: mostly the things that we virtual-real people might notice. These technicians will look for what and who makes a place unique, including the locals who give a place its character. They’ll select members of the area’s citizenry–shopkeepers, waiters, etc. For a fee, the local “colorfuls” sign away the rights to specific memories. Used for sims, these memories result in sims who add to a more “authentic” experience for the paying virtual tourists. Perfectly rendered details, but missing any surprise. Might be fun for a while, though; then, surprise me with a delete.

Comments are closed.