You'll Probably Never Upload Your Mind Into A Computer

Many futurists predict that one day we'll upload our minds into computers, where we'll romp around in virtual reality environments. That's possible, but there are still a number of thorny issues to consider. Here are eight reasons why your brain may never be digitized.

Indeed, this isn't just idle speculation. Many important thinkers have expressed support for the possibility, including the renowned futurist Ray Kurzweil (author of How to Create a Mind), roboticist Hans Moravec, cognitive scientist Marvin Minsky, neuroscientist David Eagleman, and many others.

Skeptics, of course, relish the opportunity to debunk uploads. The claim that we'll be able to transfer our conscious thoughts to a computer, after all, is a rather extraordinary one.

But many of the standard counter-arguments tend to fall short. Typical complaints cite insufficient processing power, inadequate storage space, or the fear that the supercomputers will be slow, unstable, and prone to catastrophic failures, concerns that certainly don't appear intractable given the onslaught of Moore's Law and the potential for megascale computation. Another popular objection is that the mind cannot exist without a body. But an uploaded mind could be endowed with a simulated body and placed in a simulated world.
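The processing-power objection is, at bottom, an arithmetic claim, so here is a minimal back-of-the-envelope sketch of it in Python. Every figure in it is an illustrative assumption (the compute available today, the compute an upload might need, and the doubling time), not an established estimate.

import math

# Illustrative assumptions only; none of these figures are established estimates.
current_ops_per_sec = 1e18   # assumed: roughly exascale computing, available today
target_ops_per_sec = 1e25    # assumed: hypothetical requirement for a full upload
doubling_time_years = 2.0    # assumed: one doubling every two years (Moore's Law)

doublings_needed = math.log2(target_ops_per_sec / current_ops_per_sec)
years_needed = doublings_needed * doubling_time_years
print(f"{doublings_needed:.1f} doublings, roughly {years_needed:.0f} years")

Under placeholder numbers like these, the gap closes in a matter of decades, which is why the raw-horsepower objection, on its own, rarely looks intractable.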

To be fair, however, there are a number of genuine scientific, philosophical, ethical, and even security concerns that could significantly limit or even prevent consciousness uploads from ever happening. Here are eight of the most serious.

Proponents of mind uploading tend to argue that the brain is a Turing Machine, the idea that organic minds are nothing more than classical information-processors. It's an assumption derived from the strong physical Church-Turing thesis, and one that now drives much of cognitive science.
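To make that assumption concrete, here is a minimal sketch in Python of what the claim reduces the brain to: a finite rule table acting on a tape of symbols. The particular machine below (a toy that appends a 1 to a block of 1s) is purely hypothetical; only the structure matters.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    # Sparse tape: position -> symbol; the head starts at position 0.
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]   # look up (state, symbol)
        tape[head] = write                            # write a symbol
        head += 1 if move == "R" else -1              # move the head
    return "".join(tape[i] for i in sorted(tape))

# Toy rule table: scan right over 1s, write one more 1 at the first blank, halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(rules, "111"))   # prints "1111"

The strong physical Church-Turing thesis holds that anything a physical system, a brain included, does can in principle be reproduced by a machine of this kind; the objections that follow question exactly that.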

But not everyone believes the brain/computer analogy works. Speaking recently at the annual meeting of the American Association for the Advancement of Science in Boston, neuroscientist Miguel Nicolelis said, "The brain is not computable and no engineering can reproduce it." He referred to the idea of uploads as "bunk," saying that it'll never happen and that "[t]here are a lot of people selling the idea that you can mimic the brain with a computer." Nicolelis argues that human consciousness can't be replicated in silicon because most of its important features are the result of unpredictable, nonlinear interactions among billions of cells.

"You can't predict whether the stock market will go up or down because you can't compute it," he said. "You could have all the computer chips ever in the world and you won't create a consciousness."

Image credit: Jeff Cameron Collingwood/Shutterstock.

The computability of the brain aside, we may never be able to explain how and why we have qualia, or what's called phenomenal experience.

According to David Chalmers, the philosopher of mind who coined the term "hard problem," we'll likely solve the easy problems of human cognition, like how we focus our attention, recall a memory, discriminate, and process information. But explaining how incoming sensations get translated into subjective feelings, like the experience of color, taste, or the pleasurable sound of music, is proving to be much more difficult. Moreover, we're still not entirely sure why we even have consciousness, and why we're not just philosophical zombies: hypothetical beings who act and respond as if they're conscious, but have no internal mental states.

In his paper "Facing Up to the Problem of Consciousness," Chalmers writes:

How can we explain why there is something it is like to entertain a mental image, or to experience an emotion? It is widely agreed that experience arises from a physical basis, but we have no good explanation of why and how it so arises. Why should physical processing give rise to a rich inner life at all? It seems objectively unreasonable that it should, and yet it does.

"If any problem qualifies as the problem of consciousness," argues Chalmers, "it is this one."

Image: blog.lib.umn.edu.

And even if we do figure out how the brain generates subjective experience, classical digital computers may never be able to support unitary phenomenal minds. This is what's referred to as the binding problem: our inability to understand how a mind is able to segregate elements and combine them as seamlessly as it does. Needless to say, we don't even know whether a Turing Machine can support these functions.

More specifically, we still need to figure out how our brains segregate elements in complex patterns, a process that allows us to distinguish them as discrete objects. The binding problem also describes the issue of how objects, like those in the background or in our peripheral experience, or even something as abstract as emotions, can still be combined into a unitary and coherent experience. As the cognitive neuroscientist Antti Revonsuo has put it, "Binding is thus seen as a problem of finding the mechanisms which map the objective physical entities in the external world into corresponding internal neural entities in the brain."

No one knows how our organic brains perform this trick, at least not yet, or whether digital computers will ever be capable of phenomenal binding.

Image credit: agsandrew/Shutterstock.

Though still controversial, there's also the possibility that panpsychism is in effect. This is the notion that consciousness is a fundamental and irreducible feature of the cosmos. It might sound a bit New Agey, but it's an idea that's steadily gaining currency (especially in consideration of our inability to solve the Hard Problem).

Panpsychists speculate that all parts of matter involve mind. Neuroscientist Stuart Hameroff has suggested that consciousness is related to a fundamental component of physical reality, components that are akin to phenomena like mass, spin, or charge. According to this view, the basis of consciousness can be found in an additional fundamental force of nature, not unlike gravity or electromagnetism. This would be something like an elementary sentience or awareness. As Hameroff notes, "these components just are." Likewise, David Chalmers has proposed a double-aspect theory in which information has both physical and experiential aspects. Panpsychism has also attracted the attention of quantum physicists (who speculate about potential quantum aspects of consciousness given our presence in an Everett Universe), and physicalists like Galen Strawson (who argues that the mental/experiential is itself physical).

The reason this presents a problem for mind uploading is that consciousness may not be substrate neutral (a central tenet of the Church-Turing Hypothesis) but may in fact depend on specific physical/material configurations. It's quite possible that there's no digital or algorithmic equivalent to consciousness. Having consciousness arise in a classical von Neumann architecture, therefore, may be as impossible as splitting an atom in a virtual environment using ones and zeros.

Image credit: agsandrew/Shutterstock.

Perhaps even more controversial is the suggestion that consciousness lies somewhere outside the brain, perhaps as some ethereal soul or spirit. It's an idea that's primarily associated with René Descartes, the 17th-century philosopher who speculated that the mind is a nonphysical substance (as opposed to physicalist interpretations of mind and consciousness). Consequently, some proponents of dualism (or even vitalism) suggest that consciousness lies outside knowable science.

Needless to say, if our minds are located somewhere outside our bodies, like in a vat somewhere or, oddly enough, in a simulation (à la The Matrix), our chances of uploading ourselves are slim to none.

Philosophical and scientific concerns aside, there may also be some moral reasons to forgo the project. If we're going to develop upload technologies, we're going to have to conduct some rather invasive experiments, both on animals and humans. The potential for abuse is significant.

Uploading schemas typically describe the scanning and mapping of an individual's brain, or serial sectioning. While a test subject, like a mouse or monkey, could be placed under a general anesthetic, it will eventually have to be re-animated in a digital substrate. Once this happens, we'll likely have no conception of its internal, subjective experience. Its brain could be completely mangled, resulting in terrible psychological or physical anguish. It's reasonable to assume that our early uploading efforts will be far from perfect, and potentially cruel.

And when it comes time for the first human to be uploaded, there could be serious ethical and legal issues to consider, especially given that we're talking about the relocation of a living, rights-bearing human being.

Image credit: K. Zhuang.

Which leads to the next point, that of post-upload skepticism. A person can never really be sure they have created a sentient copy of themselves. This is the continuity-of-consciousness problem: the uncertainty we'll have that, instead of moving our minds, we simply copied them.

Because we can't measure for consciousness, either qualitatively or quantitatively, uploading will require a tremendous leap of faith, one that could lead to complete oblivion (e.g., a philosophical zombie) or something completely unexpected. And relying on the advice of uploaded beings won't help either ("Come on in, the water's fine...").

In an email to me, philosopher David Pearce made a similar point: the quality of conscious experience in a digital substrate could be far removed from that experienced by an analog consciousness.

Image: Rikomatic.

Once our minds are uploaded, they'll be physically and inextricably connected to the larger computational superstructure. As a consequence, uploaded brains will be perpetually vulnerable to malicious attacks and other unwanted intrusions.

To avoid this, each uploaded person will have to set up a personal firewall to prevent themselves from being reprogrammed, spied upon, damaged, exploited, deleted, or copied against their will. These threats could come from other uploads, rogue AI, malicious scripts, or even the authorities in power (e.g., as a means to instill order and control).

Indeed, as we know all too well today, even the tightest security measures can't prevent the most sophisticated attacks; an uploaded mind can never be sure it's safe.

Special thanks to David Pearce for helping with this article.

Top image: Jurgen Ziewe/Shutterstock.
