Analysis / Brain Uploading

Legal, moral, and theological questions regarding Brain Uploading:

  • Is the AI considered to be the same person as its human predecessor or a digital twin? Is it a person at all? If an upload is a person, how different do copies of that upload have to be before they're separate persons?
  • Is one copy responsible for the debts and/or crimes incurred or committed by another copy? Is the original responsible, assuming nondestructive uploading?
  • Assuming nondestructive, undetectable uploading, is uploading without consent of the original a crime? What if the original objects, but the upload doesn't want to be deleted? What about uploading dead people who specified they didn't want to be uploaded after death? And how do the original and the copy feel about no longer being unique?
  • Assuming destructive uploading, the original is dead. How does the copy feel about that? Do they know? Should they be told?
  • Would the soul be copied over? Is there a soul at all to be copied? While some people might see the debunking of mind-body separation as just another case of science marching on, a great many people would find the idea that even their mind is quantifiable rather frightening. Worse, some might see those who go through with the upload as less than human, and campaign to ban the procedure for violating human dignity.
    • Assuming the existence of the soul (or even just assuming the original believes they have one), how would one feel about the prospect of not simply being destroyed, but going on to an afterlife (pleasant or unpleasant) while a newly created double takes their place? After all, "they" either stand a 50/50 chance of winding up as the original or the copy, or are always going to be the original. For that matter, is the newborn copy innocent of sin despite remembering committing it?
  • Even theorists who don't believe in the soul, per se, often believe in consciousness as a real phenomenon. Would a simulation of a brain experience consciousness any more than a simulation of lungs can be said to actually respire? How could an outside observer tell? (The fact that the observer probably can't tell arguably makes this consideration more important, not less, since uploadees would be gambling their very selves on the trustworthiness of this tech. And even if it can simulate their consciousness, that doesn't mean it's the same, continued consciousness.)
  • Though fictional depictions of virtual worlds rarely address the fact, programs move by copying themselves. Any time "virtual you" moves from Server A to Server B, you're leaving behind a duplicate of yourself unless it's automatically deleted (see the copy-then-delete sketch after this list). Might the constant duplication and murder of people as the basis of all transportation be unethical, or at least problematic?
    • If a scanned mind is an analog recording, the constant and casual re-copying necessary to "travel" electronically would be impossible without corrupting the data. You could copy yourself into a durable and long-lasting robot body relatively safely, but you could never safely leave it except by physically transplanting the robot's brain. And of course, physical electronic components do wear out (a lot faster than flesh does, at present).
  • How accurate would the copy be, especially in the early days of the technology? If the flaws are significant but not immediately obvious, how many people might undergo the procedure before the problems are noticed? And if you know about the flaws ahead of time, how much of your personality or consciousness are you willing to throw away or see changed beyond your control for a type of immortality?
  • Even if the tech is usually reliable, do obviously botched copies have any legal rights as people?
  • What if you have concerns about the trustworthiness of the process while everyone you know is doing it? Conversely, if you're a true believer in the process, what if society condemns it?
  • Can the computer provide a good enough simulation of human sensory input to keep you from going mad? Even a brief period spent in a sensory deprivation tank can have terrible effects on the mind, so one can imagine what complete absence of a physical body might do.
  • A person converted into software has all the vulnerabilities of software. They can very likely be hacked, duplicated, or edited against their will. For better or for worse, the human mind is currently relatively impregnable. Do you really want to be rendered no more unique than an image from a Google search, and more malleable than putty in the hands of others? Do you want to wake up one day to find that you're an illegal copy of yourself, being treated as a toy by a hacker? Would you necessarily own the copyright to yourself? If such a copyright even existed at all (since many consider copyright unenforceable and undesirable in the digital age), would the agency that uploaded you own it? How can the law protect a citizen who can be duplicated (and the duplicate used and abused however the criminal wishes) as easily as copying a computer file? And every time such a copy is produced, "you" stand a 50/50 chance of being that twin. If a virtual world makes a synthetic heaven possible, it likewise makes synthetic hells possible, and the latter may be far easier to produce (either accidentally or deliberately).
  • In a world where uniqueness exists, at best, as a legal courtesy, mightn't human life come to be seen as fundamentally less valuable?
  • Who owns the computer or computers that your virtual self would be running on? Are they under any obligation to keep running your program? If your program is not being run, is that the same as being dead? Asleep? In suspended animation? And if you don't like what they're doing or are planning to do with you, could you class any attempts to stop them as 'self-defence'? How do your rights as a human being translate to a computer simulation?
  • What if you changed your mind? What if you don't like it? Would you be allowed to simply turn yourself off?
  • Would a human program be able to perceive anything outside of their device? Or could they perceive anything at all, beyond the digital data fed directly to them?
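
A quick illustration of the copy-then-delete point raised above: at the level of files and networks there is no primitive "move" operation, only "copy, then erase the source." The minimal sketch below (plain Python; move_mind and the mind-image file paths are purely hypothetical, standing in for whatever a real upload platform would use) shows that a "moved" mind image always exists, however briefly, as two instances, one of which is then destroyed.

    import os
    import shutil

    def move_mind(image_path: str, destination_dir: str) -> str:
        # Step 1: copy. A full duplicate of the mind image now
        # exists at the destination alongside the original.
        new_path = shutil.copy2(image_path, destination_dir)
        # Step 2: delete. The "original" is erased; if this step
        # is skipped or fails, two instances of the same mind
        # now persist.
        os.remove(image_path)
        return new_path

Every "trip" is therefore a duplication followed by a deletion, which is exactly the ethical puzzle the bullet raises.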
