Brain uploading raises interesting questions as to whether or not the digital copy should be considered a separate person from the original (I say yes).
Hence the appeal of finally answering that question once and for all by doing it in reality.
Hmmmmm: soooo... if you upload a brain-signature (or whatever term you care to abuse), and change it to, e.g., repair stroke or dementia damage, which would be the valid identity? And how would you deal with other moral mazes, like unsociable personalities? Where would you draw the lines?
You'd have to have a board of ethical moderators so strong, they'd not so much have teeth, as the full Wolverine, is what.
Can't let you do that, Star Euo. That'd be ethically the same as someone else magically causing you to have a stroke.
Heeee: I know. That's why I raised it. Ain't I a stinker?
It's also why I raised the adamantium-tooled ethics board. And they'd seriously have to be on their game. Hmmm: perforce, they'd be a part of any given country's Medical Board... but would probably have to be their own department within it.
That wouldn't answer the question at all. It'd just light the powder keg.
Well, you'll have an answer when you try it yourself, at least.
Naturally the copy would want to err on the side of safety. The business that copied it would want to err on the side of profit by pushing an answer that would invalidate the copy's vote. That's just the tip of the iceberg.
Not saying that you'd tell others the actual deal, but you can't really lie to yourself either way if you try it yourself, right?
People lie to themselves all the time to keep from acknowledging things that are uncomfortable or inconvenient.
Hell, say you worked at a company that did this kind of thing. If their policy was to treat a copied brain like an impersonal program, either you'd have to do that regardless of your own thoughts on the matter, or you'd get fired and replaced by someone more compliant. Whoever is in for the long haul rationalizes it: "oh, it's not my fault, it's just company policy". Diffusion of responsibility like that is exactly the purpose of half the bureaucracy that drowns everything, and why it's so goddamned hard to hold anyone accountable for bad business.
To be honest, I am not convinced that a simulation of the brain alone would display true humanlike behaviour, let alone behaviour which is indistinguishable from the original. You'd definitely have to simulate some parts of the endocrine system too, at least, since these have important effects on brain activity; and perhaps, you'd even have to put the simulated mind into some sort of body in space, simulated or real.
People tend to go crazy if they undergo total sensory deprivation for prolonged amounts of time; I don't want to think about what would happen to a truly disembodied human mind.
So you're saying that if your copied self isn't really you and you're in oblivion/afterlife (because you'd be dead), you would automatically come back to life and become your copied self because the company says your copy is you, right?
I want to work for this Jesus company right now.
I was thinking more along the lines of an AI that's desperate enough to try and convince itself it's me, and people who own it that have convinced themselves it's nothing. And me still being dead. None of those positions are palatable.
All of this depends on what ideas of "soul" and "afterlife" we are using. If the afterlife is supposed to be beyond time, there is no reason to think in terms of first going into it and then coming back: you exist on the material universe in two non-connected spans of time, and you exist in the afterlife, and that's it.
And in any case, the second "copy" would have no memory whatsoever of the afterlife (well, not unless you implant fake afterlife memories).
And what I'm saying is that since you're dead, you would know that the copy isn't really you, regardless of what the AI says. Hence you would know the answer once you try it out yourself.
Suppose the process didn't involve me dying — I'd already know the copy wasn't me because I'm still there. So how does me kicking it as part of the process change anything? Sounds like wishful mysticism to think you're somehow living on in the copy.
All I am saying is that when this becomes reality, someone will know the true answer to the question of "can I really upload myself to a machine and will that copy be truly me?", namely the person who tried it.
Except they won't. There would be no way of functionally discerning, from the outside, between "the same person" and a copy engaged in wishful thinking while the original was dead.
I think that the soul-as-substantial-form interpretation provides a nice solution to these problems.
If my identity consists in a sufficiently complete description of my pattern, then if somebody saves my "state" when I die and later "restarts" me, that copy is me.
If the copy is started while I am still alive, well, we begin as the same being; but we diverge very quickly, as we accumulate different experiences, and in short time we become two distinct beings.
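The pattern view above can be sketched in a few lines of Python. This is purely a toy model (the `Mind` class and its fields are illustrative, not any real uploading scheme): two minds that start as identical copies of the same state diverge the moment they accumulate different experiences.

```python
import copy

# Toy model of "pattern identity": a mind is nothing but its state.
class Mind:
    def __init__(self, memories):
        self.memories = list(memories)

    def experience(self, event):
        self.memories.append(event)

original = Mind(["childhood", "first job"])

# "Saving the state and restarting it" is modeled as a deep copy.
upload = copy.deepcopy(original)

# At the moment of copying, the two patterns are indistinguishable.
assert original.memories == upload.memories

# But they diverge as soon as they have different experiences,
# and from then on they are two distinct beings.
original.experience("saw a sunset")
upload.experience("ran a diagnostic")
assert original.memories != upload.memories
```

The deep copy matters: a shallow copy would leave both "minds" sharing one memory list, which would model something closer to a single mind with two viewpoints than two diverging persons.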
But the original will still be dead and be in oblivion/afterlife, right? Whether or not s/he can tell others s/he is dead and is not the copy is entirely another matter. The original will definitely know.
That's the thing. You'd only be able to tell the difference at all if there was an afterlife from which you could look back and you failed and ended up there (in which case you needn't have bothered to preserve yourself thus). If there wasn't, the original you would no longer exist to comprehend a failure, and there would be no way for anyone — not even the uploaded mind itself — to tell if the uploaded mind is successfully you or a failed copy that thinks it is. Similarly, if it was a success in either possibility of afterlife/oblivion, the uploaded mind would have no way of confirming that it was indeed a success.
I'd think that the moment you fade into nothingness, you'd know whether the copy was successful or not, so you'd still know just at the moment of death.
Agreed, but those would be the easier part of the equation.
[1]
Basically, a team of scientists takes on the task of mapping all the connections between the neurons in our brains in order to understand brain function. Some scientists think that is not necessary. I, for one, am hopeful that after several decades (which is how long this is implied to take, assuming more teams take an interest), we'll be able to piece together an entire person just by looking at their neuron connections, not to mention brain uploading, which would be cool (and would finally answer the philosophical questions by doing it in reality).