"How does Ronnie Ron taste, master?"
Harry spat out an eyeball. "Like some kid with eyes."
Dobby ducked an astronaut's poison barbed fist, digging his groinsaw into the beast's abdomen and letting the spray of viscera wash over his elfin space armor. The skulls' eye sockets on his shoulders grew brilliant with an infernal cast and vomited a bolt of light through an astronaut; he was thrown back against the deathwall, his flesh boiling in another dimension.
Harry slapped Dobby, who giggled.
Harry reminded himself to kill himself later.
"Master, look out!"
Dobby's groinsaw screamed as it flew off the armor, rocketing through the air like an early dream of mankind. It flew through three astronauts who dropped their hellspears as the saw cut a hole in the ground beneath them so they fell to hell and the demonic spheres rape them to this day, boys and girls.
Okay, the theoretical robotic entity is, let us say, covered in human skin. It is capable of passing for human. Ergo, due to the nature of qualia, we cannot perceive its perceptions, so what reason, exactly, is there to assume that it does not have human-like perceptions when all evidence suggests it does?
If we knew some kind of primordial truth about the universe (that, no, robots don't truly have "Human Experiences"), that'd be one thing, but our only way of interacting with the universe is through our perceptions, and our perceptions of the robots are such that we can't expect their thoughts to be any less real than a human being's.
How about a robot that is more sentient than most humans?
"Sweets are good. Sweets are justice."
Less real, no, but I'm just pointing out that sentience != human thought process.
Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
IIRC Turing suggested such an answer to the soullessness problem (since, excluding vis vitae, it's mostly a matter of religion): it's not your place to question to whom God can give a soul.
"Atheism is the religion whose followers are easiest to troll"
Elegant.
"Sweets are good. Sweets are justice."
Honestly, there's no way for me to prove that other humans are sentient. There's just myself and an empathic connection that I choose to engage in. Why shouldn't I extend robots the same courtesy?
The Happiness in Slavery thing is a different issue.
edited 14th Feb '11 6:26:31 PM by Clarste
Precisely~
There are very strong, physical arguments that P-Zombies are absurd.
"Sweets are good. Sweets are justice."
And? What difference does it make? And why does it matter?
A guy called dvorak is tired. Tired of humanity not wanting to change to improve itself. Quite the sad tale.
Proving that P-Zombies can't exist means proving that all (fully mentally able) humans are sentient, which means doubting that is not only a pointless exercise (as we already knew) but just plain wrong.
edited 15th Feb '11 8:13:35 AM by Ardiente
"Sweets are good. Sweets are justice."
And? If we find out the AI machine is actually alive, do we enter Double Standard or do we not?
A guy called dvorak is tired. Tired of humanity not wanting to change to improve itself. Quite the sad tale.
It doesn't need to be "alive" to be sentient. It's not like we expect it to grow or have kids or die or whatever.
"Sweets are good. Sweets are justice."
@They Call Me Tomu
I have to ask you a hypothetical question. What if it were possible to bioengineer a human so that they had neither sentience nor sapience but functioned identically to a calculator? Would you consider using that person the way you would a normal calculator?
As far as anyone can determine, the brain is a (phenomenally complex) calculator. The only difference is the Uncanny Valley.
edited 15th Feb '11 9:44:42 AM by Yej
Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
Of course I would.
I'd also consider brain transplants so that when my body gets riddled with holes, I can easily replace it.
That's part of the entire point of bioengineering. Doi.
edited 15th Feb '11 9:45:03 AM by TheyCallMeTomu
@Yej
An incremental difference would still be a difference. The color grey isn't just black that happens to be very light.
edited 15th Feb '11 10:08:13 AM by TheHyenaPrincess
The very notion that "black" is a "thing" in and of itself is fundamentally flawed. It's a description of the property of hue.
Unless you think there's an abstract object called black or something...
We can get into complicated metaphysics arguments about "What does it mean for something to be similar to something else? How can something be the same and different?" as it pertains to computers and humans and yadda yadda. But that's entirely trivial.
In a world without a mortal coil, where there is only consciousness, we would understand "human" interactions. If computers have this consciousness, then they would be just as "human." The organic form is irrelevant.
Now, you can argue as to whether or not robots will ever obtain that level of consciousness, but I would ask that you apply the same rules to assuming other humans are conscious that you do to robots.
edited 15th Feb '11 10:17:20 AM by TheyCallMeTomu
@Yej
You missed my point completely. I'm not saying it's a difference in kind. I'm saying that even if the difference is in degrees, it's still a major difference. For example, you wouldn't say 5 dollars is the same amount as 500 dollars just because there's only a 495 dollar difference.
Also note that a large enough difference in degree can be considered a difference in kind. You wouldn't call steam really hot water, even though the two differ only in how much heat they have.
Semantics.
(Steam has different properties from even very hot water)
More importantly, I'm trying to point out that there's an incredibly valid reason for treating a human-like AI as a non-human: it might not think like a human.
Imagine you've got a computer that's as powerful as a human, except all of its intelligence is focused on social interaction. It couldn't even move around on its own, or do the most basic arithmetic, but could invoke Easy Evangelism. Is it at all reasonable to treat such a machine as human?
Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
...try and take over the world!
...Wait, sorry, wrong line.
Because it won't necessarily think anything like any human. It might behave more like a house elf, or pretty much anything.
Da Rules excuse all the inaccuracy in the world. Listen to them, not me.