And people said that would stop with IJBM. -cackle-
Really though, yeah, the problem was always with blind obedience to ideology without consideration of its consequences, rather than with any given ideology.
edited 7th Feb '11 3:29:56 PM by Pykrete
I reject the notion that morality has to center around the survival of humanity.
Robots have just as much right to "live" as human beings, and if they're better people, then they might have more.
...Better by what standard? Robots are tools; no matter how intelligent, they can't do anything beyond their programming...
"Sweets are good. Sweets are justice."
Suppose you have human level AI that can learn from its mistakes? Doesn't that deserve to survive just as much as people? Is it moral to prioritise an individual human's survival over the survival of that AI?
And would it be better to create a general purpose morality and ethics thread at this point?
edited 7th Feb '11 3:45:55 PM by BobbyG
Morals and ethics really are arbitrary at some point, boiling down to values. Thus, any meaningful argument must be between two individuals who share at least some values.
Eliezer Yudkowsky's Singularity Institute is devoted to the project of how to create an ethical transhuman AI. There's a lot of good material there that gets quoted a lot on the wiki. The fact is that we don't know for certain that a hypothetical transhuman intelligence would find any utility in keeping baseline humans around. In my decidedly selfish personal morality, I'd like something resembling me to be around in the future. In my practical mindset, I realize that the sapience of the future may be nothing like humans of today. As long as sapience itself persists, I'd consider the fundamental goal of life to have succeeded.
Precisely why I concede that, while morality may not exist as an absolute, we have to establish some kind of baseline. While religious points of view may serve to inform such a baseline, I don't accept that they have any default priority over any other source.
"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"
I don't think life has a fundamental goal. I think we're an accident of nature.
If I thought we were created to serve God, that would probably alter my morality somewhat.
To Serve Man...
"Sweets are good. Sweets are justice."
I think we more or less reached a consensus that religion itself is not evil, but people like to warp ideologies to do all kinds of weird or evil things.
Then we sort of lost focus…
edited 7th Feb '11 3:30:42 PM by Justice4243
Justice is a joy to the godly, but it terrifies evildoers. — Proverbs 21:15
FimFiction account.