Present day. Present time! Hahahahaha!
And better than thy stroke; why swellest thou then?
Look in the mirror. You are one of the tomatoes you speak of.
-*Express doubt about it occurring that soon*
-*Talk about the number of scientific limitations that we have yet to even get close to solving*
-*Express doubt that singularity will ever actually occur*
Don't get me wrong, we're making progress, but the singularity to me still feels like a science fiction concept rather than an actual goal at this point, because it presupposes a variety of things that we're nowhere near figuring out. We're extending life, but we're nowhere near immortality; we've got storage space and backup networks, but no way to upload the brain; and robotics, while advancing quite steadily, is not at the point of matching the self-maintenance and flexibility of the human body. Strong Artificial Intelligence itself is a dubious concept, since it requires that the machine be able to invent itself; in other words, that it be capable of not simply using logic but applying it in a totally free-form manner. Heck, even if science does figure this stuff out, the politics of it all is going to slow it down or even stop it.
Wow, I managed to completely predict my post. Guess I'm a predictable person on this topic.
The thing about making witty signature lines is that it first needs to actually be witty.
Unpredictable.
When will we find an answer to the P vs. NP problem, or to the Goldbach conjecture? I couldn't say — these are very difficult problems, and people have been working on them for a while now. It could be tomorrow, it could be in five hundred years.
And now I should try to predict how much time it will take to solve a bunch of problems, some of which are not even defined very precisely, and some of which may easily be as difficult as P vs. NP?
Yeah, I think I'll pass.
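(A toy sketch of why this sort of problem resists prediction, not tied to anything specific in the thread: checking the Goldbach conjecture for any finite range of even numbers is mechanical and easy, but no amount of finite checking amounts to a proof. All names here are illustrative.)

```python
def is_prime(n):
    """Trial-division primality test; fine for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n):
    """Return a pair of primes summing to the even number n, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Every even number from 4 to 1000 has a decomposition...
assert all(goldbach_pair(n) is not None for n in range(4, 1001, 2))
# ...but verifying finitely many cases proves nothing about all of them.
```

The gap between "easy to check in any given case" and "hard to settle in general" is exactly what makes timelines for open problems unguessable.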
One thing that I am relatively sure about, though, is that whatever happens, whenever it happens, will not turn the Earth into a magical fairy candyland. It will solve some problems, and create some new ones.
edited 1st Jun '11 11:47:01 PM by Carciofus
But they seem to know where they are going, the ones who walk away from Omelas.
I'm not sure I would really want the singularity anyway.
Well, I can't say I'm too confident in the timelines or predictions of Kurzweil or really any futurist, beyond the rough outlines. Futurists of the past have been pretty consistent in how they're wrong (and how they're right) [1] and I don't see why I should think present ones are much different.
[1] This facsimile operated in part by synAC.
I am positive a global post-scarcity communist society that has eradicated aging and disease, and made suffering optional while greatly expanding the minds of those who wish it, is possible.
"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
Given an infinite amount of time and guaranteed survival up until that time? Sure.
^^
No thank you.
Well, there you go, it's hardly a fairy candyland if some people dislike it!
LOL, the utopia problem, it's never utopia for everyone.
I just don't feel terribly comfortable with the human race becoming a decadent society that doesn't have to work for a living or ever endure hardship, and has very few ways of proving its worth or attaining social status. Life isn't supposed to be a cakewalk, and I don't think it should be.
Post-Scarcity would be cool, so that we would have the resources needed to fuel a push to the stars.
edited 2nd Jun '11 12:04:01 AM by Barkey
Technology is not going to solve all our problems for us. It might help, but much of human suffering nowadays has social, political and, let's say, ethical causes.
Science is not going to be able to do much about them. We, on the other hand, might. But it's not going to be a matter of pushing a button and letting a sovereign AI do all the work for us.
edited 2nd Jun '11 12:07:03 AM by Carciofus
Naaaaaah, Eliezer figured that out ages ago. He covers basically every angle anyone could possibly think of.
As for whether it's all feasible...
Where were you when I laid the earth’s foundation? Tell me, if you understand. Who marked off its dimensions? Surely you know! ~ GOD
Which Eliezer again?
I'd say, *probably* within the next 50 years ... if you consider how the resolution of brain scanning has been going up, and the cost of computing power has been going down, it's likely that we'll eventually reach a state where we can scan a human brain and then simulate it, which should *seriously* screw things up. Imagine you're Bill Gates, and you have the choice between hiring 1000 people and running 1000 copies of yourself ... which one is cheaper, which one is more efficient? And doing that doesn't require any amazing technological insight, just improvement upon what we can already do.
And that's just an upper bound; maybe we'll have AI before that. I'm not sure it'd be better :P
Point that somewhere else, or I'll reengage the harmonic tachyon modulator.
Oh, that'd be easy, I'd backstab myself and try to grab that power. See, you're going to run into problems with that sort of thing.
Oh, we are making an estimate based on exponential growth projections?
All right then, I have to agree. The wangularity is near.
edited 2nd Jun '11 12:13:38 AM by Carciofus
^^^
That sort of concept is exactly what chills me to the bone about the singularity, and why I don't want to see it in my lifetime.
edited 2nd Jun '11 12:14:59 AM by Barkey
Well, as long as we're linking smbc...
...
;_;
I think that as defined on this site, we'll never have The Singularity. We'll have events that could be considered The Singularity according to definitions I've seen elsewhere, but this site's definition (a way of living not even imaginable to modern humans) seems impossible—I mean, if the Internet didn't make us inhuman, nothing will.
Edit: To clarify, I'm arguing the Law of Conservation of Normality on a species-wide scale.
edited 2nd Jun '11 12:26:02 AM by feotakahari
That's Feo . . . He's a disgusting, mysoginistic, paedophilic asshat who moonlights as a shitty writer —Something Awful
^
Sometimes I think the internet is working on it.
Nah, extrapolating growth in general is silly (Kurzweil's guilty of it a lot); you can fudge the data much too easily. I'm just talking about extrapolating brain scanning resolution and the cost of computing power, and they don't need to go "to infinity" or anything, just reach a point where it's technically feasible to scan a human brain and simulate it. Which is very different from saying "hey look how this curve is bending upwards, in the future it will bend upwards *even more*! To infinity! It's the Singularity! Send me your money!"
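(To make the distinction concrete, here's a back-of-the-envelope sketch of the "crosses a fixed threshold" argument. Every number in it is made up for illustration: the starting cost, the halving time, and the feasibility threshold are assumptions, not claims about actual hardware trends.)

```python
import math

# Hypothetical figures, purely illustrative:
start_cost = 1e10     # assumed current cost of a brain-scale simulation, in dollars
halving_years = 2.0   # assumed halving time for that cost
threshold = 1e6       # assumed cost at which the project becomes "feasible"

# Solve start_cost * 2**(-t / halving_years) = threshold for t:
# the curve only has to fall to a finite level, not go "to infinity".
years = halving_years * math.log2(start_cost / threshold)
print(f"threshold crossed after ~{years:.1f} years")
```

The point of the sketch is that the conclusion is a finite crossing time, and it moves only logarithmically with the (made-up) threshold, which is a much weaker claim than extrapolating the curve forever.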
When will artificial general intelligence be invented?
Personally I'm hoping Kurzweil is roughly correct on the timelines, because then life could get so much better.