When's the singularity?:

Nihilist Hippie
When will artificial general intelligence be invented?

Personally I'm hoping Kurzweil is roughly correct on the timelines, because then life could get so much better.
"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
Present day. Present time! Hahahahaha!
And better than thy stroke; why swellest thou then?
victorinox243
Look in the mirror. You are one of the tomatoes you speak of.

 4 Usht, Wed, 1st Jun '11 11:30:04 PM from an arbitrary view point.
Lv. 3 Genasi Wizard
-*Express doubt about it occurring that soon*

-*Talk about the number of scientific limitations that we have yet to even get close to solving*

-*Express doubt that singularity will ever actually occur*

Don't get me wrong, we're making progress, but the singularity still feels to me like a science fiction concept rather than an actual goal at this point, because it expects a variety of things that we're nowhere near close to figuring out. We're extending life, but we're nowhere near immortality; we've got storage space and backup networks, but no way to upload the brain; and robotics, while advancing quite steadily, isn't at the point of matching the self-maintenance and flexibility of the human body. Strong Artificial Intelligence itself is a dubious concept, since it requires that the machine be able to invent itself, in other words, to be capable of not simply using logic but of applying it in a totally free-form manner. Heck, even if science does figure this stuff out, the politics of it all is going to slow it down or even stop it.

Wow, I managed to completely predict my post. Guess I'm a predictable person on this topic.
The thing about making witty signature lines is that it first needs to actually be witty.
Is that cake frosting?
Unpredictable.

When will we find an answer to the P vs. NP problem, or to the Goldbach conjecture? I couldn't say: these are very difficult problems, and people have been working on them for a while now. It could be tomorrow, it could be in five hundred years.

And now I should try to predict how much time it will take to solve a bunch of problems, some of which are not even defined very precisely, and some of which may easily be as difficult as P vs. NP?

Yeah, I think I'll pass.

One thing that I am relatively sure about, though, is that whatever happens, whenever it happens, will not turn the Earth into a magical fairy candyland. It will solve some problems, and create some new ones.

edited 1st Jun '11 11:47:01 PM by Carciofus

But they seem to know where they are going, the ones who walk away from Omelas.

 6 Barkey, Wed, 1st Jun '11 11:51:59 PM from Bunker 051 Relationship Status: [TOP SECRET]
War Profiteer
I'm not sure I would really want the singularity anyway.
The AR-15 is responsible for 95% of all deaths each year. The rest of the deaths are from obesity and drone strikes.
 7 Tzetze, Wed, 1st Jun '11 11:52:26 PM from a converted church in Venice, Italy
DUMB
Well, I can't say I'm too confident in the timelines or predictions, beyond the rough ones, of Kurzweil or really any futurist. Futurists of the past have been pretty consistent in how they're wrong (and how they're right)[1] and I don't see why I should think present ones are much different.
Over 10,000 dead.:<
Before I die.

...

;_;
Genkidama for Japan, even if you don't have money, you can help![1]
Nihilist Hippie
whatever happens, whenever it happens, will not turn the Earth into a magical fairy candyland

I am positive that a global post-scarcity communist society, one that has eradicated aging and disease, made suffering optional, and greatly expanded the minds of those who wish it, is possible.
"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
 10 Usht, Wed, 1st Jun '11 11:58:32 PM from an arbitrary view point.
Lv. 3 Genasi Wizard
Given an infinite amount of time and guaranteed survival up until that time? Sure.
The thing about making witty signature lines is that it first needs to actually be witty.
 11 Barkey, Wed, 1st Jun '11 11:59:57 PM from Bunker 051 Relationship Status: [TOP SECRET]
War Profiteer
^^

No thank you.
The AR-15 is responsible for 95% of all deaths each year. The rest of the deaths are from obesity and drone strikes.
 12 Tzetze, Thu, 2nd Jun '11 12:00:58 AM from a converted church in Venice, Italy
DUMB
Well, there you go, it's hardly a fairy candyland if some people dislike it!
 13 Usht, Thu, 2nd Jun '11 12:02:30 AM from an arbitrary view point.
Lv. 3 Genasi Wizard
LOL, the utopia problem, it's never utopia for everyone.
The thing about making witty signature lines is that it first needs to actually be witty.
 14 Barkey, Thu, 2nd Jun '11 12:03:03 AM from Bunker 051 Relationship Status: [TOP SECRET]
War Profiteer
I just don't feel terribly comfortable with the human race becoming a decadent society that doesn't have to work for a living or ever endure hardship, and that has very few ways of proving its worth or attaining social status. Life isn't supposed to be a cakewalk, and I don't think it should be.

Post-Scarcity would be cool, so that we would have the resources needed to fuel a push to the stars.

edited 2nd Jun '11 12:04:01 AM by Barkey

The AR-15 is responsible for 95% of all deaths each year. The rest of the deaths are from obesity and drone strikes.
Is that cake frosting?
I am positive that a global post-scarcity communist society, one that has eradicated aging and disease, made suffering optional, and greatly expanded the minds of those who wish it, is possible.
A fair, benevolent society that gives every human being adequate medical care, food, lodging, and education, while asking of them only a reasonable amount of work, is already theoretically possible. We wouldn't need further technological breakthroughs to do that.

Technology is not going to solve all our problems for us. It might help, but much of human suffering nowadays has social, political and, let's say, ethical causes.

Science is not going to be able to do much about them. We, on the other hand, might. But it's not going to be a matter of pushing a button and letting a sovereign AI do all the work for us.

edited 2nd Jun '11 12:07:03 AM by Carciofus

But they seem to know where they are going, the ones who walk away from Omelas.

PARTY HARD!!!!
LOL, the utopia problem, it's never utopia for everyone.

Naaaaaah, Eliezer figured that out ages ago. He covers basically every angle anyone could possibly think of.

As for whether it's all feasible...
Where were you when I laid the earth’s foundation? Tell me, if you understand. Who marked off its dimensions? Surely you know! ~ GOD
 17 Usht, Thu, 2nd Jun '11 12:08:45 AM from an arbitrary view point.
Lv. 3 Genasi Wizard
Which Eliezer again?
The thing about making witty signature lines is that it first needs to actually be witty.
Needs to be more Evil
I'd say, *probably* within the next 50 years. If you consider how the resolution of brain scanning has been going up and the cost of computing power has been going down, it's likely that we'll eventually reach a state where we can scan a human brain and then simulate it, which should *seriously* screw things up. Imagine you're Bill Gates, and you have the choice between hiring 1000 people and running 1000 copies of yourself ... which one is cheaper, which one is more efficient? And doing that doesn't require any amazing technological insight, just improvement upon what we can already do.

And that's just an upper bound; maybe we'll have AI before that. I'm not sure it'd be better :P
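For what it's worth, here's a minimal back-of-the-envelope sketch of that "two trends crossing a feasibility threshold" reasoning, in Python. Every number in it (the cost per FLOP/s, the halving time, the compute needed to emulate a brain, the budget) is a placeholder I'm making up for illustration, not a real estimate:

```python
# Toy sketch of "extrapolate until the threshold is crossed". All figures are
# invented placeholders; the point is only that two steady exponential trends
# imply *some* crossover year, and that the year swings with the assumptions.

def years_until_affordable(
    cost_per_flops_usd=1e-11,   # assumed cost today of 1 FLOP/s of hardware (USD)
    cost_halving_years=2.0,     # assumed: hardware cost halves every N years
    emulation_flops=1e18,       # assumed FLOP/s needed to run one brain emulation
    budget_usd=1e6,             # assumed budget of whoever wants to run it
):
    """Years until running one emulation fits inside the budget."""
    year = 0
    cost = cost_per_flops_usd * emulation_flops
    while cost > budget_usd:
        cost /= 2 ** (1 / cost_halving_years)  # one year of cost decline
        year += 1
    return year

if __name__ == "__main__":
    # Same cost trend, three different guesses at the required compute:
    for flops in (1e16, 1e18, 1e20):
        print(f"assuming {flops:.0e} FLOP/s needed: "
              f"~{years_until_affordable(emulation_flops=flops)} years")
```

Shift the assumed brain requirement by a couple of orders of magnitude and the crossover moves by a decade or more, which is pretty much why nobody agrees on the date.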
Point that somewhere else, or I'll reengage the harmonic tachyon modulator.
 19 Usht, Thu, 2nd Jun '11 12:13:09 AM from an arbitrary view point.
Lv. 3 Genasi Wizard
Oh, that'd be easy, I'd backstab myself and try to grab that power. See, you're going to run into problems with that sort of thing.
The thing about making witty signature lines is that it first needs to actually be witty.
Is that cake frosting?
[up][up] Oh, we are making an estimate based on exponential growth projections?

All right then, I have to agree. The wangularity is near.

edited 2nd Jun '11 12:13:38 AM by Carciofus

But they seem to know where they are going, the ones who walk away from Omelas.

 21 Barkey, Thu, 2nd Jun '11 12:14:09 AM from Bunker 051 Relationship Status: [TOP SECRET]
War Profiteer
^^^

That sort of concept is exactly what chills me to the bone about the singularity, and why I don't want to see it in my lifetime.

edited 2nd Jun '11 12:14:59 AM by Barkey

The AR-15 is responsible for 95% of all deaths each year. The rest of the deaths are from obesity and drone strikes.
 22 Tzetze, Thu, 2nd Jun '11 12:15:44 AM from a converted church in Venice, Italy
DUMB
Well, as long as we're linking smbc...

Before I die.

...

;_;

[1]
 23 feotakahari, Thu, 2nd Jun '11 12:22:44 AM from Looking out at the city
Fuzzy Orange Doomsayer
I think that as defined on this site, we'll never have The Singularity. We'll have events that could be considered The Singularity according to definitions I've seen elsewhere, but this site's definition (a way of living not even imaginable to modern humans) seems impossible—I mean, if the Internet didn't make us inhuman, nothing will.

Edit: To clarify, I'm arguing the Law of Conservation of Normality on a species-wide scale.

edited 2nd Jun '11 12:26:02 AM by feotakahari

That's Feo . . . He's a disgusting, mysoginistic, paedophilic asshat who moonlights as a shitty writer—Something Awful
 24 Barkey, Thu, 2nd Jun '11 12:26:45 AM from Bunker 051 Relationship Status: [TOP SECRET]
War Profiteer
^

Sometimes I think the internet is working on it. :P
The AR-15 is responsible for 95% of all deaths each year. The rest of the deaths are from obesity and drone strikes.
Needs to be more Evil
Oh, we are making an estimate based on exponential growth projections?

Nah, extrapolating growth in general is silly (Kurzweil's guilty of it a lot); you can fudge the data much too easily. I'm just talking about extrapolating brain scanning resolution and the cost of computing power, and they don't need to go "to infinity" or anything, just reach a point where it's technically feasible to scan a human brain and simulate it. Which is very different from saying "hey, look how this curve is bending upwards, in the future it will bend upwards *even more*! To infinity! It's the Singularity! Send me your money!"
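If anyone wants to see the fudging in action, here's a quick Python toy. The data is entirely synthetic (made up on the spot, roughly 20% annual growth plus noise); the only point is that the growth rate you fit, and therefore the long-range forecast, depends a lot on which slice of the curve you pick:

```python
# Fit an exponential growth rate to the same noisy series from different
# starting years, then extrapolate 30 years out. Synthetic data only.
import math
import random

random.seed(0)

years = list(range(1990, 2011))
values = [100 * (1.20 ** (y - 1990)) * random.uniform(0.7, 1.4) for y in years]

def fit_growth_rate(ys, vs):
    """Fit a least-squares line to log(value) vs. year; return the implied annual growth rate."""
    n = len(ys)
    logs = [math.log(v) for v in vs]
    my, ml = sum(ys) / n, sum(logs) / n
    slope = (sum((y - my) * (lv - ml) for y, lv in zip(ys, logs))
             / sum((y - my) ** 2 for y in ys))
    return math.exp(slope) - 1

for start in (1990, 1998, 2005):
    idx = years.index(start)
    rate = fit_growth_rate(years[idx:], values[idx:])
    forecast = values[-1] * (1 + rate) ** 30
    print(f"fit from {start}: {rate:.1%}/yr -> forecast 30 yrs out: {forecast:,.0f}")
```

The fitted rate shifts with the window, and over a thirty-year extrapolation even a small shift compounds into a very different "prediction". Picking the window that tells the story you want is exactly the fudging I mean; the threshold argument above only needs the two trends to keep creeping in the same direction.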
Point that somewhere else, or I'll reengage the harmonic tachyon modulator.