When's the singularity?

Yej See ALL the stars! from <0,1i> Since: Mar, 2010
#101: Jun 3rd 2011 at 6:47:56 AM

The idea of imagination is something that took 3.8 billion years for nature to do, and you expect humans to do it in 100?
Yes. :P

Just remember, the concept of a general-purpose calculating machine hasn't even existed for 100 years, and look how far we've already come.

edited 3rd Jun '11 6:48:33 AM by Yej

Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
del_diablo Den harde nordmann from Somewhere in mid Norway Since: Sep, 2009
#102: Jun 3rd 2011 at 11:51:42 AM

So why do you people disagree with "The singularity happened in England in 1712"? :P

A guy called dvorak is tired. Tired of humanity not wanting to change to improve itself. Quite the sad tale.
Tzetze DUMB from a converted church in Venice, Italy Since: Jan, 2001
#103: Jun 3rd 2011 at 11:53:42 AM

Well, the idea of there being multiple singularities is rather at odds with the exponential extrapolation behind the idea of the singularity.

[1] This facsimile operated in part by synAC.
Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#104: Jun 3rd 2011 at 11:59:04 AM

I'm not getting your line of thought here: since when does "contribute" = keeping a roof over your head and food in the belly?

What I'm saying is that apart from artistic expression, what job can a human even perform that an AI couldn't accomplish or automate better? It's like having automated assembly lines building the perfect car: what's the point of a human who wants to work going "I'm going to go build cars"?

Sure, it's something to do, but what's the point? It's not work that needs to be done. The level of competency a Strong AI possesses essentially means there is absolutely no demand for humans to do much of anything. And even setting aside the lack of need for our work, its quality is so much lower than what the machines would produce that we would just be in the way.

It's like being a master widget maker. You're pumping out absolutely perfect widgets at a rate nobody else can equal. A young man walks up and starts making widgets alongside you, at half the pace and with half the quality. You already have the capacity to produce more widgets than anyone needs, so there's no reason for the young man to craft widgets in the first place; all he is doing is wasting resources to produce an inferior product nobody will want, for nothing other than personal enjoyment.

I don't want to live in a world like that, I want the accomplishments of our civilization to be human accomplishments born on the backs of human effort, not by proxy using a strong AI. In this particular case I don't want the easy way, because we would sacrifice much of our own humanity in doing so.

edited 3rd Jun '11 12:00:16 PM by Barkey

del_diablo Den harde nordmann from Somewhere in mid Norway Since: Sep, 2009
#105: Jun 3rd 2011 at 12:00:46 PM

Tzetze: I guess it is because jumping from X^2 to X^3 is quite a large jump? Hence, when a large jump occurs instead of the normal steady growth, it can be considered a new singularity.

A guy called dvorak is tired. Tired of humanity not wanting to change to improve itself. Quite the sad tale.
Tzetze DUMB from a converted church in Venice, Italy Since: Jan, 2001
#106: Jun 3rd 2011 at 12:08:55 PM

A mathematical singularity is a point at which a mathematical object is not defined. The idea behind The Singularity is that if the technological acumen of humanity is graphed, with x as time and y as technological acumen, there's a point in the future where dy/dx is infinite, so y is undefined there; that point is the singularity. You can't have more than one. "Singularity" doesn't mean "paradigm shift".
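
To make the model concrete, here's a minimal sketch of the growth curve this argument assumes (the specific formula is an illustration on my part, not something anyone upthread wrote down). A finite-time singularity needs hyperbolic, i.e. superexponential, growth toward some date $t_s$:

$$ y(t) = \frac{C}{t_s - t}, \qquad \frac{dy}{dt} = \frac{C}{(t_s - t)^2} \to \infty \quad \text{as } t \to t_s^- $$

A pure exponential $y = e^{kt}$, by contrast, has $dy/dt = k\,e^{kt}$, which is finite at every finite $t$: strictly speaking, exponential extrapolation alone never produces a singularity at all, and the hyperbolic model blows up at exactly one point $t_s$.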

[1] This facsimile operated in part by synAC.
LoveHappiness Nihilist Hippie Since: Dec, 2010
#107: Jun 3rd 2011 at 12:12:25 PM

Except I don't think anyone's claiming there will be any physical infinities before, during, or after the "singularity".

"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
Tzetze DUMB from a converted church in Venice, Italy Since: Jan, 2001
#108: Jun 3rd 2011 at 12:15:24 PM

So the extrapolators can't even do math right. Tch.

[1] This facsimile operated in part by synAC.
LoveHappiness Nihilist Hippie Since: Dec, 2010
#109: Jun 3rd 2011 at 12:19:04 PM

It's just a fancy-sounding buzzword; no need to read too much into it.

edited 3rd Jun '11 12:19:17 PM by LoveHappiness

"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
Yej See ALL the stars! from <0,1i> Since: Mar, 2010
#110: Jun 3rd 2011 at 12:20:36 PM

[up][up][up][up] I can't think of any non-piecewise function that has infinite dy/dx for any defined x. :P
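
(For what it's worth, a standard calculus counterexample does exist; this aside is added for illustration, not something anyone posted. The cube root is non-piecewise and defined everywhere, but its graph has a vertical tangent at the origin:

$$ y = x^{1/3}, \qquad \frac{dy}{dx} = \frac{1}{3}\,x^{-2/3} \to \infty \ \text{as } x \to 0, \qquad y(0) = 0 $$

The derivative blows up even though $y$ itself is perfectly well defined at $x = 0$.)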

Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#112: Jun 3rd 2011 at 12:37:46 PM

I believe that the Timecube holds these answers, and more.

Tzetze DUMB from a converted church in Venice, Italy Since: Jan, 2001
#113: Jun 3rd 2011 at 12:43:15 PM

In any case...

I don't want to live in a world like that, I want the accomplishments of our civilization to be human accomplishments born on the backs of human effort, not by proxy using a strong AI.

The problem I have with this is that I see it as a fairly arbitrary line. Humans can't accomplish much without our tools anyway.

[1] This facsimile operated in part by synAC.
Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#114: Jun 3rd 2011 at 12:56:10 PM

^

When you let the tools make the decisions or get creative is when you start to blur the lines way more than I'm comfortable with. There need to be some very watertight guidelines established and upheld with extreme prejudice before we even start to dabble in the area of Artificial Intelligence.

Even then, I'm not terribly comfortable with the entire concept. Technology should make the things we accomplish easier to accomplish; it shouldn't do them for us.

edited 3rd Jun '11 12:57:26 PM by Barkey

Tzetze DUMB from a converted church in Venice, Italy Since: Jan, 2001
#115: Jun 3rd 2011 at 1:02:53 PM

before we even start to dabble in the area of Artificial Intelligence

Uh, you're at least sixty years late.

When you let the tools make the decisions or get creative

What constitutes "making a decision" or "getting creative"? I don't think this distinction is as obvious as you seem to think. As a trivial example, if I search for "one lov" on Google, can Google's AI be said to be "deciding" that I actually meant "one love"?
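
To make that concrete, here's a minimal sketch of the kind of machinery behind such a correction, in the style of the classic noisy-channel spelling corrector from Peter Norvig's well-known essay. This is emphatically not Google's actual pipeline; the WORD_FREQ table and the one-edit cutoff are invented for illustration. The "decision" turns out to be nothing more than an argmax over candidate edits:

```python
# Toy spelling corrector (Norvig-style sketch, NOT Google's system).
# WORD_FREQ is an invented stand-in for real word-frequency counts.
WORD_FREQ = {"one": 900, "love": 700, "live": 650, "glove": 40}

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def edits1(word):
    """All strings one insertion/deletion/replacement/transposition away."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in ALPHABET]
    inserts = [a + c + b for a, b in splits for c in ALPHABET]
    return set(deletes + transposes + replaces + inserts)

def correct(word):
    """Return the most frequent known word within one edit, else the input."""
    if word in WORD_FREQ:
        return word  # already a known word: nothing to "decide"
    candidates = [w for w in edits1(word) if w in WORD_FREQ]
    return max(candidates, key=WORD_FREQ.get) if candidates else word

print(correct("lov"))  # -> "love" (the only known word one edit away)
```

Whether picking the highest-frequency candidate counts as "deciding" is exactly the question: mechanically, it's just arithmetic over a lookup table.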

edited 3rd Jun '11 1:03:27 PM by Tzetze

[1] This facsimile operated in part by synAC.
Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#116: Jun 3rd 2011 at 1:07:38 PM

It's tricky, obviously.

And by dabble I meant seriously get anywhere close to achieving such a thing as a self-aware AI.

It's not something I'm qualified to make legislation about or anything, but I reserve the right to have the heebie-jeebies about the whole concept itself.

Jinren from beyond the Wall Since: Oct, 2010
#117: Jun 3rd 2011 at 1:17:36 PM

I meant seriously get anywhere close to achieving such a thing as a self-aware AI.

At which point they stop being tools and become citizens.

To expand: if we were to "uplift" another species to have human-like cognition, would you object to them becoming valued members of society? (You might; many would.) What about the scenario more likely than strong AI: that augmented human minds become capable of recursive self-improvement? Do they suddenly lose human rights just because they're more powerful than the known comfort zone? A lot of people would have a hard time answering yes to that one, regardless of gut feeling.

If the intelligence is capable of reasoning, I say that its origin as a machine shouldn't have any impact on how we judge it; and unless you're willing to place the same restrictions on humans capable of dramatic self-enhancement (why stop there? why not curtail education, proper nutrition, and physical exercise? or just declare Year Zero and stop people wearing glasses?), there shouldn't be any reason to bar synthetic intelligences from the same. At which point it starts to look a bit more like Luddism for its own sake.

edited 3rd Jun '11 1:31:22 PM by Jinren

Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#118: Jun 3rd 2011 at 1:42:22 PM

^

I would say yes, simply because the risk of what could go wrong is too great.

When the risk is that a Strong AI could end up being not-so-benevolent and running rampant, possibly destroying most or all of our race, I prefer not to take that risk at all. If there's even a 1 percent chance that it could happen, I feel it isn't worth it. For Science! is awesome when it doesn't threaten our way of life the way this does.

And by "threaten our way of life", I'm talking about how easy things would be with Strong AIs just as much as the fact that they could turn and kill us all. Neither outcome is really acceptable to me, and even if it means denying rights to a sentient thing that we've created, I'm willing to go to that length.

I'm not talking about some primitive-loving Luddite beliefs here; I like our technology today, and there's up-and-coming stuff that really excites me. The possibility of essentially creating God, however, is not one of them. (I'm referring to it as "God" because of the depth of power such a being would theoretically have.) We create technology that renders things obsolete at a pretty fast rate these days, but I will never support creating something that makes humans obsolete, even if the entire thing hinges on how the creation feels about that particular subject and the appropriate course of action it wishes to take.

edited 3rd Jun '11 1:45:27 PM by Barkey

Jinren from beyond the Wall Since: Oct, 2010
#119: Jun 3rd 2011 at 1:45:52 PM

if it means denying rights to a sentient thing that we've created, I'm willing to go to that length.

I am actually impressed by your intellectual honesty.

...yeah, I have nothing to add, sorry.

edited 3rd Jun '11 1:47:29 PM by Jinren

Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#120: Jun 3rd 2011 at 1:49:09 PM

It's a military thing. I remember when we were discussing Yudkowsky's AI Box experiment, and I really wish I could do the experiment with him. When we discussed it here a while back I decided that there was no way I would let the AI out of the box, simply because I'd treat it as something which has no rights whatsoever, and my only job is to make sure nobody else touches that god damn button. Turning off empathy at will has its perks.

LoveHappiness Nihilist Hippie Since: Dec, 2010
#121: Jun 3rd 2011 at 1:56:25 PM

At this point, the fear of apocalyptic AIs (even assuming they will be invented) seems a bit... if not unrealistic, then at least quite premature.

edited 3rd Jun '11 1:56:40 PM by LoveHappiness

"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
SlightlyEvilDoctor Needs to be more Evil Since: May, 2011
#122: Jun 3rd 2011 at 1:58:34 PM

The legal system, and human emotions, are made to work assuming all involved are humans. Throw superintelligent immortals in the mix, and things go out of balance.

Point that somewhere else, or I'll reengage the harmonic tachyon modulator.
Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#123: Jun 3rd 2011 at 2:00:33 PM

^^

I don't feel that way in the slightest. The theoretical power of a strong AI and the workload it would take on, even to the point of fulfilling nearly all of our civilization's major functions without the judgment of a human operator, make it seem pretty logical that we're just a waste of space and resources standing in the way of efficiency.

Theoretically, a strong AI is the prime example of those lyrics "Anything you can do, I can do better." It makes us obsolete.

I'd say perhaps they would keep us as pets, but there's no sure bet that any of the functions served by a pet could be filled by us for a strong AI. Do they require affection? Because they certainly wouldn't need us to hunt for them or protect them.

edited 3rd Jun '11 2:01:43 PM by Barkey

LoveHappiness Nihilist Hippie Since: Dec, 2010
#124: Jun 3rd 2011 at 2:01:39 PM

and things go out of balance

My point being that even assuming these things will be invented, it's pure speculation to assume a negative outcome from the start. Just highly speculative...

"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder." -Nick Bostrom
Barkey Since: Feb, 2010 Relationship Status: [TOP SECRET]
#125: Jun 3rd 2011 at 2:02:29 PM

I like to think that when the possibility of such a thing happening is even remotely realistic, cautious speculation is a good idea.

