Due to the nature of the work, it's doubtful anyone could "accidentally" build an AI, much like it's a bit hard to trip, fall, and end up having assembled a spacefaring rocket ship.
What is more likely is that we develop an AI without knowing it, by crossing some fuzzy line into a zone of "intelligence" we had not seen in previous AI incarnations, and not truly recognising the crossing until later. Think of how the developers of Starcraft 1 did not expect such emergent strategic properties to play out from a simple set of rules governing how damage works, production rates and unit costs.
In the same vein, someone will create a complex AI algorithm for search but not truly recognise how powerful it is until much later.
More likely, an AI would form accidentally from multiple programs interacting in ways that weren't predicted. Say someone builds a program that goes through other programs and fixes mistakes. Someone else makes a simple AI for a video game. Someone else has the bright idea to put them together, having the cleaner program work on the AI and the AI call the cleaner every now and again, or something like that. That would let the AI change itself and learn, allowing true intelligence to emerge.
That's sort of what I was saying: which is more likely, a man crossing the wrong wire too many times and ending up with a working brain, or a man sitting down and saying "I'mma build me a brain today" and pulling it off?
Further: how do you think these divergent means of origin for an intelligence would affect its development as an individual? If you had a sure and provable answer to "who made me and why am I here?", how would that change your approach to other, more mundane questions? Would any such self-examination even be relevant when dealing with a non-human but human-made intelligence? Do we possess the means to even begin tracking how any intelligence operates, and is such an understanding relevant to crafting one from scratch?
EDIT: Also, no one has made any mention of potential biological AIs. Could we "uplift" an animal to intelligent, communicative status more easily than we could craft an intelligence from circuitry?
edited 12th Jul '12 5:24:24 PM by OhnoaBear
"The marvel is not that the Bear posts well, but that the Bear posts at all."Human intelligence evolved as a result of natural selection. The plants and animals that we have adapted to our use were the result of selective breeding and then, once we had the technology, genetic engineering.
I'm not a biologist or computer engineer, so I'm just guessing here, but I think that a true artificial intelligence would have to come from a similar sort of process. It's like Project 2501 from Ghost In The Shell. Without the possibility of reproducing, dying and passing on beneficial characteristics, an AI is just a program, and copies are just copies, not offspring.
So suppose somebody could design a simple AI with the capacity to learn, interact, and reproduce. But that would mean creating LIFE! ITSELF!
What we obtain too cheap, we esteem too lightly.

That argument can be made for cars and other vehicles: it took the cheetah that long to evolve the ability to run at 100 km/h, and we are developing machines that move the same way a four-legged animal does, not to mention prosthetics that are getting really close to the real deal. Just because the human brain was created by evolution does not mean that's the only possible means of creation.
edited 16th Jul '12 8:15:48 PM by IraTheSquire
I personally feel that there will be no true AI until a program has the ability to edit itself without the help of a technician.
Already here, but no AI yet.
<><

Intelligence requires awareness of one's environment and the ability to adapt to that environment. Without an environment to provide challenges, any entity will stagnate.
We are? That's news to me.
Also, I wasn't saying that having that ability means we have true AI, I was saying true AI would require having that ability.
And, without the ability to adapt, it couldn't learn.
edited 16th Jul '12 8:27:10 PM by deathpigeon
Or is it the other way around? Without the ability to learn, it can't adapt? Really, you need both at once. Learning without adaptation and adaptation without learning are insanity.
Self-modifying code has been around for quite a long time, and there have been experiments done with programs that do nothing but modify themselves while trying to survive. They've produced some pretty nice viruses, but no HAL yet.
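For readers who haven't met the idea, here's a minimal sketch of what "a program that modifies itself" can mean in practice. The program is represented as data (a list of operation names) that it can rewrite at random and then re-run; the operation names and structure are invented for illustration, not taken from any real experiment:

```python
import random

# A toy "program": a sequence of named operations on an accumulator.
OPS = {"inc": lambda x: x + 1, "dec": lambda x: x - 1, "dbl": lambda x: x * 2}

program = ["inc", "inc", "dbl"]  # the initial code

def run(prog, x=0):
    # Execute each instruction in order, threading the accumulator through.
    for op in prog:
        x = OPS[op](x)
    return x

def self_modify(prog):
    # The program rewrites one of its own instructions at random.
    i = random.randrange(len(prog))
    prog[i] = random.choice(list(OPS))
    return prog

print(run(program))   # behaviour before modification: (0+1+1)*2 = 4
self_modify(program)
print(run(program))   # behaviour after the program rewrote itself
```

Real self-modifying code usually edits machine instructions rather than a list of names, but the principle — code treated as mutable data — is the same.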
I mean, I'm just a sophomore computer science student and I've already covered enough theory that I could build a decent learning/self-modifying program. The problem is in giving it the input information it needs and the processing power to deal with all that input.
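For a sense of scale, the "learning program" part really is undergrad material: the classic perceptron learning rule, fitting the AND function from labelled examples, takes about a dozen lines. This is a generic textbook sketch, not any particular course's or library's code:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    # Start with zero weights and bias; nudge them toward reducing error.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in AND])  # → [0, 0, 0, 1]
```

The hard part, as the post says, isn't the learning rule; it's feeding something like this enough meaningful input, and enough compute, for interesting behaviour to emerge.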
edited 16th Jul '12 8:33:00 PM by EdwardsGrizzly
...How many semesters have you completed?
All life is a means for genetic information to propagate itself. Intelligence is a means to preserve and propagate life. A computer program doesn't have genes, does it? But what if you could create a computer program with a drive to propagate itself?
Two, but on the CS progression I'm closer to where most people would be after three.
We do; it's called Stuxnet.
edited 16th Jul '12 8:39:49 PM by EdwardsGrizzly
Ah. I've completed two as well, but at a normal progression. I was just wondering how stupid I should feel for not knowing that after two semesters. My answer is only kind of.
If a computer virus exists to copy itself, then all you need to wipe it out is a single antivirus, spread widely enough. Can a computer virus change and adapt itself? Biological life also involves random mutations, some beneficial, some not. So an artificial intelligence would also need to include random mutations as well as the ability to learn and adapt. Or on the other hand, is the ability to learn and adapt the result of random mutation and natural selection?
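The "random mutation plus selection" loop described here can be sketched directly. Below is a toy evolutionary run; the fitness function is an invented stand-in that just counts characters matching a target string, which is obviously nothing like fitness in the wild:

```python
import random

TARGET = "intelligence"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(s):
    # Count positions that already match the target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.1):
    # Random mutation: each character has a small chance of flipping.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

def evolve(pop_size=50, generations=500):
    # Start from a random population of strings.
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if pop[0] == TARGET:
            break
        survivors = pop[: pop_size // 2]                         # selection
        offspring = [mutate(random.choice(survivors)) for _ in survivors]
        pop = survivors + offspring                              # next generation
    return max(pop, key=fitness)

print(evolve())
```

Note that this only "adapts" toward a fixed, externally supplied goal; the open question in the post is what happens when mutation and selection operate without anyone writing the fitness function.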
AI is also a special interest of mine, so you shouldn't feel bad.
That has been done, with mixed results (and for obvious reasons the programs were kept strictly isolated from other machines).
edited 16th Jul '12 8:46:20 PM by EdwardsGrizzly
Ok, let me stop you there. This isn't about Artificial Life. This is about Artificial Intelligence. Something can be intelligent without fitting the qualifications of life. It can be intelligent without reproducing even once.
Ok, ok, I won't feel bad about this.
edited 16th Jul '12 8:49:30 PM by deathpigeon
AI in a petri dish. I wonder, what if somebody has already invented an intelligent virus that has spread across the 'Net, but as part of its self-programming it learned to keep itself hidden and not directly interfere in the systems it inhabits. Much like a life form that establishes equilibrium with its environment.
Or it was specifically designed to do so, so that once it infected a big enough proportion of the world its designer could activate it and RULE THE WORLD!
...and now I just pictured a massive Thirty Xanatos Pileup happening where all the world's computer geeks reveal their hidden botnets one by one, and it suddenly becomes clear why computers are so ridiculously unreliable.
Computer geeks already rule the world. Well, "rule" in just about every important sense.
Shhhhh... Don't spoil the surprise.
Plus, do you want that to happen before you've written your own?
Nah, when it happens the conflicting orders will make every machine on the planet explode, so my strategy is to make friends with all the History majors and join the sharp, pointy stick club afterwards.
My plan is to create one that can order the others around, and then RULE THE WORLD! MUAHAHAHAHA!
I have a question: do you think it more likely that an AI will be created via accidental emergence, or purposeful design? 'Cause this debate doesn't have enough similarity to a creationist debate for my tastes.
edited 12th Jul '12 5:01:11 PM by OhnoaBear
"The marvel is not that the Bear posts well, but that the Bear posts at all."