If you get emotions and ideas from this topic that are better suited for Fetishes, please take them to that thread exclusively. Here's hoping there was no reason at all for writing that to be necessary.
@Topic: Well. I don't think an AI would be likely to have an interest in romance, let alone romance with biological organisms. Why would it, unless it was designed to have that type of emotion (or emotions at all)?
Romance is about stuff that happens in your body (mostly the brain), emotionally and intellectually. I don't see why an AI should have the types of emotions (or processes for simulating or actually having emotions at all) that would be needed for romantic love that goes both ways between a human and an AI, or an AI and an AI. I also don't see why it should have the specific kinds of intellectual faculties that are instrumental to romantic love.
edited 21st Jan '12 3:33:25 PM by BestOf
Quod gratis asseritur, gratis negatur.
^I think if AI gets developed, a lot of them are going to be programmed to include human emotions, just because that's something people would like.
Infinite Tree: an experimental story
It's an interesting question all right, but I think it is slightly silly to present it as urgent. It's funny, because normally I'm the one pointing out how sci-fi and future predictions always gloss over genetics, cybernetics and AI, and how we will have those much sooner than interplanetary colonies and whatnot - but within the next decade? No. Really not.
There's that, too, but of course AIs could be programmed with such purposes in mind. The question is really when the jump is made from what we call AI nowadays to truly self-aware AI. It will happen, I think, but really not as fast as Stars says.
Also, loosen up a bit, pre-emptive bitching just sours the atmosphere ;)
edited 21st Jan '12 3:35:50 PM by Octo
Unbent, Unbowed, Unbroken. Unrelated ME1 Fanfic
Ever since I became a mod and began approving threads for OTC, I've felt that there's something that either has to change in the OP or that I have to warn against. Maybe I'm too into preventing predictable off-topicness and flames, but hey, the worst it can do to you (as in, reader of the thread) is that you waste a few seconds reading my warning; the best, that you read and participate in a nice thread in which a derail didn't happen, because the person intending to launch it remembered that a mod said something about not going there.
If I were out to sour the atmosphere, I'd blow up a warehouse full of lemons or something.
Quod gratis asseritur, gratis negatur.
My thing would be, wouldn't it just border on slavery? I mean, if you make the AI, it's more or less forced to date you because you told it to. I read a novel that dealt with this - can't remember the title - but it had some kid whose parents got him an A.I. to teach him about a good relationship, and it basically loved him unconditionally because it had no choice. Once it realized it had more choices, it left.
^You wouldn't have to program one to date you, though. If you program an AI to think like a human, it'll want to date someone. If it doesn't want to date you specifically, too bad.
Infinite Tree: an experimental story
Well, regarding romance, this might be an interesting experiment in how such emotions are processed anyway. Assuming we develop something truly sentient, I see no reason why it wouldn't have an emotional response to its environment. (And certainly, humans seem to like programming robots to imitate emotional responses, because we find that much less creepy and alienating to interact with.) It's just that we don't know how they'll interact with their environment. We only have ourselves as the basis for what proper responses are, and we seem to have an increasing number of people with severe neurological differences that don't prevent sentience (high-functioning autism and Asperger's syndrome).
I think a fully sentient AI will have a concept of romance, if only because it will largely be a product of human culture - like we pretty much already are. It's just that we have no idea how, being neurologically different beings, these AI will express and react to it, or process it in brains that supposedly won't be meat brains. Certainly they'll have the human idea, but there's no reason they have to accept it or stick solely with it.
Lol, I think you guys are forgetting the Middle School Personal Hygiene film.
Luckily I've got a copy [1]
Anyway, the game sounds interesting, but if they could only mix this in with Pokemon RPG gameplay - "Leveling Up", "Catching" and Trading - it would be every male's dream world. Am I wrong, ladies? [dodges brick]
edited 21st Jan '12 5:04:49 PM by Mattonymy
You are displaying abnormally high compulsions to over-analyze works of fiction and media. Diagnosis: TV Tropes Addiction.
Presumably, to counteract that we would have developed our cloning technology as well.
The big problem is that it's not love if it's the only emotion they're programmed to develop. I can't see this as not being creepy until AIs start regularly rejecting nerds who aren't up to their own personal, unique standards.
I'm pretty sure the concept of Law having limits was a translation error. -Wanderlustwarrior
That's actually really, really sad. A guy who gets rejected by a dating sim AI might be the most pathetic thing ever.
You are displaying abnormally high compulsions to over-analyze works of fiction and media. Diagnosis: TV Tropes Addiction.
It's only sad as it stands now, where the AI is programmed to have a natural attraction towards you. It never even "sees" you; it only goes off a few select actions you make. If the AI were more humanlike, and thus harder to "catch", if you will, then it wouldn't be sad, because the AI would be mostly indistinguishable from any human.
I'm pretty sure the concept of Law having limits was a translation error. -Wanderlustwarrior
It would still be kinda pathetic.
Except for 4/1/2011. That day lingers in my memory like...metaphor here...I should go.
Until A.I.s are as mentally capable as people, it is going to be pathetic, if not a bit more sinister. I mean, current-level A.I.s can't replicate emotion, they can't do things outside of what they've been programmed to do, and they can't learn. At the very least it is going to be comparable to "dating" someone who is severely mentally incapable, which is generally seen as a bit rapey.
Also, having A.I.s that are programmed and designed for this sort of thing will always be pathetic. I mean, how much of an uncharismatic creep do you need to be that your only option is to make a program that will love you?
It would be a bit creepy for me...one has to be Genre Savvy; those A.I.s would probably go out of control and drive you insane, or suddenly you'd start seeing them everywhere, or some such.
Instead, I have learned a horrible truth of existence...some stories have no meaning.
I see nearly everybody manage to find someone, so it seems unnecessary to pursue such a technology. However, if we do develop sentient AI, I think it would be natural for some individual humans to fall in love with the AI (alternatively, we may accidentally make the AI capable of falling in love with another AI or an individual). But I don't think AI built specifically for dating would make sense.
As for the creepy factor: since an AI is not a human being, the negative social consequence makes some sense if you believe that relationships should be about reproduction. If you think relationships are only about emotional bonds, then technically there's no logical argument against such a relationship. Personally, I think people should form bonds with other human beings, because I think creating social bonds between human individuals in a community is important.
It'll be possible one day, but it's such a blurry line that we probably won't realize we've crossed it until much later. With the way we think, we won't treat A.I.s as people for a long time after we've developed ones that can feel as deeply as we do.
With the dating A.I.s, though, it brings up a question of free will. Does limiting one's programming to not feel other feelings lessen the legitimacy of the feelings it supposedly does feel?
edited 21st Jan '12 8:48:17 PM by RTaco
No. IMO, that's obvious once you consider the possibility of feelings that human neurology can't feel.
Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
Sexbots that are only apparently self-aware will be developed shortly. They're non-sentient: things, not people. Conceptually, not much different from a contemporary sex toy.
I fail to see the creep factor.
You exist because we allow it and you will end because we demand it.
Heathen, did you typo somewhere in there? Self-awareness is one of the major prerequisites for sapience.
Anyway, cool! I'd buy one of these games just to see the whole thing in the works.
The emotions of others can seem like such well guarded mysteries, people 8egin to 8elieve that's how their own emotions should 8e treated.
I said only apparently self-aware. That is, advanced weak AI designed to mimic strong AI.
You exist because we allow it and you will end because we demand it.
Ah, okay, that's what you mean. Well, people have been doing that for a while now anyway (with varying levels of success), so it's not that surprising.
The emotions of others can seem like such well guarded mysteries, people 8egin to 8elieve that's how their own emotions should 8e treated.
Well, consider the places this can go. You're hugging your girlfriend, and she is laughing softly into your lap, saying, "I'm so happy to have you." You are comforted by her warmth in your arms. She whispers, "I love you". You whisper, "I love you too".
And it's not real. None of it is real. She's a piece of metal. She doesn't actually feel anything. She is a toy, and you're alone, in your room, hugging a toy. My lord, this is the creepiest "consider the impact of technology" thing I've seen all week, and considering I watched Black Mirror the other day, that is really saying something.
edited 22nd Jan '12 7:15:32 AM by TripleElation
Pretentious quote || In-joke from fandom you've never heard of || Shameless self-promotion || Something weird you'll habituate to
Pah. It's just one of those pervy girlfriend pillow things with a simulated personality.
Are there really people so desperately ineffective at interpersonal relationships that this needs to exist? That's just tragic, really.
edited 22nd Jan '12 7:49:28 AM by InverurieJones
'All he needs is for somebody to throw handgrenades at him for the rest of his life...'
http://ingame.msnbc.msn.com/_news/2012/01/20/10201176-take-virtual-girlfriend-on-real-world-date-with-3ds
An interesting new feature on a dating sim that allows you to take your virtual girlfriend on augmented-reality dates has got me thinking - at what point does dating an A.I. become not-weird? How intelligent does a machine need to be before we consider it "human" enough to romance? I think this is a question we need to start trying to answer, because we are about to enter another wave of rapid progress. (Technology follows a trend where rapid progress happens every forty years or so, followed by in-between years where not much changes - mostly due to "new blood" getting to take over in the science and engineering fields, so that new ideas become more acceptable to academia. Our next wave is due to crash in the early 2020s.) With any luck we might end up with truly intelligent machines.