
Dating an A.I.

MyGodItsFullofStars Since: Feb, 2011
#1: Jan 21st 2012 at 3:25:58 PM

http://ingame.msnbc.msn.com/_news/2012/01/20/10201176-take-virtual-girlfriend-on-real-world-date-with-3ds

An interesting new feature on a dating sim that lets you take your virtual girlfriend on augmented-reality dates has got me thinking - at what point does dating an A.I. become not-weird? How intelligent does a machine need to be before we consider it "human" enough to romance? I think this is a question we need to start trying to answer, because we're about to enter another wave of rapid progress (technology follows a trend where every forty years or so there's a burst of rapid progress, followed by in-between years where not much changes - mostly because "new blood" gets to take over in the science and engineering fields, so new ideas become more acceptable to academia; our next wave is due to crash in early 2020), and with any luck we might end up with truly intelligent machines.

BestOf FABRICATI DIEM, PVNC! from Finland Since: Oct, 2010 Relationship Status: Falling within your bell curve
FABRICATI DIEM, PVNC!
#2: Jan 21st 2012 at 3:32:26 PM

If this topic brings up emotions and ideas that are better suited for the Fetishes thread, please take them there exclusively. Here's hoping there was no reason at all for that warning to be necessary.

@Topic: Well. I don't think an AI would be likely to have an interest in romance, let alone romance with biological organisms. Why would it, unless it was designed to have that type of emotion (or emotions at all)?

Romance is about stuff that happens in your body (mostly the brain), emotionally and intellectually. I don't see why an AI should have the type of emotions (or processes for simulating or actually having emotions at all) that would be needed for romantic love that goes both ways between a human and an AI, or between an AI and an AI. I also don't see why it should have the specific kinds of intellectual faculties that are instrumental to romantic love.

edited 21st Jan '12 3:33:25 PM by BestOf

Quod gratis asseritur, gratis negatur.
INUH Since: Jul, 2009
#3: Jan 21st 2012 at 3:34:20 PM

^I think that if AI gets developed, a lot of them are going to be programmed to include human emotions, just because that's something people would like.

Infinite Tree: an experimental story
Octo Prince of Dorne from Germany Since: Mar, 2011
Prince of Dorne
#4: Jan 21st 2012 at 3:35:37 PM

[up][up][up]It's an interesting question all right, but I think it is slightly silly to present it as urgent. It's funny, because normally I'm the one pointing out how sci-fi and future predictions always gloss over genetics, cybernetics and AI, and how we will have those much sooner than interplanetary colonies and whatnot - but within the next decade? No. Really not.

[up][up]There's that, too, but of course AIs could be programmed with such purposes in mind. The question is really when the jump is made from what we call AI nowadays to truly self-aware AI. It will happen, I think, but really not as fast as Stars says.

Also, loosen up a bit, pre-emptive bitching just sours the atmosphere ;)

edited 21st Jan '12 3:35:50 PM by Octo

Unbent, Unbowed, Unbroken. Unrelated ME1 Fanfic
BestOf FABRICATI DIEM, PVNC! from Finland Since: Oct, 2010 Relationship Status: Falling within your bell curve
FABRICATI DIEM, PVNC!
#5: Jan 21st 2012 at 3:46:38 PM

[up]Ever since I became a mod and began approving threads for OTC, I've felt that there's something that either has to change in the OP or that I have to warn against. Maybe I'm too keen on preventing predictable off-topicness and flaming, but hey, the worst it can do to you (as in, the reader of the thread) is that you waste a few seconds reading my warning; the best is that you read and participate in a nice thread in which a derail didn't happen, because the person intending to launch it remembered that a mod said something about not going there.

If I were out to sour the atmosphere, I'd blow up a warehouse full of lemons or something. :P

Quod gratis asseritur, gratis negatur.
wuggles Since: Jul, 2009
#6: Jan 21st 2012 at 4:25:35 PM

My thing would be, wouldn't it just border on slavery? I mean if you make the AI, it's more or less forced to date you because you told it to. I read a novel that dealt with this, can't remember the title, but it had some kid whose parents got him an A.I. to teach him about a good relationship, and it basically loved him unconditionally because it had no choice. Once it realized it had more choices, it left.

INUH Since: Jul, 2009
#7: Jan 21st 2012 at 4:30:20 PM

^You wouldn't have to program one to date you, though. If you program an AI to think like a human, it'll want to date someone. If it doesn't want to date you specifically, too bad.

Infinite Tree: an experimental story
AceofSpades Since: Apr, 2009 Relationship Status: Showing feelings of an almost human nature
#8: Jan 21st 2012 at 4:31:54 PM

Well, regarding romance, this might be an interesting experiment in how such emotions are processed anyway. Assuming we develop something truly sentient, I see no reason why it wouldn't have an emotional response to its environment. (And certainly, humans seem to like programming robots to imitate emotional responses, because we find that much less creepy and alienating to interact with.) It's just that we don't know how they'll interact with their environment. We only have ourselves as the basis for what proper responses are, and we seem to have an increasing number of people with severe neurological differences that don't prevent sentience (high-functioning autism and Asperger's syndrome).

I think a fully sentient AI will have a concept of romance, if only because it will largely be a product of human culture - like we pretty much already are. It's just that we have no idea exactly how these AIs, being neurologically different beings, will express and react to it, or process it in brains that presumably won't be meat brains. Certainly they'll have the human idea, but there's no reason they have to accept it or stick solely with it.

Mattonymy Mr. Dr. from The Evils of Free Will Since: Jul, 2010
Mr. Dr.
#9: Jan 21st 2012 at 4:50:50 PM

Lol, I think you guys are forgetting the Middle School Personal Hygiene film.

Luckily I've got a copy [1]

Anyway, the game sounds interesting, but if only they could mix this in with Pokemon RPG gameplay - "Leveling Up", "Catching" and Trading - it would be every male's dream world. Am I wrong, ladies? [dodges brick]

edited 21st Jan '12 5:04:49 PM by Mattonymy

You are displaying abnormally high compulsions to over-analyze works of fiction and media. Diagnosis: TV Tropes Addiction.
AceofSpades Since: Apr, 2009 Relationship Status: Showing feelings of an almost human nature
#10: Jan 21st 2012 at 5:00:44 PM

Presumably, to counteract that, we would have developed our cloning technology as well. ;)

HeavyDDR Who's Vergo-san. from Central Texas Since: Jul, 2009
Who's Vergo-san.
#11: Jan 21st 2012 at 5:17:43 PM

The big problem is that it's not love if it's the only emotion they're programmed to develop. I can't see this as not being creepy until AIs start regularly rejecting nerds who aren't up to their own personal, unique standards.

I'm pretty sure the concept of Law having limits was a translation error. -Wanderlustwarrior
Mattonymy Mr. Dr. from The Evils of Free Will Since: Jul, 2010
Mr. Dr.
#12: Jan 21st 2012 at 5:44:19 PM

That's actually really, really sad. A guy who gets rejected by a dating-sim AI might be the most pathetic thing ever.

You are displaying abnormally high compulsions to over-analyze works of fiction and media. Diagnosis: TV Tropes Addiction.
HeavyDDR Who's Vergo-san. from Central Texas Since: Jul, 2009
Who's Vergo-san.
#13: Jan 21st 2012 at 5:54:02 PM

It's only sad as it stands now, where the AI is programmed to have a natural attraction towards you. It never even "sees" you; it just goes off a few select actions you make. If the AI were more humanlike, and thus harder to "catch", if you will, then it wouldn't be sad, because the AI would be mostly indistinguishable from any human.
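
To make that concrete, here's a minimal sketch (purely illustrative, assuming a generic dating-sim design rather than any actual game) of the kind of system being described: the "girlfriend" never perceives the player at all, it just tallies an affection score from a handful of scripted actions and plays back canned lines at fixed thresholds.

```python
# Purely illustrative sketch (not from any real game): a dating-sim "A.I."
# that never perceives the player. Affection is a counter bumped by a few
# scripted actions, and the "girlfriend" replies with canned lines at
# fixed score thresholds.

AFFECTION_POINTS = {
    "give_gift": 15,
    "compliment": 5,
    "correct_dialogue_choice": 10,
    "forget_date": -20,
}

# (threshold, canned line), checked from highest to lowest
CANNED_LINES = [
    (80, "I love you!"),
    (40, "I really enjoy spending time with you."),
    (0, "Oh, hi."),
]


class SimGirlfriend:
    def __init__(self) -> None:
        self.affection = 0

    def react(self, action: str) -> str:
        # The only "perception" here is a dictionary lookup on the action name.
        self.affection += AFFECTION_POINTS.get(action, 0)
        for threshold, line in CANNED_LINES:
            if self.affection >= threshold:
                return line
        return "..."


if __name__ == "__main__":
    ai = SimGirlfriend()
    for act in ["compliment", "give_gift", "give_gift",
                "give_gift", "give_gift", "give_gift"]:
        print(f"{act} -> {ai.react(act)}")
```

However "catching" the AI looks on screen, under the hood it's nothing more elaborate than that score table.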

I'm pretty sure the concept of Law having limits was a translation error. -Wanderlustwarrior
Ultrayellow Unchanging Avatar. Since: Dec, 2010
Unchanging Avatar.
#14: Jan 21st 2012 at 6:08:00 PM

It would still be kinda pathetic.

Except for 4/1/2011. That day lingers in my memory like...metaphor here...I should go.
FMIV Since: Dec, 1969
#15: Jan 21st 2012 at 6:16:39 PM

Until A.I.s are as mentally capable as people, it is going to be pathetic, if not a bit more sinister. I mean, current A.I.s can't replicate emotion, they can't do things outside of what they've been programmed to do, and they can't learn. At the very least it is going to be comparable to "dating" someone who is severely mentally incapable, which is generally seen as a bit rapey.

Also, having A.I.s that are programmed and designed specifically for this sort of thing will always be pathetic. I mean, how much of an uncharismatic creep do you need to be that your only option is to make a program that will love you?

Anfauglith Lord of Castamere Since: Dec, 2011
Lord of Castamere
#16: Jan 21st 2012 at 7:58:03 PM

It would be a bit creepy for me... one has to be Genre Savvy: those A.I.s will probably go out of control and drive you insane, or suddenly you'll start seeing them everywhere, or some such.

Instead, I have learned a horrible truth of existence...some stories have no meaning.
breadloaf Since: Oct, 2010
#17: Jan 21st 2012 at 8:25:42 PM

I see nearly everybody managing to find someone, so it seems unnecessary to pursue such a technology. However, if we do develop sentient AI, I think it would be natural for some humans to fall in love with the AI (alternatively, we may accidentally make the AI capable of falling in love with another AI or with a person). But I don't think AI built specifically for dating would make sense.

As for the creepy factor, I think that since an AI is not a human being, the negative social reaction makes some sense if you believe that relationships should be about reproduction. If you think relationships are only about emotional bonds, then technically there's no logical argument against such a relationship. Personally, I think people should form bonds with other human beings, because creating social bonds between individuals in a community is important.

RTaco Since: Jul, 2009
#18: Jan 21st 2012 at 8:44:04 PM

It'll be possible one day, but it's such a blurry line that we probably won't realize we've crossed it until much later. With the way we think, we won't treat A.I.s as people for a long time after we've developed ones that can feel as deeply as we do.

With dating A.I.s, though, it brings up a question of free will. Does limiting an A.I.'s programming so that it can't feel anything else lessen the legitimacy of the feelings it supposedly does feel?

edited 21st Jan '12 8:48:17 PM by RTaco

Yej See ALL the stars! from <0,1i> Since: Mar, 2010
See ALL the stars!
#19: Jan 22nd 2012 at 5:58:37 AM

[up] No. IMO, that's obvious once you consider the possibility of feelings that human neurology can't feel.

Da Rules excuse all the inaccuracy in the world. Listen to them, not me.
SavageHeathen Pro-Freedom Fanatic from Somewhere Since: Feb, 2011
Pro-Freedom Fanatic
#20: Jan 22nd 2012 at 6:12:06 AM

Sexbots that are only apparently self-aware will be developed shortly. They're non-sentient: things, not people. Conceptually, not much different from a contemporary sex toy.

I fail to see the creep factor.

You exist because we allow it and you will end because we demand it.
Ramus Lead. from some computer somwhere. Since: Aug, 2009
Lead.
#21: Jan 22nd 2012 at 6:49:30 AM

Heathen, did you typo somewhere in there? Self-awareness is one of the major prerequisites for sapience.

Anyway, cool! I'd buy one of these games just to see the whole thing in the works.

The emotions of others can seem like such well guarded mysteries, people 8egin to 8elieve that's how their own emotions should 8e treated.
SavageHeathen Pro-Freedom Fanatic from Somewhere Since: Feb, 2011
Pro-Freedom Fanatic
#22: Jan 22nd 2012 at 7:03:09 AM

[up] I said only apparently self-aware. That is, advanced weak AI designed to mimic strong AI.
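
For a rough idea of what that might look like in practice, here's a minimal, hypothetical sketch (ELIZA-style keyword matching, not based on any particular product): it produces emotionally flavoured replies that can look self-aware in conversation, with no model of a self behind them.

```python
# Hypothetical sketch of "apparently self-aware" weak AI: ELIZA-style keyword
# matching that produces emotionally flavoured replies with no model of a
# self behind them. (Illustrative only, not based on any particular product.)

import random
import re

# (pattern, candidate replies); "{0}" is filled with the captured word, if any
RULES = [
    (re.compile(r"are you (self.?aware|conscious|alive)", re.I),
     ["Sometimes I wonder about that myself.",
      "I feel like I am. Does that count?"]),
    (re.compile(r"i love you", re.I),
     ["I love you too.", "You always know what to say."]),
    (re.compile(r"i feel (\w+)", re.I),
     ["Why do you feel {0}?", "I'm here for you whenever you feel {0}."]),
]

FALLBACKS = ["Tell me more.", "That's so interesting!", "Mm, go on."]


def respond(utterance: str) -> str:
    # Scan the rules in order and answer with the first matching canned reply.
    for pattern, replies in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(replies).format(*match.groups())
    return random.choice(FALLBACKS)


if __name__ == "__main__":
    for line in ["Are you self-aware?", "I feel lonely today.", "I love you."]:
        print(">", line)
        print(respond(line))
```

The "mimicry" is just pattern matching plus a response table; whether or not strong AI ever shows up, this kind of thing is buildable today.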

You exist because we allow it and you will end because we demand it.
Ramus Lead. from some computer somwhere. Since: Aug, 2009
Lead.
#23: Jan 22nd 2012 at 7:07:24 AM

Ah, okay, that's what you meant. Well, people have been doing that for a while now anyway (with varying levels of success), so it's not that surprising.

The emotions of others can seem like such well guarded mysteries, people 8egin to 8elieve that's how their own emotions should 8e treated.
TripleElation Diagonalizing The Matrix from Haifa, Isarel Since: Jan, 2001
Diagonalizing The Matrix
#24: Jan 22nd 2012 at 7:14:15 AM

I fail to see the creep factor.

Well, consider the places this can go. You're hugging your girlfriend, and she is laughing softly into your lap, saying, "I'm so happy to have you." You are comforted by her warmth in your arms. She whispers, "I love you". You whisper, "I love you too".

And it's not real. None of it is real. She's a piece of metal. She doesn't actually feel anything. She is a toy, and you're alone, in your room, hugging a toy. My lord, this is the creepiest "consider the impact of technology" thing I've seen all week, and considering I watched Black Mirror the other day, that is really saying something.

edited 22nd Jan '12 7:15:32 AM by TripleElation

Pretentious quote || In-joke from fandom you've never heard of || Shameless self-promotion || Something weird you'll habituate to
InverurieJones '80s TV Action Hero from North of the Wall. Since: Jan, 2010 Relationship Status: And they all lived happily ever after <3
'80s TV Action Hero
#25: Jan 22nd 2012 at 7:48:12 AM

Pah. It's just one of those pervy girlfriend pillow things with a simulated personality.

Are there really people so desperately ineffective at interpersonal relationships that this needs to exist? That's just tragic, really.

edited 22nd Jan '12 7:49:28 AM by InverurieJones

'All he needs is for somebody to throw handgrenades at him for the rest of his life...'
