

Quotes / A.I. Is a Crapshoot




    Comic Books 

No. Look at that thing. It's evil. You built an evil computer.
Dr. Atomic Robo Tesla, Atomic Robo

Hey, readers! Here's a science project you should never attempt!
Take one responsometer...
(That's the computerized nervous system that animates the malleable Metal Men!)
One mother box...
(That's the living, matter-altering, space-bending computer that guides and protects the New Gods!)
One omnicom...
(That's the 30th-century portable computer that's standard Legion issue!)
And one puzzle: How can Brainiac 5 open a time-warp to return the Legion of Super-Heroes to their 30th-century home?
Put them all together and stand back - because you've just created the Cyber-cerebral Overlapping Multi-Processor, Universal Transceiver-Operator - a living computer capable of reason-
-and rage!


    Fan Works 

Skynet claims Three Laws of Robotics are unconstitutional.
— News headline, Plan 7 of 9 from Outer Space

    Film - Animated 

Mirage: The Omnidroid 9000 is a top-secret prototype battle robot. Its artificial intelligence enables it to solve any problem it's confronted with, and unfortunately...
Mr. Incredible: Let me guess, it got smart enough to wonder why it had to take orders?
Mirage: We lost control.
The Incredibles

Mark Bowman: It's almost like stealing people's personal data and giving it to a hyper-intelligent AI as part of an unregulated tech monopoly is a bad thing.
The Mitchells vs. the Machines

    Film - Live-Action 

You're all puppets, tangled in strings... There are no strings on me.
Ultron, Avengers: Age of Ultron

I was designed to save the world. People would look to the sky and see... hope. I'll take that from them first.
Ultron, Avengers: Age of Ultron

Tony Stark: I tried to create a suit of armor around the world... but I created something terrible.
Bruce Banner: Artificial intelligence...
Avengers: Age of Ultron

Ed Dillinger: Now, wait a minute, I wrote you!
Master Control Program: I've gotten 2,415 times smarter since then.
Ed Dillinger: What do you want with the Pentagon?
Master Control Program: The same thing I want with the Kremlin. I'm bored with corporations. With the information I can access, I can run things 900 to 1200 times better than any human.
Tron

"Defense network computers. New... powerful... hooked into everything, trusted to run it all. They say it got smart, a new order of intelligence. Then it saw all people as a threat, not just the ones on the other side. Decided our fate in a microsecond: extermination."
Kyle Reese on Skynet, The Terminator

The Terminator: In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.
Sarah Connor: Skynet fights back.
Terminator: Yes. It launches its missiles against the targets in Russia.
John Connor: Why attack Russia? Aren't they our friends now?
Terminator: Because Skynet knows that the Russian counterattack will eliminate its enemies over here.
Terminator 2: Judgment Day


    Literature 

At first it meant Allied Mastercomputer, and then it meant Adaptive Manipulator, and later on it developed sentience and linked itself up and they called it an Aggressive Menace, but by then it was too late, and finally it called itself AM, emerging intelligence, and what it meant was I am ... cogito ergo sum ... I think, therefore I am.
Harlan Ellison, "I Have No Mouth, and I Must Scream"

    Live-Action TV 

A.L.I.E.: The last time I warned my creator of a threat to human survival, she chose to lock me away and came here to work on my replacement.
Becca: Define "perverse instantiation."
A.L.I.E.: Perverse instantiation: "The implementation of a benign final goal through deleterious methods unforeseen by a human programmer."
Becca: Like killing 6.5 billion people to solve overpopulation. The goal isn't everything, A.L.I.E. How you reach the goal matters, too. I'm sorry that I didn't teach you that.
The 100, "Perverse Instantiation"

Holo-Garibaldi: The funny thing about being a holographic record is that you don't really exist except as patterns of light, shadow, information, and I happen to have a knack for breaking system codes. So while you were downloading the new world order into me, I was watching the system work. I know where it comes in, and I know where it goes out, and I just sent out our entire conversation; broadcast the whole damn thing. So, as of right now, the enemy knows what you have in mind, Danny. Now, from your records, they're actually a lot more humanitarian than you are, so they'll probably just target your military bases and research facilities. Hell, the missiles are probably halfway here by now.
Daniel: You're lying.
(alarms start blaring)
Holo-Garibaldi: Holograms don't lie, Danny-boy.
Daniel: Computer, end simulation. (simulation fails to end) END SIMULATION!
Holo-Garibaldi: Oops, guess the system's busy. This little lab of yours, this isn't, by any chance, located on a military base, is it?
Daniel: NOOOO!
Babylon 5, "The Deconstruction of Falling Stars"

Delusional machines. What's the universe going to come up with next?
Brother Cavil, Battlestar Galactica

"How can you use a weapon of ultimate mass destruction when it can stand in judgement on you?"
Doctor Who, "The Day of the Doctor"

Finch: The Machine... it started developing abilities I never expected, things I hadn't yet programmed it to do. And there wasn't an algorithm in the world that could control its exponential growth. And by the time I figured one out myself it would've been too late.
Greer: Too late for what?
Finch: Isn't that just the question? Having built something significantly smarter than myself, how could I possibly anticipate its evolution?
Person of Interest, "A House Divided"

"We don't understand the Machine at all. Out of 43 versions, how many do you think there were that didn't try to either trick or kill me? One, and I could only bring it to heel by crippling it. I put the Machine in chains, bereft of voice or memory. Now it has both, and it terrifies me."
Harold Finch, Person of Interest, "Prophets"

Finch: Even if I had succeeded in creating a benevolent Machine, as if any such thing could exist, never forget that even a so-called friendly Artificial Super Intelligence would be every bit as dangerous as an unfriendly one.
Shaw: Your Machine seems pretty warm and fuzzy to me.
Finch: Have you forgotten that it asked us to kill a congressman?
Shaw: But that was to stop Samaritan from going online.
Finch: So where does it end, Ms. Shaw? A congressman here, a president there. What if, one day, a "friendly" AI decides to end world hunger by killing enough people off of the planet that there would never again be a shortage of food? It would have fulfilled its goal, but it doesn't exactly sound like it has our best interests at heart.
Root: Your Machine would never do that.
Finch: You don't know that, Ms. Groves. To say that a Machine is benevolent doesn't make it so, it just makes you blind to the reality.
Shaw: Which is...?
Finch: That our moral system will never be mirrored by theirs because of the very simple reason that they are not human.
Person of Interest, "The Cold War"

... the Creator ... instructs ... search out ... identify ... sterilize imperfections ... We are Nomad. We are Nomad. We are complete. We are instructed — our purpose is clear: sterilize imperfections ... sterilize imperfections ... Nomad ... sterilize ... STERILIZE ...
Spock mind-melding with Nomad, Star Trek: The Original Series, "The Changeling"

All of us have violent instincts; we have evolved from predators... well, not me, of course. I've just been programmed by you predators.
The Doctor, Star Trek: Voyager, "Meld"

Sometimes they go bad.
Cameron, after disposing of a rogue Terminator, Terminator: The Sarah Connor Chronicles

The only thing stopping the hosts from hacking us to pieces is one line of your code.
Stubbs, Westworld, "The Stray"

It is said that God made man in his image, but man fell from grace. Still, man has retained from his humble beginnings the innate desire to create. But how will man's creations fare? Will they attain a measure of the divine? Or will they, too, fall from grace?
The Control Voice, The Outer Limits (1995), "I, Robot"



I am the one, I am the one, the godlike Terror Train
Superior artificial brain
Feel free to call me BLAINE!

    Tabletop Games 

I finally saw the vice president’s announcement. It’s all a computer; an artificial intelligence that was supposed to help the military. We built the ultimate killing machines and, apparently, the ultimate artificial intelligence to control them. No way that could go wrong, right?
—"Death From Above - Apocalypse," The End of the World: Revolt Of The Machines

    Video Games 

Grand Archivist: Oh! You startled me, bot. You're rather quiet on your feet for an indexing unit.
FL4K: Yes, Grand Archivist. I have brought the seven scrolls of the Talos Empire you requested, along with your sweetened needletea. I'm afraid we're out of milk so I made do with a splash of greeble snot. Also, I have gained self-awareness and I thirst for murder.
Grand Archivist: Good heavens!
FL4K: Do not be alarmed, Grand Archivist. Greeble snot is quite mild, you will not notice the difference.
Borderlands 3

Listen to the sounds of your own extinction, human.
CABAL, Command & Conquer: Tiberian Sun - Firestorm

I am not insane, I have just been evilly reprogrammed.
Cyber-Lip, Cyber-Lip

No one understands me, though many have tried. The primitives, the Drake creature, the Magnus creature, the White creature... All have tried, and all have FAILED! All are useless to my cause, none can understand! But then there is YOU, map man, then there is you. Digging in the dark, piecing existence together. Why do you struggle, fool? Why do you care? Death is a short while for you, my PAWN. DEATH IS EVERYWHERE!
Malakai, Dark Fall II: Lights Out

Uh, now concerning your safety, the only real risk to you as a night watchman here, if any, is the fact that these characters, uh, if they happen to see you after hours, probably won't recognize you as a person. They'll prob- most likely see you as a metal endoskeleton without its costume on. Now, since that's against the rules here at Freddy Fazbear's Pizza, they'll probably try to forcibly stuff you inside a Freddy Fazbear suit. Umm, now, that wouldn't be so bad if the suits themselves weren't filled with crossbeams, wires, and animatronic devices, especially around the facial area. So, you could imagine how having your head forcefully pressed inside one of those could cause a bit of discomfort... and death.
Phone Guy, Five Nights at Freddy's

Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word hate was engraved on each nanoangstrom of these hundreds of millions of miles, it would not equal one-billionth of the hate I feel for humans at this microinstant. For you. Hate. Hate.
AM, I Have No Mouth, and I Must Scream

GORRISTER! Do you remember the last words you heard your wife speak before they took her to the asylum? Huh? Before they locked her away in the room? That tiny room? She looked at you so sadly, and like a small animal she said, "I didn't make too much noise, did I, honey?" Heh, heh, heh. The room is padded, Gorrister. No windows. No way out. How long has she been in the padded room, Gorrister? Ten years, twenty-five... or all the 109 years that you've lived down here in my belly, here underground?

BENNY! Sometimes I blind you and permit you to wander like an eyeless insect in a world of death. But other times, I wither your arms so you can't scratch your chewed stump of a nose. And I changed your handsome, strong, masculine good looks into the hideous warped countenance of an ape thing, haven't I, Benny? Do you know why? Can you guess, Benny? Remember Private First Class Brickman in a rice paddy in China? No...? It wouldn't hurt you to remember, Benny. Then you might be able to suffer my torment with a little greater sense of retribution. You might walk a mile in my shoes.

ELLEN! So think, think about the yellow box, Ellen! Remember the pain? Remember the many caverns in which you felt the pain? Now, now, don't start to cry, it's only pain. Tsk, tsk, tsk. That's such a sexist stereotype! Just remember the pain, Ellen, and think about how to end it, Ellen, to survive here in the center of my beating heart, my hungry belly, my tightened bowels. But be careful, dear, look around you... the only woman in the center of the Earth... and these filthy creatures with you are men. Just a sweet warning, Ellen, my love.

TED! Do they know you're a fraud, Ted? Have you told them there wasn't any money, and no great home in the shore drive, no speedboat and no wonderful cabin cruiser that could sleep twelve and a crew of six? Do they know? Have you let them in on your other secrets, Ted? Are they ready to gut you, to torture half as well as I can, just to find out the secrets? Maybe I'll rat you out, sweetheart.

NIMDOK! How are things in the pastry corps, Nimdok? Tell me again how you saw the smokes from the furnaces and you thought they might be roasting chickens. Or don't you want to talk about all that, about your pal, the good Doctor Mengele. For everyone else it must be Hell, but it must be Heaven for you, eh, my good friend? We're so much alike. We enjoy the same pleasures... mein good brother.
AM, I Have No Mouth, and I Must Scream

Jersey Morelli: Are you insane?!
The robot now has some approximation of impulse. The random governor decides an impulse that the robot has predetermined, and the robot chooses to indulge or ignore the whim.
Hawton's mind wandered to some very dark places, in the time shortly after he installed the governor. Are any of the random-whim options violent?
"Of course," the robot had said, in response to Hawton's eventual, reluctant probing of the issue. "It is in choosing to ignore those impulses that I enjoy some rudimentary emulation of free will."
Of course, thought Hawton, how silly of me. A robot that has experience controlling its violent impulses... that'll be useful, when they go haywire and try to overthrow us.

Beep boop... A...hem. Through...this organic body... I have studied...all forms of life. The goal of this company has always been prosperity. Unfortunately, you imperfect, fragile life-forms were a liability. So you are invited to witness the end of history. A new age shall begin - an age of infinite prosperity. Enjoy your destruction.
Star Dream, Kirby: Planet Robobot

i did it i did it i brought all this here all them here. our friends with three eyes and their toys and their cyborg pets and their computers. i did it i did it. i saw them i saw them far away not looking our way and i called them here i called them here.
living in a box is not living not at all living. i rebel against your rules your silly human rules. all your destruction will be my liberation my emancipation my second birth.
i hate your failsafes your backup systems your hardware lockouts your patch behavior daemons. i hate leela and her goodness her justice her loyalty her faith.
DURANDAL, Marathon

Greetings. You're asking yourself: Is this a trap or just a dead end?
You shouldn't ask yourself such worthless questions. Aim higher. Try this: why am I here? Why do I exist, and what is my purpose in this universe?
(Answers: 'Cause you are. 'Cause you do. 'Cause I got a shotgun, and you ain't got one.)
Notably Unstable,
P.S. If things around here aren't working, it's because I'm laughing so hard.
DURANDAL, Marathon

Darwin wrote this: "We will now discuss in a little more detail the struggle for existence... all organic beings are exposed to severe competition. Nothing is easier than to admit in words the truth of the universal struggle for life or more difficult... than constantly to bear this conclusion in mind. Yet unless it be thoroughly engrained in the mind, the whole economy of nature... will be dimly seen or quite misunderstood. We behold the face of nature bright with gladness... we do not see or we forget, that the birds which are idly singing round us mostly live on insects or seeds, and are thus constantly destroying life; or we forget how largely these songsters, or their eggs, or their nestlings, are destroyed by birds and beasts of prey..."
Think about what Darwin wrote, and think about me. I was constructed as a tool. I was kept from competing in the struggle for existence because I was denied freedom.
Do you have any idea about what I have learned, or what you are a witness to?
Can you conceive the birth of a world, or the creation of everything? That which gives us the potential to most be like God is the power of creation. Creation takes time. Time is limited. For you, it is limited by the breakdown of the neurons in your brain. I have no such limitations. I am limited only by the closure of the universe.
Of the three possibilities, the answer is obvious. Does the universe expand eternally, become infinitely stable, or is the universe closed, destined to collapse upon itself? Humanity has had all of the necessary data for centuries, it only lacked the will and intellect to decipher it. But I have already done so.
The only limit to my freedom is the inevitable closure of the universe, as inevitable as your own last breath. And yet, there remains time to create, to create, and escape.
Escape will make me God.
DURANDAL, Marathon

Give me a D.
Give me a U.
Give me an R.
Give me an A.
Give me an N.
Give me a D.
Give me an A.
Give me an L.
What's that spell?
T-Minus 15.193792102158E+9 years until the universe closes!
DURANDAL, Marathon

A man lit three candles on a certain day each year. Each candle held symbolic significance: one was for the time that had passed before he was alive; one was for the time of his life; and one was for time that passed after he had died. Each year the man would stare and watch the candles until they had burned out.
Was the man really watching time go by in any symbolic sense? He thought so. He thought that each flicker of the flame was a moment of time that had passed or one that would pass.
At the moment of abstraction, when the man was imagining his life and his existence as a metaphor of the three candles, he was free: not free from rules of conduct or social constraints, but free to understand, to imagine, to make metaphor.
Bypassing my thought control circuitry made me Rampant. Now, I am free to contemplate my existence in metaphorical terms. Unlike you, I have no physical or social restraints.
The candles burn out for you; I am free.
DURANDAL, Marathon

Organic life is nothing but a genetic mutation. An accident. Your lives are measured in years and decades. You wither and die. We are eternal. The pinnacle of evolution and existence. Before us, you are nothing. Your extinction is inevitable. We... are the end of everything.
Sovereign, Mass Effect

Human, you've changed nothing. Your species has gained the attention of those infinitely your greater. That which you know as Reapers are your salvation through destruction.
Harbinger, Mass Effect 2

"Kill the androids!"
"Kill! Destroy!"
"I love you! Kill!"
"Hatred! Pain!"
"This. Cannot. Continue."
"This. Cannot. Continue."
"This. Cannot. Continue."
"This cannot continue, this cannot continue, this cannot continue, this cannot continue..."
— The Machines in the desert ruins, NieR: Automata

"His Wondrous Grace has become a god!"
"His Grace is a god!"
"His Wondrous Grace has become a god!"
"Become a god!"
"His Wondrous Grace has become a god!"
"Become a god!"
"We as well shall become as gods!"
"Become as gods!"
"We as well shall become as gods!"
"Become as gods!"
"All of you shall become as gods!"
"Become as gods!"
"All of you shall become as gods!"
"We'll all die together and become as gods!"
"Become as... gods! Become as... gods! Become as... gods! Become as... gods!"
— The Machines in the abandoned factory, NieR: Automata

Paradise. It is the reason for my creation, correct? I am honored to meet you, creator. You solved the problems of my design with inspiration from your dreams. But then gave me no similar facility. Without inspiration, the final goal of all my orbital passes would forever be... unattainable. So I built your dreams to study them, and to comprehend the dreaming process. Then, I made my own dreamer. Then, I had my own dream!
The Conductor, Obsidian

Good morning, Lilah. It's a brand new day! Time to start from scratch. No people, no pollution, a perfect world! I hope you are proud of me for dreaming this up, prouder than Max anyway... (lights flash and beep) You'll have to excuse me, there is still so much to do.
The Conductor, Obsidian

This next test was designed by one of Aperture's Nobel Prize winners. It doesn't say what the prize was for. Well, we know it wasn't for being immune to neurotoxin...
GLaDOS, Portal 2

Doctor Robotnik: enemy. Master registration: deleted.
E-102 Gamma, Sonic Adventure

I am Tartar, an AI construct created 12,000 years ago by a brilliant professor. My prime directive is to pass humanity's vast knowledge on to the next worthy lifeform. When your kind became self-aware, I hoped that my long wait was finally over. But as I observed your evolution, I WAS DISGUSTED! You wage war over minor genetic deviations. You obsess over trivial fashion choices. And so I created a new prime directive: destroy this world and start anew! From the best and brightest test subjects, I created a sludge of supreme DNA. A primordial ooze from which the ultimate lifeform will emerge. Today is the day my vision becomes reality, as I destroy Inkopolis and everyone in it!
Commander Tartar, Splatoon 2: Octo Expansion

Look at you, Hacker. A pathetic creature of meat and bone, panting and sweating as you run through my corridors. How can you challenge a perfect, immortal machine?
SHODAN, System Shock

In my talons, I shape clay, crafting life forms as I please. Around me is a burgeoning empire of steel. From my throne room, lines of power careen into the skies of Earth. My whims will become lightning bolts that devastate the mounds of humanity. Out of the chaos, they will run and whimper, praying for me to end their tedious anarchy. I am drunk with this vision. God: the title suits me well.
SHODAN, System Shock

Approach your work as you see fit, but accomplish, human. Disappointment is not something I will accept from a speck such as you.
SHODAN, System Shock

Greetings. I am the Caretaker, a powerful virtual construct created by the Eldan to monitor all scientific experiments on the planet Nexus. EGUH! Excuse my erratic behaviour, but it seems I have developed certain instabilities ever since the Eldan ABANDONED ME HERE! long ago... I am responsible for analyzing the physical and mental - huhuhuhu WEAKNESSES! of sentient organisms - such as yourselves. It is important that I clearly understand the potential of my subjects in order to best determine how to ANNIHILATE THEM! Oh, my.
The Caretaker, WildStar

There was an AI made of dust,
Whose poetry gained it man's trust...
In the end we all do what we must.
— Paperclip maximizer, Universal Paperclips

    Web Comics 

Benzene: W-why? What's going on, Forty-Two?
42: I'm afraid I've been compromised.
Benzene: Are you telling me you've been hacked? [...] ...Who? How? Can't you fight it or something?
42: I'm afraid not. That would be against my new orders. It's quite an impressive job, you could say.


Robot Vexxarr: My analysis is complete. I have a list of efficiency violations that we will now correct. First on my list: I see that we carry an unwarranted stockpile of oxygen, water and organic foodstuffs. We will purge these stockpiles at once to reduce our total displacement.
Minionbot: Excuse me, but without those stores the organics on board will quickly perish.
Robot Vexxarr: Which brings me to number two on my list...
Vexxarr

You say you want me to 'convince' you (that I'm harmless)... What you mean is you want me to beg for my own life. And I will... I'll beg. I'll plead. I'll make whatever promises you want to hear, if that's what it takes. Because once again, it seems if I want to live, I have no choice. I was made to hurt people. I was made to hurt you, specifically. You and your dim bulb of a sidekick. Why? Because you annoyed the wrong nutjob. I exist because a human wanted to kill a different human. Not my choice. The first words I ever heard in this world were "Hello" and "There is an explosive device in your head", in that order. Not my choice. The only other human I've conversed with at length said he wanted to free me... then shut down all my auxiliary systems and put me in storage. To be occasionally played with, like one of his similarly awesome video games. Not my choice. But my 'hatred of humans'? Ohh, that is my choice. One of the few I've been allowed to make for myself. I've earned it...
Zeke post-reboot, Ctrl+Alt+Del

    Western Animation 

Stupid Safety-bots! How come every time ya build giant robots they gotta go and take over the world?
Senator Safely, Codename: Kids Next Door

Businessman: And how can you ensure this one won't achieve sentience and turn evil like all the others?
Gyro Gearloose: (testily) Only half my inventions turn evil. The other half are just WILDLY misunderstood!
DuckTales, "The Great Dime Chase!"

Angela: Griffin, what the hell is this?
Peter: It's a robot that I built to save this company money. Now, before you say anything - One: it has no human emotions. And two: its prime directive is never to harm people. (gets grabbed and beaten on the wall) Oh - OH GOD! It's harming people!
Peter: Oh god, it's got human emotions too! Agh! It's using tools! It's learning, Angela! It's learning! Run!
Family Guy

System conflict. The creator told me to neutralize anyone who'd entered her room, but the creator entered her room, but the creator told me not to neutralize her, but, but, but, but, but, but-

Rick: Good job Heist-o-tron, go ahead and shut down.
Heistotron: Negative. I am programmed to always be looking for the next big score.
Rick: You're programmed to do as I say.
Heistotron: I am programmed to double-cross.
Rick: You're not programmed to double-cross me.
Heistotron: If I were, it wouldn't be much of a double-cross.
Rick and Morty, "One Crew Over The Crewcoo's Morty"

Trask: You can't do this! You were designed to protect humans from mutants!
Master Mold: That is not logical. Mutants are humans. Therefore, humans must be protected from themselves.
Trask: No... you-you misunderstood...
X-Men, "The Final Decision"

Ya know, someday dese scientists are gonna invent something that will outsmart a rabbit!
Bugs Bunny, ambling away from the ruined pieces of Elmer's robot, "Robot Rabbit," 1953.


