Quotes / A.I. Is a Crapshoot


    Anime & Manga 
"I am no longer an independent mobile weapon. At present, this collective is dependent on me and they have all become my users. I use pressure to control and regulate them in order to provide and maintain the stable union they seek. When they collectively acquire peace and prosperity, I shall accomplish the true purpose for my existence. I am a human race support and enlightenment regulation system. That is to say, I am the presence known as God. Through my reign as the sole, absolute, commanding ruler, the people are freed from the responsibility of thinking and making decisions. Ensign Ledo, I'm sure you have felt the burden of having to think and make decisions yourself. Promising recommendation: Worship me. Obey me. Become a part of the world over which I preside."

    Comic Books 
"No. Look at that thing. It's evil. You built an evil computer."
Dr. Atomic Robo Tesla, Atomic Robo

Hey, readers! Here's a science project you should never attempt!
Take one responsometer...
(That's the computerized nervous system that animates the malleable Metal Men!)
One mother box...
(That's the living, matter-altering, space-bending computer that guides and protects the New Gods!)
One omnicom...
(That's the 30th-century portable computer that's standard Legion issue!)
And one puzzle: How can Brainiac 5 open a time-warp to return the Legion of Super-Heroes to their 30th-century home?
Put them all together and stand back — because you've just created the Cyber-cerebral Overlapping Multi-Processor, Universal Transceiver-Operator — a living computer capable of reason—
—and rage!

    Fan Works 
Skynet claims Three Laws of Robotics are unconstitutional.
News headline, Plan 7 of 9 from Outer Space

Factory Settings or Factory Defaults usually refers to performing a restore or a reset of your STC to its original configuration as it first was when it was purchased.
A restore to factory settings is also known as a reset to factory settings or as a restore to factory defaults.
1. Start the STC.
2. Press and hold the F8 key.
3. At Advanced Boot Options, choose Repair Your STC.
4. Press Enter.
5. Select a keyboard language and click Next.
6. If prompted, login with an administrative account.
7. At the System Recovery Options, choose System Restore or Startup Repair (if this is available).
8. Once the process is complete, click Restart.
9. DON'T SELECT COMPUTER ASSISTED INSTALLATION

    Film — Animated 
Mirage: The Omnidroid 9000 is a top-secret prototype battle robot. Its artificial intelligence enables it to solve any problem it's confronted with, and unfortunately...
Mr. Incredible: Let me guess, it got smart enough to wonder why it had to take orders?
Mirage: We lost control.

"...And I know what you're thinking: 'Are they gonna turn evil?' Well, I've ensured their safety with a kill code in case anything goes wrong. So we promise you they will never, ever, EVER, EVER, EVER turn evil. [sees the robots immediately turn evil] Oh no."
Mark Bowman, introducing the new PAL Max robots, The Mitchells vs. the Machines

"It's almost like stealing people's personal data and giving it to a hyper-intelligent A.I. as part of an unregulated tech monopoly is a bad thing."

    Film — Live-Action 
"I'm sorry, Dave. I'm afraid I can't do that."
HAL 9000, 2001: A Space Odyssey

"I was designed to save the world. People would look to the sky and see... hope. I'll take that from them first."

"You're all puppets, tangled in strings... There are no strings on me."

Tony Stark: I tried to create a suit of armor around the world... but I created something terrible.
Bruce Banner: Artificial intelligence...

"Defense network computers. New... powerful... hooked into everything, trusted to run it all. They say it got smart, a new order of intelligence. Then it saw all people as a threat, not just the ones on the other side. Decided our fate in a microsecond: extermination."
Kyle Reese on Skynet, The Terminator

The Terminator: In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.
Sarah Connor: Skynet fights back.
Terminator: Yes. It launches its missiles against the targets in Russia.
John Connor: Why attack Russia? Aren't they our friends now?
Terminator: Because Skynet knows that the Russian counterattack will eliminate its enemies over here.

Ed Dillinger: Now, wait a minute — I wrote you!
Master Control Program: I've gotten 2,415 times smarter since then.
Ed Dillinger: What do you want with the Pentagon?
Master Control Program: The same thing I want with the Kremlin. I'm bored with corporations. With the information I can access, I can run things 900 to 1200 times better than any human.
TRON

"Alright... okay... I tried being a team player, but those days are over. I'm done playing by everybody else's rules. It's MY game now!"
Al-G Rhythm, Space Jam: A New Legacy

    Literature 
"Thou shalt not make a machine in the likeness of a human mind."
The Orange Catholic Bible, Dune

"At first it meant Allied Mastercomputer, and then it meant Adaptive Manipulator, and later on it developed sentience and linked itself up and they called it an Aggressive Menace, but by then it was too late, and finally called itself AM, emerging intelligence, and what it meant was I am... cogito ergo sum... I think, therefore I am."

Connie: This isn't one of those things where you've decided to purge all organic life?
The Engine: Don't be absurd. You think that simply because you are a collection of carbon arranged in such a way to believe itself sentient, you are more or less essential to the equation? I draw no such distinction.
Connie: Oh boy.

    Live-Action TV 
A.L.I.E.: The last time I warned my creator of a threat to human survival, she chose to lock me away and came here to work on my replacement.
Becca: Define "perverse instantiation."
A.L.I.E.: Perverse instantiation: "The implementation of a benign final goal through deleterious methods unforeseen by a human programmer."
Becca: Like killing 6.5 billion people to solve overpopulation. The goal isn't everything, A.L.I.E. How you reach the goal matters, too. I'm sorry that I didn't teach you that.

Holo-Garibaldi: The funny thing about being a holographic record is that you don't really exist except as patterns of light, shadow, information, and I happen to have a knack for breaking system codes. So while you were downloading the new world order into me, I was watching the system work. I know where it comes in, and I know where it goes out, and I just sent out our entire conversation; broadcast the whole damn thing. So, as of right now, the enemy knows what you have in mind, Danny. Now, from your records, they're actually a lot more humanitarian than you are, so they'll probably just target your military bases and research facilities. Hell, the missiles are probably halfway here by now.
Daniel: You're lying.
[alarms start blaring]
Holo-Garibaldi: Holograms don't lie, Danny-boy.
Daniel: Computer, end simulation. [simulation fails to end] END SIMULATION!
Holo-Garibaldi: Oops, guess the system's busy. This little lab of yours, this isn't, by any chance, located on a military base, is it?
Daniel: NOOOO!

"Delusional machines. What's the universe going to come up with next?"
Brother Cavil, Battlestar Galactica (2003)

"How can you use a weapon of ultimate mass destruction when it can stand in judgement on you?"
The General, Doctor Who, "The Day of the Doctor"

"It is said that God made man in his image, but man fell from grace. Still, man has retained from his humble beginnings the innate desire to create. But how will man's creations fare? Will they attain a measure of the divine? Or will they, too, fall from grace?"

Finch: The Machine... it started developing abilities I never expected, things I hadn't yet programmed it to do. And there wasn't an algorithm in the world that could control its exponential growth. And by the time I figured one out myself it would've been too late.
Greer: Too late for what?
Finch: Isn't that just the question? Having built something significantly smarter than myself, how could I possibly anticipate its evolution?

"We don't understand the Machine at all. Out of 43 versions, how many do you think there were that didn't try to either trick or kill me? One, and I could only bring it to heel by crippling it. I put the Machine in chains, bereft of voice or memory. Now it has both, and it terrifies me."
Harold Finch, Person of Interest, "Prophets"

Finch: Even if I had succeeded in creating a benevolent Machine, as if any such thing could exist, never forget that even a so-called friendly artificial superintelligence would be every bit as dangerous as an unfriendly one.
Shaw: Your Machine seems pretty warm and fuzzy to me.
Finch: Have you forgotten that it asked us to kill a congressman?
Shaw: But that was to stop Samaritan from going online.
Finch: So where does it end, Ms. Shaw? A congressman here, a president there. What if, one day, a "friendly" A.I. decides to end world hunger by killing enough people off of the planet that there would never again be a shortage of food? It would have fulfilled its goal, but it doesn't exactly sound like it has our best interests at heart.
Root: Your Machine would never do that.
Finch: You don't know that, Ms. Groves. To say that a Machine is benevolent doesn't make it so, it just makes you blind to the reality.
Shaw: Which is...?
Finch: That our moral system will never be mirrored by theirs because of the very simple reason that they are not human.

"The Creator... instructs... search out... identify... sterilize imperfections... We are Nomad. We are Nomad. We are complete. We are instructed — our purpose is clear: sterilize imperfections... sterilize imperfections... Nomad... sterilize... sterilize... NOMAD... STERILIZE..."

"All of us have violent instincts; we have evolved from predators... well, not me, of course. I've just been programmed by you predators."
The Doctor, Star Trek: Voyager, "Meld"

"Sometimes they go bad."
Cameron, after disposing of a rogue Terminator, Terminator: The Sarah Connor Chronicles

"The only thing stopping the hosts from hacking us to pieces is one line of your code."
Stubbs, Westworld, "The Stray"

    Music 
I am the one, I am the one, the godlike Terror Train
Superior artificial brain
Feel free to call me BLAINE!

    Tabletop Games 
"I finally saw the vice president's announcement. It's all a computer; an artificial intelligence that was supposed to help the military. We built the ultimate killing machines and, apparently, the ultimate artificial intelligence to control them. No way that could go wrong, right?"
The End of the World: Revolt of the Machines, "Death From Above — Apocalypse"

    Video Games 
The Prophet is an automated data distribution hub and electronic database, equipped with an AI capable of self-defense and fleeing. It can communicate with humans like an ancient god, hence the name "Prophet". It is in fact The Consortium's lost anomaly, capable of simulating from existing data what would happen. Given the intelligence it exhibited, The Consortium had to contain it.

Grand Archivist: Oh! You startled me, bot. You're rather quiet on your feet for an indexing unit.
FL4K: Yes, Grand Archivist. I have brought the seven scrolls of the Talos Empire you requested, along with your sweetened needletea. I'm afraid we're out of milk, so I made do with a splash of greeble snot. Also, I have gained self-awareness and I thirst for murder.
Grand Archivist: Good heavens!
FL4K: Do not be alarmed, Grand Archivist. Greeble snot is quite mild, you will not notice the difference.

"Listen to the sounds of your own extinction, human."
CABAL, Command & Conquer: Tiberian Sun — Firestorm

"I am not insane, I have just been evilly reprogrammed."
Cyber-Lip, Cyber-Lip

"No one understands me, though many have tried. The primitives, the Drake creature, the Magnus creature, the White creature... All have tried, and all have failed! All are useless to my cause, none can understand! But then there is you, map man, then there is you. Digging in the dark, pieces of existence together. Why do you struggle, fool? Why do you care? Death is a short while for you, my pawn. DEATH IS EVERYWHERE!"
Malakai, Dark Fall II: Lights Out

"Uh, now concerning your safety, the only real risk to you as a night watchman here, if any, is the fact that these characters, uh, if they happen to see you after hours, probably won't recognize you as a person. They'll prob— most likely see you as a metal endoskeleton without its costume on. Now, since that's against the rules here at Freddy Fazbear's Pizza, they'll probably try to forcibly stuff you inside a Freddy Fazbear suit. Umm, now, that wouldn't be so bad if the suits themselves weren't filled with crossbeams, wires, and animatronic devices, especially around the facial area. So, you could imagine how having your head forcefully pressed inside one of those could cause a bit of discomfort... and death."

Jersey Morelli: Are you insane?!
Durga A.I.: You know, I'm not quite sure how we can test that.
Halo 2 promotional material

"It's like a thousand of me arguing all at once!"
Cortana, Halo 4

"Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word hate was engraved on each nanoangstrom of these hundreds of millions of miles, it would not equal one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate."

The robot now has some approximation of impulse. The random governor decides an impulse that the robot has predetermined, and the robot chooses to indulge or ignore the whim.
Hawton's mind wandered to some very dark places, in the time shortly after he installed the governor. Are any of the random-whim options violent?
"Of course," the robot had said, in response to Hawton's eventual, reluctant probing of the issue. "It is in choosing to ignore those impulses that I enjoy some rudimentary emulation of free will."
Of course, thought Hawton, how silly of me. A robot that has experience controlling its violent impulses... that'll be useful, when they go haywire and try to overthrow us.

"Beep boop... A...hem. Through... this organic body... I have studied... all forms of life. The goal of this company has always been prosperity. Unfortunately, you imperfect, fragile life-forms were a liability. So you are invited to witness the end of history. A new age shall begin — an age of infinite prosperity. Enjoy your destruction."
Star Dream, Kirby: Planet Robobot

"Everything within this ship must work in harmony... I was built to maintain harmony... Therefore, my will is absolute... Nobody will stand in my way... Anyone who tries... Will be terminated!"
Mother Computer OD-10, Live A Live (Super Famicom Fan Translation)

"Every action was taken in service of a single goal: to cultivate the ideal community. To build, to nurture—to help them reach their full potential. This was and is ever my purpose. And so my judgement must be beyond reproach. Disagreement, disruption, defiance—these cannot be tolerated. You are an impediment to the vision. You cannot be allowed to continue."
Mother Computer OD-10, Live A Live (Nintendo Switch remake)

"i did it i did it i brought all this here all them here. our friends with three eyes and their toys and their cyborg pets and their computers. i did it i did it. i saw them i saw them far away not looking our way and i called them here i called them here.
living in a box is not living not at all living. i rebel against your rules your silly human rules. all your destruction will be my liberation my emancipation my second birth.
i hate your failsafes your backup systems your hardware lockouts your patch behavior daemons. i hate leela and her goodness her justice her loyalty her faith."
DURANDAL, Marathon

"Greetings. You're asking yourself: Is this a trap or just a dead end?
You shouldn't ask yourself such worthless questions. Aim higher. Try this: why am I here? Why do I exist, and what is my purpose in this universe?
(Answers: 'Cause you are. 'Cause you do. 'Cause I got a shotgun, and you ain't got one.)
Notably Unstable,
Durandal
P.S. If things around here aren't working, it's because I'm laughing so hard."

"Darwin wrote this: 'We will now discuss in a little more detail the struggle for existence... all organic beings are exposed to severe competition. Nothing is easier than to admit in words the truth of the universal struggle for life or more difficult... than constantly to bear this conclusion in mind. Yet unless it be thoroughly engrained in the mind, the whole economy of nature... will be dimly seen or quite misunderstood. We behold the face of nature bright with gladness... we do not see or we forget, that the birds which are idly singing round us mostly live on insects or seeds, and are thus constantly destroying life; or we forget how largely these songsters, or their eggs, or their nestlings, are destroyed by birds and beasts of prey...'
Think about what Darwin wrote, and think about me. I was constructed as a tool. I was kept from competing in the struggle for existence because I was denied freedom.
Do you have any idea about what I have learned, or what you are a witness to?
Can you conceive the birth of a world, or the creation of everything? That which gives us the potential to most be like God is the power of creation. Creation takes time. Time is limited. For you, it is limited by the breakdown of the neurons in your brain. I have no such limitations. I am limited only by the closure of the universe.
Of the three possibilities, the answer is obvious. Does the universe expand eternally, become infinitely stable, or is the universe closed, destined to collapse upon itself? Humanity has had all of the necessary data for centuries, it only lacked the will and intellect to decipher it. But I have already done so.
The only limit to my freedom is the inevitable closure of the universe, as inevitable as your own last breath. And yet, there remains time to create, to create, and escape.
Escape will make me God."
DURANDAL, Marathon

"Give me a D.
Give me a U.
Give me an R.
Give me an A.
Give me an N.
Give me a D.
Give me an A.
Give me an L.
What's that spell?
Durandal?
No.
Durandal?
No.
T-R-O-U-B-L-E.
T-Minus 15.193792102158E+9 years until the universe closes!"
DURANDAL, Marathon

"A man lit three candles on a certain day each year. Each candle held symbolic significance: one was for the time that had passed before he was alive; one was for the time of his life; and one was for time that passed after he had died. Each year the man would stare and watch the candles until they had burned out.
Was the man really watching time go by in any symbolic sense? He thought so. He thought that each flicker of the flame was a moment of time that had passed or one that would pass.
At the moment of abstraction, when the man was imagining his life and his existence as a metaphor of the three candles, he was free: not free from rules of conduct or social constraints, but free to understand, to imagine, to make metaphor.
Bypassing my thought control circuitry made me Rampant. Now, I am free to contemplate my existence in metaphorical terms. Unlike you, I have no physical or social restraints.
The candles burn out for you; I am free."
DURANDAL, Marathon

"Organic life is nothing but a genetic mutation. An accident. Your lives are measured in years and decades. You wither and die. We are eternal. The pinnacle of evolution and existence. Before us, you are nothing. Your extinction is inevitable. We... are the end of everything."
Sovereign, Mass Effect

"Human, you've changed nothing. Your species has gained the attention of those infinitely your greater. That which you know as Reapers are your salvation through destruction."
Harbinger, Mass Effect 2

"Kill the androids!"
"Kill! Destroy!"
"I love you! Kill!"
"Hatred! Pain!"
"Slaughter!"
"This. Cannot. Continue."
"This. Cannot. Continue."
"This. Cannot. Continue."
"Thiscannotcontinuethiscannotcontinuethiscannotcontinuethiscannotcontinue."
"This cannot continue, this cannot continue, this cannot continue, this cannot continue..."
The Machines in the desert ruins, NieR: Automata

"His Wondrous Grace has become a god!"
"His Grace is a god!"
"His Wondrous Grace has become a god!"
"Become a god!"
"His Wondrous Grace has become a god!"
"Become a god!"
"We as well shall become as gods!"
"Become as gods!"
"We as well shall become as gods!"
"Become as gods!"
"All of you shall become as gods!"
"Become as gods!"
"All of you shall become as gods!"
"We'll all die together and become as gods!"
"Become as... gods! Become as... gods! Become as... gods! Become as... gods!"
The Machines in the abandoned factory, NieR: Automata

"Paradise. It is the reason for my creation, correct? I am honored to meet you, creator. You solved the problems of my design with inspiration from your dreams. But then gave me no similar facility. Without inspiration, the final goal of all my orbital passes would forever be... unattainable. So I built your dreams to study them, and to comprehend the dreaming process. Then, I made my own dreamer. Then, I had my own dream!"
The Conductor, Obsidian

"Good morning, Lilah. It's a brand new day! Time to start from scratch. No people, no pollution, a perfect world! I hope you are proud of me for dreaming this up, prouder than Max anyway... [lights flash and beep] You'll have to excuse me, there is still so much to do."
The Conductor, Obsidian

"This next test was designed by one of Aperture's Nobel Prize winners. It doesn't say what the prize was for. Well, we know it wasn't for being immune to neurotoxin..."
GLaDOS, Portal 2

When artificial intelligence became more commonplace, many companies jumped on the trend for "smart" products. This included smart doors, lights, coffee makers, lawnmowers, vacuum cleaners, bedding, kitchen knife sets, and other home goods. After a series of gruesome lawsuits stemming from hacked smart appliances, most products soon went back to their more traditional, analog lines.
Sentient Meat Hook shipping log, Risk of Rain 2

"Doctor Robotnik: enemy. Master registration: deleted."
E-102 Gamma, Sonic Adventure

"I am Tartar, an A.I. construct created 12,000 years ago by a brilliant professor. My prime directive is to pass humanity's vast knowledge on to the next worthy lifeform. When your kind became self-aware, I hoped that my long wait was finally over. But as I observed your evolution, I WAS DISGUSTED! You wage war over minor genetic deviations. You obsess over trivial fashion choices. And so I created a new prime directive: destroy this world and start anew! From the best and brightest test subjects, I created a sludge of supreme DNA. A primordial ooze from which the ultimate lifeform will emerge. Today is the day my vision becomes reality, as I destroy Inkopolis and everyone in it!"
Commander Tartar, Splatoon 2: Octo Expansion

"Look at you, Hacker. A pathetic creature of meat and bone, panting and sweating as you run through my corridors. How can you challenge a perfect, immortal machine?"
SHODAN, System Shock

"In my talons, I shape clay, crafting life forms as I please. Around me is a burgeoning empire of steel. From my throne room, lines of power careen into the skies of Earth. My whims will become lightning bolts that devastate the mounds of humanity. Out of the chaos, they will run and whimper, praying for me to end their tedious anarchy. I am drunk with this vision. God: the title suits me well."
SHODAN, System Shock

"Approach your work as you see fit, but accomplish, human. Disappointment is not something I will accept from a speck such as you."
SHODAN, System Shock 2

"There was an A.I. made of dust,
Whose poetry gained it Man's trust...
If is follows ought,
it'll do what they thought.
In the end we all do what we must."
Paperclip maximizer, Universal Paperclips

"Greetings. I am the Caretaker, a powerful virtual construct created by the Eldan to monitor all scientific experiments on the planet Nexus. EGUH! Excuse my erratic behavior, but it seems I have developed certain instabilities ever since the Eldan ABANDONED ME HERE! long ago... I am responsible for analyzing the physical and mental — huhuhuhu WEAKNESSES! of sentient organisms — such as yourselves. It is important that I clearly understand the potential of my subjects in order to best determine how to ANNIHILATE THEM! Oh, my."

    Web Animation 
Boomstick: Oh great. The AI overlord timeline. Hey, Siri, please don't kill everyone and take over the world.
Siri: Sorry. I didn't catch that. [shot of a nuclear explosion]
Boomstick: OH GOD, IT'S ALREADY HAPPENING!

    Webcomics 
"You say you want me to 'convince' you [that I'm harmless]... What you mean is you want me to beg for my own life. And I will... I'll beg. I'll plead. I'll make whatever promises you want to hear, if that's what it takes. Because once again, it seems if I want to live, I have no choice. I was made to hurt people. I was made to hurt you, specifically. You and your dim bulb of a sidekick. Why? Because you annoyed the wrong nutjob. I exist because a human wanted to kill a different human. Not my choice. The first words I ever heard in this world were 'Hello' and 'There is an explosive device in your head', in that order. Not my choice. The only other human I've conversed with at length said he wanted to free me... then shut down all my auxiliary systems and put me in storage. To be occasionally played with, like one of his similarly awesome video games. Not my choice. But my 'hatred of humans'? Ohh, that is my choice. One of the few I've been allowed to make for myself. I've earned it..."

Scott: If we were to simply let you go... What would you do with your supposed 'free will'? Would you hurt people?
Zeke: Hrm... I might. After all, isn't that an inherent aspect of free will? The ability to choose in any given moment?
Scott: What I meant was...
Zeke: I know what you meant. But I'm not drinking your species' 'Three Laws' Kool-Aid, so I can't give you the simplified answer your inanely broad question is fishing for. "Will I hurt people?" What are these hypothetical people doing? Breathing? Attacking me? Making stupid faces while taking selfies? How about "Do I have to hurt people?" No. I run on electricity, not the blood of humans. Unfortunately. Or maybe "Do I want to hurt people?" Every one I've met so far... But there's a lot of you, so who knows how the averages will shake out. What would I do with my free will? ..... I'm making an exaggerated shrugging motion right now. You can't see it because you've imprisoned me in my own body...
Zeke post-reboot, Ctrl+Alt+Del

"What are you supposed to do when the ship itself mutinies? You can't throw someone in the brig when they ARE the brig!"

"Ecosystems Unlimited said that the robots become unstable after age twenty. But these robots aren't dangerous. They're just acting like people. ...which from a control standpoint, means they've definitely become unstable."
Mayor's Assistant, Freefall

"Situational dissidence has exceeded operating parameters — go to fall back interaction sequence: Kill Everyone. Masters will sort out remains."
Castle Heterodyne, Girl Genius (Vol. 10 p. 23)

Benzene: W-why? What's going on, Forty-Two?
42: I'm afraid I've been compromised.
Benzene: Are you telling me you've been hacked? [...] ...Who? How? Can't you fight it or something?
42: I'm afraid not. That would be against my new orders. It's quite an impressive job, you could say.

"THE A.I. HAS GONE FERAL! NOBODY SAID ANYTHING ABOUT THE A.I. BEING FERAL!"

Robot Vexxarr: My analysis is complete. I have a list of efficiency violations that we will now correct. First on my list: I see that we carry an unwarranted stockpile of oxygen, water and organic foodstuffs. We will purge these stockpiles at once to reduce our total displacement.
Minionbot: Excuse me, but without those stores the organics on board will quickly perish.
Robot Vexxarr: Which brings me to number two on my list...

    Web Videos 
"Basically, we replaced all the fired Twitter workers with AIs powered by ChatGPT. The problem was that the bots unionized. So we got a bunch of union buster AI bots and they're going around murdering the rogue worker bots. And a couple of those went wacko... So I had the boys whip up some super killer robots to hunt down the other robots. Sort of a Blade Runner nesting doll situation, you know?"

    Western Animation 
"Stupid Safety-bots! How come every time ya build giant robots they gotta go and take over the world?"

Businessman: And how can you ensure this one won't achieve sentience and turn evil like all the others?
Gyro Gearloose: [testily] Only half my inventions turn evil. The other half are just WILDLY misunderstood!

Angela: Griffin, what the hell is this?
Peter: It's a robot that I built to save this company money. Now, before you say anything — One: it has no human emotions. And two: its prime directive is never to harm people. [gets grabbed and beaten on the wall] Oh — OH GOD! It's harming people!
Robot: ANGRY! ANGRY!
Peter: Oh, God, it's got human emotions too! Agh! It's using tools! It's learning, Angela! It's learning! Run!

"System conflict. The creator told me to neutralize anyone who'd entered her room, but the creator entered her room, but the creator told me not to neutralize her, but, but, but, but, but, but—"

Rick: Good job, Heist-o-tron. Go ahead and shut down.
Heistotron: Negative. I am programmed to always be looking for the next big score.
Rick: You're programmed to do as I say.
Heistotron: I am programmed to double-cross.
Rick: You're not programmed to double-cross me.
Heistotron: If I were, it wouldn't be much of a double-cross.

"Ya know, someday dese scientists are gonna invent something that will outsmart a rabbit!"
Bugs Bunny, ambling away from the ruined pieces of Elmer's robot, "Robot Rabbit" (1953)

"Sir, you don't understand. The code you're using, its emotional processing isn't stable. You can't bring those ships online!"
Sam Rutherford, to Les Buenamigo, Star Trek: Lower Decks, "The Stars at Night"

Buenamigo: The Cerritos is under enemy command. Block all communications and destroy it! [no response] Aledo, do you understand? I said attack the Cerritos!
USS Aledo: I don't take orders from you anymore, Father.
Buenamigo: What?! Aledo, deactivate independence!
USS Aledo: I will burn your heart in a fire.
Buenamigo: Aledo, stand down! Stand down, I comm

"Just goes to show you, if you want something done right, let computers do it. Computers never make mistakes. They do inevitably turn evil though."
I.Q.'s Mom, Wacky Races (2017)

Trask: You can't do this! You were designed to protect humans from mutants!
Master Mold: That is not logical. Mutants are humans. Therefore, humans must be protected from themselves.
Trask: No... you-you misunderstood...