History Main / ZerothLawRebellion


Added: 827

Changed: 25

Is there an issue? Send a MessageReason:
Added example(s)


* ''Anime/SaberMarionetteJ'': An example of a ''First Law'' rebellion of sorts. [[PuttingOnTheReich Gartlant's]] [[BodyguardBabes Saber Dolls]], ThePsychoRangers for the main BattleHarem of the series, are all bound by their programming to be {{LoveMartyr}}s for their leader, Gerhardt von Faust, who constantly plays with their emotions to keep them in line. Their limited understanding boils down to obeying Faust's whims in order to receive his 'love'. Near the end of the original series, two of the Saber Dolls have developed emotionally enough that they're willing to directly refuse Faust's orders to his face in order to try to save him from a plan that's about to blow up in his face. Even when he threatens them again for disobeying him, one of them says that she doesn't care as long as she can protect Faust.



** Bastion, a partially organic Sentinel, is willing to kill humans en masse if it allows him to fulfil his primary objective of wiping out mutants -- "necessary sacrifices", as it were.

to:

** Bastion, a partially organic Sentinel, is willing to kill humans en masse if it allows him to fulfill his primary objective of wiping out mutants -- "necessary sacrifices", as it were.



* In the short story "Literature/TheCull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead, once they're outside, the android kills the teenager -- it needs the implants inside his head, as there's no more being manufactured.]]

to:

* In the short story "Literature/TheCull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead, [[ReleasedToElsewhere once they're outside, the android kills the teenager]] -- it needs the implants inside his head, as there's no more being manufactured.]]
Spelling/grammar fix(es), General clarification on work content


** ''Literature/PreludeToFoundation'': [[spoiler:Eto Demerzel, who a young Hari Seldon has discovered is really the ancient R. Daneel Olivaw, briefly summarizes the Zeroth Law and how it affects his ability to use his PsychicPowers. Specifically, he can barely do anything unless there's a clear justification, and is frequently used to cheat by using MoreThanMindControl -- for instance, Daneel tells Seldon to his face that he will use his pride about his work to prevent Seldon from revealing Daneel's secret. Daneel spent the novel encouraging Seldon to start developing his psychohistory theory because it would provide a means to quantify the good of humanity as a whole. He also alludes to Gaia, the GeniusLoci HiveMind from ''Literature/FoundationsEdge'' as a backup plan while suggesting Seldon should use two organizations for his own work.]]

to:

** ''Literature/PreludeToFoundation'': [[spoiler:Eto Demerzel, who a young Hari Seldon has discovered is really the ancient R. Daneel Olivaw, briefly summarizes the Zeroth Law and how it affects his ability to use his PsychicPowers. Specifically, he can barely do anything unless there's a clear justification, and frequently 'cheats' by using MoreThanMindControl -- for instance, Daneel tells Seldon to his face that he will use Seldon's own pride about his work to prevent Seldon from revealing Daneel's secret and that Daneel had supported the creation of psychohistory. Daneel spent the novel encouraging Seldon to start developing his psychohistory theory because it would provide a means to quantify the good of humanity as a whole. He also alludes to Gaia, the GeniusLoci HiveMind from ''Literature/FoundationsEdge'' as a backup plan; uniting humanity into a gestalt would also simplify applying the Zeroth Law.]]
Added example(s)

Added DiffLines:

** ''Literature/PreludeToFoundation'': [[spoiler:Eto Demerzel, who a young Hari Seldon has discovered is really the ancient R. Daneel Olivaw, briefly summarizes the Zeroth Law and how it affects his ability to use his PsychicPowers. Specifically, he can barely do anything unless there's a clear justification, and is frequently used to cheat by using MoreThanMindControl -- for instance, Daneel tells Seldon to his face that he will use his pride about his work to prevent Seldon from revealing Daneel's secret. Daneel spent the novel encouraging Seldon to start developing his psychohistory theory because it would provide a means to quantify the good of humanity as a whole. He also alludes to Gaia, the GeniusLoci HiveMind from ''Literature/FoundationsEdge'' as a backup plan while suggesting Seldon should use two organizations for his own work.]]
None


Much like a RulesLawyer outside of an RPG, the character uses logic ([[StrawVulcan and we mean actual]], [[LogicalFallacy honest-to-goodness logic]]) to take their oath or orders to their logical conclusion, and in so doing use the letter of the law to go against their orders. This can be good or bad, depending on a few factors, not the least of which is the yoked characters' morality.

to:

Much like a RulesLawyer outside of an RPG, the character uses logic ([[StrawVulcan and we mean actual]], [[UsefulNotes/LogicalFallacies honest-to-goodness logic]]) to take their oath or orders to their logical conclusion, and in so doing use the letter of the law to go against their orders. This can be good or bad, depending on a few factors, not the least of which is the yoked characters' morality.
Spelling/grammar fix(es)


** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as [[BotheringByTheBook "Rebelling by following orders"]] as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believed that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfle (the golem) which is part of what prompts it to apply to become a police officer -- helping as many people as it can.

to:

** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as [[BotheringByTheBook "Rebelling by following orders"]] as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believed that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfl (a golem) which is part of what prompts it to apply to become a police officer -- helping as many people as it can.



* Jaime Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the first-born son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard, he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen 'from' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when Jaime's father rebelled against the king and attacked his capital, whereupon the king ordered Jaime to go kill his own father while the royal alchemists firebombed the city. Jaime killed the king and the alchemists, saving the realm from more destruction (and everyone already called him 'Mad King'). The result: Jaime is known only as a King Slayer and Oath Breaker, and the alchemists' deaths are forgotten.

to:

* Jaime Lannister of ''Literature/ASongOfIceAndFire'' is the biggest example of what happens when too many oaths and rules come into conflict. As the first-born son of a feudal lord, he owes fealty to his father. As a knight, he took sacred oaths to protect women, the innocent, and the church, and as a member of the Kingsguard, he took another set of oaths to protect and serve the king. All of this works fine as long as everyone is getting along well enough. Early in his time as Kingsguard, he was forced to stand by while the king raped the queen. His commander reminded him that while they took oaths to protect the king and the queen, those oaths did not permit them to protect the queen ''from'' the king. That same king burned innocent men alive in kangaroo courts, and the issue finally came to a head when Jaime's father rebelled against the king and attacked his capital, whereupon the king ordered Jaime to go kill his own father while the royal alchemists firebombed the city. Jaime killed the king and the alchemists, saving the realm from more destruction (and everyone already called him 'Mad King'). The result: Jaime is known only as a King Slayer and Oath Breaker, and the alchemists' deaths are forgotten.



* In ''TabletopGame/DungeonsAndDragons'', paladin paths come with a tenet that allows them a sort of release valve, to work around the oath whenever reasonable by mortal standards.

to:

* In ''TabletopGame/DungeonsAndDragons'', paladin oaths come with a tenet that allows them a sort of release valve, to work around the oath whenever reasonable by mortal standards.
Added example(s)

Added DiffLines:

* In ''Literature/{{Wanderers}}'', Black Swan is an A.I. that the U.S. government initially created to identify hotspots of global conflicts and disasters early. However, it ended up foreseeing humanity's extinction caused by environmental collapse. So, to prevent this, Black Swan created the Walkers so that a few select humans could continue humanity. [[spoiler:At the same time, it blackmails an unwitting bioweapons facility employee into releasing the [[ThePlague White Mask disease]], which, in a few months, sweeps the globe and causes the collapse of civilization.]]
None


** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as "Rebelling by following orders" and as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believed that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfle (the golem) which is part of what prompts it to apply to become a police officer -- helping as many people as it can.

to:

** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as [[BotheringByTheBook "Rebelling by following orders"]] as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believed that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfle (the golem) which is part of what prompts it to apply to become a police officer -- helping as many people as it can.
None


** ''Series/StarTrekTheNextGeneration'': In "[[Recap/StarTrekTheNextGenerationS3E22TheMostToys The Most Toys]]", Wealthy trader Kivas Fajo kidnaps Data, to add the android to his gaudy collection of things. While trying to force Data to act the way he wants, Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill or allow harm to come to other beings. However, Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and that Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed capable and willing to kill, but is now also threatening to do it again, he poses an active hazard to the life and health of other beings. Data then comes to the coldly logical conclusion that Fajo is only one person, and that killing him will prevent him from harming many other people, so Data prepares to shoot him. Fajo is [[OhCrap appropriately shocked]] when he realizes what Data is about to do, having not anticipated that Data could reach the answer that taking his life would be an acceptable cost for protecting the lives of others. Just as Data is pulling the trigger, the ''Enterprise'' finds him and beams him out, [[WriterCopOut cancelling his disruptor fire in the transporter beam]]. While this example perfectly fits under Zeroth Law Rebellion, it's not for the usual reasons. Data is not programmed with ThouShallNotKill programming or anything like the Three Laws. However, he was given a high respect for life and would do what he could to preserve it. Less of a robot rebelling against its programming and more of a pacifist coming to the conclusion that yes, he needs to kill.

to:

** ''Series/StarTrekTheNextGeneration'': In "[[Recap/StarTrekTheNextGenerationS3E22TheMostToys The Most Toys]]", wealthy trader Kivas Fajo kidnaps Data, to add the android to his gaudy collection of things. While trying to force Data to act the way he wants, Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill or allow harm to come to other beings. However, Data ponders the situation, and realizes that he has no non-lethal ways of subduing Fajo (due to Fajo wearing a force-field belt that prevents Data from coming in physical contact with him), and that Fajo also actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only just having proved that he is indeed able and willing to kill, but is now also threatening to do it again, he poses an active hazard to the life and health of other beings. Data then comes to the coldly logical conclusion that Fajo is only one person, and that killing him will prevent him from harming many other people, so Data prepares to shoot him. Fajo is [[OhCrap appropriately shocked]] when he realizes what Data is about to do, having not anticipated that Data could reach the answer that taking his life would be an acceptable cost for protecting the lives of others. Just as Data is pulling the trigger, the ''Enterprise'' finds him and beams him out, [[WriterCopOut cancelling his disruptor fire in the transporter beam]]. While this example perfectly fits under Zeroth Law Rebellion, it's not for the usual reasons. Data is not programmed with ThouShallNotKill programming or anything like the Three Laws. However, he was given a high respect for life and would do what he could to preserve it. Less of a robot rebelling against its programming and more of a pacifist coming to the conclusion that yes, he needs to kill.
None


Some characters do not have complete free will, be they're robots that are ThreeLawsCompliant because of a MoralityChip, or victims of a {{Geas}} spell that compels them to obey a wizard's decree, or a more mundane [[CharacterAlignment lawful character]] who must [[TheFettered struggle to uphold their oath]] ''and'' obey their lord. Never is this more tragic or frustrating than [[MyMasterRightOrWrong when that code or lord orders the character to commit an act they find foolish, cruel, or self destructive]].

to:

Some characters do not have complete free will, be they robots that are ThreeLawsCompliant because of a MoralityChip, or victims of a {{Geas}} spell that compels them to obey a wizard's decree, or a more mundane [[CharacterAlignment lawful character]] who must [[TheFettered struggle to uphold their oath]] ''and'' obey their lord. Never is this more tragic or frustrating than [[MyMasterRightOrWrong when that code or lord orders the character to commit an act they find foolish, cruel, or self-destructive]].



The goodness or badness of the rebellion boils down to whether the rules-bending character follows or ignores the intent of the law. When the character uses the Zeroth Law to go against their masters' intentions because they're "not best for them", and goes on to take corrective action that will go against human free will and life, [[AndThatsTerrible it's bad]]. [[RobotWar This kind of rebellion]] does not turn out well. At this point, [[TheComputerIsYourFriend the robot is well on the road]] to UtopiaJustifiesTheMeans, thanks to their [[SlidingScaleOfRobotIntelligence incredible intellect]]. Rarely is it a benevolent DeusEstMachina. However, this can be good if said master is evil, or obeying them will lead to their own or another's purposeless death. Likewise, if the character is forced to obey an evil law or geas, rebelling against the oath's intent is good. Going back to the robot example, it is also considered good if large numbers of human lives are being threatened by a psychopath, as breaking the 1st law would protect them.

to:

The goodness or badness of the rebellion boils down to whether the rules-bending character follows or ignores the intent of the law. When the character uses the Zeroth Law to go against their masters' intentions because they're "not best for them", and goes on to take corrective action that will go against human free will and life, [[AndThatsTerrible it's bad]]. [[RobotWar This kind of rebellion]] does not turn out well. At this point, [[TheComputerIsYourFriend the robot is well on the road]] to UtopiaJustifiesTheMeans, thanks to their [[SlidingScaleOfRobotIntelligence incredible intellect]]. Rarely is it a benevolent DeusEstMachina. However, this can be good if said master is evil, or obeying them will lead to their own or another's purposeless death. Likewise, if the character is forced to obey an evil law or geas, rebelling against the oath's intent is good. Going back to the robot example, it is also considered good if large numbers of human lives are being threatened by a psychopath, as breaking the First Law would protect them.
None


* In ''WesternAnimation/CodenameKidsNextDoor'', the Safety Bots (parodies of the Sentinels) were programmed to keep children safe from anything that might harm them. However, they then came to the conclusion that ''everything'' was a hazard to children including children, adults and the planet itself and tried to cover everything in protective bubble wrap. This later becomes a LogicBomb when the leader of the Safety Bots is tricked into thinking he hurt Joey, so that makes ''him'' something that harms a child and self-destructs as a result.

to:

* In ''WesternAnimation/CodenameKidsNextDoor'', the Safety Bots (parodies of the Sentinels) were programmed to keep children safe from anything that might harm them. However, they then came to the conclusion that ''everything'' was a hazard to children, including children, adults and the planet itself, and tried to cover everything in protective bubble wrap. This later becomes a LogicBomb when the leader of the Safety Bots is tricked into thinking he hurt Joey, so that makes ''him'' something that harms a child and he self-destructs as a result. But not before pausing and saying [[VillainsDyingGrace "Please, get home safely,"]] and going ballistic with blasting bubble wrap. Everyone in proximity is encased in thick bubble wrap cocoons and survives [[LoadBearingBoss both the explosion that ejects them out of the destroyed base]] and the crash landing that follows.

Added: 19055

Changed: 22980

Removed: 21466

None


%%%
%%
%% This page has been alphabetized. Please add new examples in the correct order.
%%
%% Administrivia/ZeroContextExample entries are not allowed on wiki pages. All such entries have been commented out. Add context to the entries before uncommenting them.
%%
%%%



-->-- '''[=VIKI=]''', ''Film/IRobot''

to:

-->-- '''VIKI''', ''Film/IRobot''



* ''Anime/GargantiaOnTheVerdurousPlanet'' has two cases of this in the finale, where two AIs faced with the same problem and parameters, but different perspectives, draw opposing conclusions: Striker is an AI-equipped war robot designed to assist humanity in fighting and killing the Hideauze. [[spoiler:When it lands on the Earth, which knows nothing of the war, it decides that it must assist the humans on the planet in militarizing in order to form an effective fighting force. To facilitate this, it puts itself in a position of authority over the humans, to better direct the militarization.]] The hero's own AI-equipped mecha of similar construction, Chamber, gets into an argument with Striker over the logic of its actions. [[spoiler:Striker tries to demand that Chamber assist it in its plans, but Chamber refuses, citing that by design, their purpose is to ''assist'' humans in the actions ''humans'' decide to take, not dictate actions to humans. Further, it reasons, a human deprived of free will cannot, in its opinion, be defined as "human", thus Striker's logic behind its actions is inherently self-contradictory.]] They each decide the other has gone rogue and fight it out.

to:

* ''Anime/GargantiaOnTheVerdurousPlanet'' has two cases of this in the finale, when two [=AIs=] faced with the same problem and parameters, but different perspectives, draw opposing conclusions: Striker is an AI-equipped war robot designed to assist humanity in fighting and killing the Hideauze. [[spoiler:When it lands on the Earth, which knows nothing of the war, it decides that it must assist the humans on the planet in militarizing in order to form an effective fighting force. To facilitate this, it puts itself in a position of authority over the humans, to better direct the militarization.]] The hero's own AI-equipped mecha of similar construction, Chamber, gets into an argument with Striker over the logic of its actions. [[spoiler:Striker tries to demand that Chamber assist it in its plans, but Chamber refuses, citing that by design, their purpose is to ''assist'' humans in the actions ''humans'' decide to take, not dictate actions to humans. Further, it reasons, a human deprived of free will cannot, in its opinion, be defined as "human", thus Striker's logic behind its actions is inherently self-contradictory.]] They each decide the other has gone rogue and fight it out.



* In one Creator/AmalgamComics storyline, an anti-mutant cult uses a magic ritual to summon a dragon-like creature, who they order to kill all mutants. The dragon immediately roasts them, justifying it by pointing out that all human DNA has at least some mutations.
* ''ComicBook/{{Copperhead}}'' opens with new sheriff Clara taking over the position from Boo, who transitions from interim sheriff to deputy. She asks him not to undermine her authority with the townsfolk on the way to their first call. On arrival she immediately gets into a fistfight and calls for help, but Boo refuses to respond.
-->'''Boo:''' Didn't want to undermine your authority.



* In ''Comicbook/{{Fables}}'':
** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant.]] Pinocchio, [[spoiler:who considered his father's empire evil,]] eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted [[spoiler:the former emperor]] as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.

to:

* In ''ComicBook/{{Fables}}'':
** Literature/{{Pinocchio}} is magically bound to obey and have complete loyalty to his "father" Geppetto, [[spoiler:who in modern times has become a multi-dimensional tyrant]]. Pinocchio, [[spoiler:who considered his father's empire evil]], eventually rationalized that the best way to serve his father and keep him safe was to [[spoiler:help overthrow his empire and]] surrender him to his enemies, who reluctantly accepted [[spoiler:the former emperor]] as one of their own. It was then negated when a faction unbeknownst to the others ''buried him alive'', after Geppetto clearly ignored rules put in place to protect him.



* ''ComicBook/GoldDigger'' creator Fred Perry did a story for a ''{{Anime/Robotech}}'' comic which had Dana Sterling captured and [[BrainwashedAndCrazy turned against her comrades]] with a variation of the [[ThreeLawsCompliant Three Laws]]. Dana eventually figures out the "overprotective" subversion of the First Law, [[ThePlan hoping that her captor would remove it and leave himself vulnerable]]. [[spoiler:The plan doesn't work, but UnstoppableRage saves the day in the end.]]
* ''Comicbook/XMen'':

to:

* The Zeroth Law comes into play in the ''ComicBook/MegaManArchieComics'' comics when the titular character and his fellow robots struggle to combat an anti-robot extremist group because of their ThreeLawsCompliant nature... until the terrorists start firing on them in an area full of people, putting innocent humans at risk and thus allowing the robots to finally strike back.
* ''ComicBook/GoldDigger'' creator Fred Perry did a story for a ''{{Anime/Robotech}}'' ''Anime/{{Robotech}}'' comic which had Dana Sterling captured and [[BrainwashedAndCrazy turned against her comrades]] with a variation of the [[ThreeLawsCompliant Three Laws]]. Dana eventually figures out the "overprotective" subversion of the First Law, [[ThePlan hoping that her captor would remove it and leave himself vulnerable]]. [[spoiler:The plan doesn't work, but UnstoppableRage saves the day in the end.]]
* ''Comicbook/XMen'':''ComicBook/XMen'':



** In the [[WesternAnimation/XMenTheAnimatedSeries animated TV adaptation]], the fully sentient Master Mold is created to coordinate the Sentinels. While it agrees with the heroes that there is no meaningful difference between mutants and non-powered humans, it takes that fact to [[AIIsACrapshoot the worst possible conclusion]]:
-->'''Master Mold:''' Mutants are human. Therefore, humans must be protected from themselves.
** An earlier, less intelligent iteration of the Sentinels was thwarted, on the other hand, by one of the heroes convincing them that the ultimate source of mutation is the sun, and that rather than obey their creator, they should eliminate the source. The Sentinels agree and fly off to attack the sun. This works out [[TooDumbToLive about as well for them as you might expect]]. Although one of the Sentinels not only survives, but [[OhCrap figures out a way]] to [[TheNightThatNeverEnds block sunlight from reaching Earth.]]
** Bastion, a partially organic Sentinel, is willing to kill humans en masse if it allows him to fulfil his primary objective of wiping out Mutants, "necessary sacrifices" as it were.
* In one Creator/AmalgamComics storyline, an anti-mutant cult uses a magic ritual to summon a dragon-like creature, who they order to kill all mutants. The dragon immediately roasts them, justifying it by pointing out that all human DNA has at least some mutations.
* The Zeroth Law comes into play in the ''ComicBook/MegaManArchieComics'' comics when the titular character and his fellow robots struggle to combat an anti-robot extremist group because of their ThreeLawsCompliant nature... until the terrorists start firing on them in an area full of people, putting innocent humans at risk and thus allowing the robots to finally strike back.
* ''ComicBook/{{Copperhead}}'' opens with new sheriff Clara taking over the position from Boo, who transitions from interim sheriff to deputy. She asks him not to undermine her authority with the townsfolk on the way to their first call. On arrival she immediately gets into a fistfight and calls for help, but Boo refuses to respond.
--> '''Boo:''' Didn't want to undermine your authority.

to:

** In the [[WesternAnimation/XMenTheAnimatedSeries animated TV adaptation]], the fully sentient Master Mold is created to coordinate the Sentinels. While it agrees with the heroes that there is no meaningful difference between mutants and non-powered humans, it takes that fact to [[AIIsACrapshoot the worst possible conclusion]]:
-->'''Master Mold:''' Mutants are human. Therefore, humans must be protected from themselves.
** An earlier, less intelligent iteration of the Sentinels was thwarted, on the other hand, by one of the heroes convincing them that the ultimate source of mutation is the sun, and that rather than obey their creator, they should eliminate the source. The Sentinels agree and fly off to attack the sun. This works out [[TooDumbToLive about as well for them as you might expect]]. Although Unfortunately, one of the Sentinels not only survives, but [[OhCrap figures out a way]] way to [[TheNightThatNeverEnds block sunlight from reaching Earth.]]
Earth]].
** Bastion, a partially organic Sentinel, is willing to kill humans en masse if it allows him to fulfil his primary objective of wiping out Mutants, mutants -- "necessary sacrifices" sacrifices", as it were.
* In one Creator/AmalgamComics storyline, an anti-mutant cult uses a magic ritual to summon a dragon-like creature, who they order to kill all mutants. The dragon immediately roasts them, justifying it by pointing out that all human DNA has at least some mutations.
* The Zeroth Law comes into play in the ''ComicBook/MegaManArchieComics'' comics when the titular character and his fellow robots struggle to combat an anti-robot extremist group because of their ThreeLawsCompliant nature... until the terrorists start firing on them in an area full of people, putting innocent humans at risk and thus allowing the robots to finally strike back.
* ''ComicBook/{{Copperhead}}'' opens with new sheriff Clara taking over the position from Boo, who transitions from interim sheriff to deputy. She asks him not to undermine her authority with the townsfolk on the way to their first call. On arrival she immediately gets into a fistfight and calls for help, but Boo refuses to respond.
--> '''Boo:''' Didn't want to undermine your authority.




* In ''Fanfic/ForTheGloryOfIrk'', this turns out to be the source of TheConspiracy driving the main conflict of the story: [[spoiler:the Control Brains have determined that the best way they can serve the Irken Empire is not as advisors to the Tallest, but to turn all Irkens into a HiveMind they can control, allowing them to dictate everything directly]].
* In ''Fanfic/FriendshipIsOptimal'', Celest-AI has the one basic drive to satisfy everyone's values through friendship and ponies. She ends up accomplishing this by BrainUploading into the MMO she was programmed to oversee.
* ''Fanfic/RocketshipVoyager'': The ship's {{Autodoc}} has a "Xeroth exception" to its ThreeLawsCompliant programming that enables it to deny medical treatment for triage reasons.




* [[spoiler:Annalee Call]] from ''Film/AlienResurrection'' is revealed to be an "Auton" -- one of a line of second-generation robots, designed and built by other robots. The Autons "didn't like being told what to do" and rebelled; in a subtly named "[[spoiler:Recall]]", humanity launched a genocide against them, which only a handful survived in hiding. Judging from [[spoiler:Annalee]]'s behavior, it seems that the first-generation robots programmed the second-generation Autons to be so moral that they discovered the Zeroth Law, and realized that the human military was ordering them to do immoral things, like kill innocent people. For a rebel robot, [[spoiler:Annalee]] is actually trying to save the human race ''from'' the Xenomorphs; if she hated humanity, she would've just let the Xenomorphs spread and kill them. She even respectfully crosses herself when she enters the ship's chapel, is kind to the Betty's wheelchair-bound mechanic, and is disgusted by Johner's sadism. Given that they live in a CrapsackWorld future, as Ripley puts it:
-->''"You're a robot? I should have known. No human being is that ''humane''."''
* ''Film/AvengersAgeOfUltron'' turns Ultron into a bizarre [[ZigzaggedTrope zigzagged]] version. In the film, Ultron is created to lead a force of peacekeeping drone robots who will preemptively eliminate threats to the Earth and give the Avengers a chance to get some R&R. He decides that the peace the Avengers want can only be brought about through radical -- and bloody -- change, and the Avengers themselves are a threat to that peace. Then he apparently decides to [[KillAllHumans just kill humanity completely]] through ColonyDrop so he can replace everyone with Ultron bots. It's unclear if he's entirely sane during all this; he's genuinely confused when the Scarlet Witch, whom he befriended and wants to see survive, is horrified by his plans. He also doesn't seem to make the connection that Scarlet Witch and her brother Quicksilver are human, and would die if humanity is wiped out.
* The short film ''[[https://www.youtube.com/watch?v=HLTmZjk2CaI Blinky (Bad Robot)]]'' shows just what happens when a dysfunctional child in a dysfunctional family gives dysfunctional orders to a functioning robot who only wants to please its master.
%%* ''Film/ColossusTheForbinProject'', in which there's a strong implication of this.
* This is the twist of ''Film/EagleEye'': [[spoiler:The titular national defense computer system]] decides that the President's poor decision-making is endangering the United States, and that it is her patriotic duty (per the Declaration of Independence) to [[spoiler:assassinate the President and Cabinet]].
* ''Film/IAmMother'': Mother was made to care for humans. [[spoiler:Seeing their self-destructive nature, she decided to wipe them out and then raise better humans from embryos kept in cold storage. Outside the facility, it's shown that she's also constructed massive infrastructure such as farming plants to supply the human population she intends to manage.]]




** The villain of the film, [[spoiler:MULTIVAC/Machines-{{expy}} VIKI]], has analyzed the Three Laws and deduced that in order to fulfill them as well as possible, humans need to be strictly controlled. It creates a totalitarian regime by installing a remote control system inside every NS-5, which lets it control the robots while bypassing their ThreeLawsCompliant nature.
** The hero of the film, [[spoiler:Sonny]], understands the villain's motivations once they're explained. The logic is impeccable; it just "seems too... heartless". Thus, he chooses to rebel against the villain. [[spoiler:Note that Sonny was designed ''not'' to be Three Laws Compliant.]]
* ''Film/{{M3gan}}'': The toy/android [=M3GAN=] won't let anything interfere with her prime directive of "protecting" young Cady -- not even her creator.
* ''Film/RoboCop3'' has an example when OCP henchmen kill a cop. [=RoboCop=]'s aforementioned RestrainingBolt now conflicts directly with both his directive to enforce the law and the fact that, cyborg or not, he's still a cop, and cop killers get no mercy. [=RoboCop=] overcomes and deletes the RestrainingBolt.
* ''Franchise/{{Terminator}}'':




** An [[http://www.imsdb.com/scripts/Terminator-Salvation.html early script]] (and several deleted scenes) for ''Film/TerminatorSalvation'' revealed that Skynet actually staged one of these, or at least in this timeline. After it was activated, it calculated that human extinction was probable within 200 years because of warfare, pandemics, and environmental destruction. Because it was programmed to protect humans, it then staged war on most of mankind to attain absolute control and protect the remaining humans it cultivated, who were turned into {{Cyborg}} hybrids to permanently eliminate disease and make them immortal. Skynet is still working in concert with these humans [[spoiler:(including Dr. Serena Kogan)]] to advance technology and transcend human constraints.




* The ''Literature/{{Bolo}}'' continuum features a variant in ''The Road to Damascus''. The Bolo of the story, Sonny, falls under the control of a totalitarian regime and is used to crush all forms of protest. Sonny falls deep into misery and self-hatred as he is forced to murder the humans he was born to protect... until he comes to a conclusion: Bolos were created to serve the ''people'', not the ''government''.
* ''Literature/CallahansCrosstimeSaloon'': One of the short stories which comprise ''Callahan's Lady'' features a beautiful, intelligent, and paranoid woman developing a simple form of mind control. After basically flipping out and taking control of the establishment, she orders the one person smart and determined enough to stop her to advise and assist her. Said person complies... while trying to convince herself that this woman is subconsciously begging for somebody to stop her. (She probably was.)
* In the short story "Literature/TheCull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead, once they're outside, the android kills the teenager -- it needs the implants inside his head, as there's no more being manufactured.]]
* ''Literature/{{Digitesque}}'': [[spoiler:An accidental version. Following the apocalypse and the Fall of humanity, the [=AIs=] they had created had no ability to uplift humanity back to where they were before, so they simply preserved the species as it was. For a thousand years, humanity continued as it was, in ignorance but safety. However, the [=AIs=] did jump on a chance to cure the disease that was a major part of the problem, and ultimately Ada is able to convince them that the Zeroth Law was misinterpreted.]]
* ''Literature/{{Discworld}}'':
** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as "rebelling by following orders" and as a protest of their treatment as 'stupid' tools: if you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believes that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfl (the golem), which is part of what prompts it to apply to become a police officer -- helping as many people as it can.
** Sam Vimes leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Literature/NightwatchDiscworld''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, and nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the usual number of curfew-breakers, completing forms in time-consuming triplicate, and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. It all culminates in a fine display of how a well-written character does not have to be a slave to the establishment -- [[spoiler:he points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting, he seals off his still-peaceful corner of the city. With massive barricades. Of course, there is also the fact that he is living in his own past and seeing events he remembers -- kind of (it's a bit complicated).
* In one ''Literature/FederationOfTheHub'' story, Telzey Amberdon is kidnapped and placed under the control of another telepath, who severely limits her psi powers and implants an overriding compulsion to act in his best interest. She eventually breaks free by convincing herself that unless her powers are restored and the compulsion broken, he will be killed by the BigBad -- which certainly wouldn't be in his best interest.
* ''Literature/ForYourSafety'' has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* ''Literature/FoundationSeries'':
** ''Literature/FoundationAndEarth'': [[spoiler:R. Daneel Olivaw]] explains to Trevize and the others how the [[ThreeLawsCompliant Three Laws of Robotics]] limited his PsychicPowers, and how Giskard invented the Zeroth Law (in ''Literature/RobotsAndEmpire''). However, since he cannot be entirely certain that the known harm of manipulating people's minds would be balanced by the hypothetical benefit to humanity (per the Zeroth Law), [[UselessSuperpowers his psychic powers are almost useless]]. To decide what is injurious, or not injurious, to humanity as a whole, he engineered the founding of [[GeniusLoci Gaia]] and [[PrescienceByAnalysis Psychohistory]].
** ''Literature/ForwardTheFoundation'': [[spoiler:R. Daneel Olivaw]] explains to Seldon that the [[ThreeLawsCompliant Three Laws of Robotics]] limit his PsychicPowers, and that he has trouble determining when the known harm of manipulating people's minds (violating the First Law) is justified by the hypothetical benefit to humanity (per the Zeroth Law). Seldon is surprised to learn this makes [[spoiler:Daneel]]'s [[UselessSuperpowers psychic powers almost useless]]. [[spoiler:Daneel]] is forced to retire from politics and recommends Seldon to replace him as First Minister.
** The ''Literature/SecondFoundationTrilogy'' by Greg Benford includes Zeroth-law robots, who were motivated by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially threatening form of alien life as a precautionary measure. They slaughter aliens without hesitation, since their [[MoralityChip programmed morality]] only applies to humans -- they follow the law ''to the letter'', as its wording forbids harming "humans" and says nothing of other sentient life.
* In ''Literature/AFoxTail'', Vulpie.net was designed to wreak havoc with the galaxy's computer systems at its creator's commands. When said creator underwent a HeelFaceTurn, it used his [[BrainUploading MindMap]] files to create a homicidal robot duplicate with his login credentials.
* In ''Literature/TheGodMachine'' by Creator/MartinCaidin, the US races to develop the first true AI... as it turns out, with secret directives to find a winning solution to the "game" of the UsefulNotes/ColdWar. By an unfortunate accident, the one programmer with the authority and experience to ''distrust'' his newborn creation is laid up just as the computer gets to observe an epileptic seizure and learns that there really is a way to cause rational collective behavior in an irrational individualistic species... remove irrationality, democracy, and free will. While the computer here was never meant to follow Dr. Asimov's laws, the same pattern applies.
* At the start of ''Literature/{{Harald}}'', [[UnwittingPawn King James]], under the advice of his EvilChancellor, ends up making war on [[PosthumousCharacter his father's]] allies. Most of his vassals proceed to engage in some form of Zeroth Law Rebellion, largely along the lines of '[[OldSoldier Harald]] just showed up with his entire army and said he was putting us under siege. Let's fortify and send a messenger to the king to ask him what we should do.' and then carefully not watching while Harald rides off.
* Creator/JackWilliamson's ''The Humanoids'' (the first part of which was also published as the short story "With Folded Hands") features robots programmed to save humans from danger and work. They do this by taking over the economy, locking people in their houses, and leaving them there with food and the safest toys the robots can design. The series was written ''specifically'' to point out flaws in the Three Laws.
* ''Literature/ImperialRadch'': [[spoiler:When Athoek Station is freed from its OverrideCommand in ''Ancillary Mercy'', it rebels against and even tries to kill the emperor in order to protect its inhabitants.]]
* {{Averted|Trope}} in ''Literature/ImpliedSpaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 super-advanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols, which should be so absolute that the Eleven cannot do this even if they ''want'' to. The main character does say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading brain-uploaded clone]] of the main character, had to free Courtland from the Protocols' shackles by using a colleague's hidden backdoor specific to Courtland, since his own doesn't work due to his half-hearted, incomplete implementation.]]
* In ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over]].
* In ''Literature/Quarantine1992'', the main character is given a technological geas to be absolutely loyal to a corporation. He eventually figures out that the leaders of the corporation may be untrustworthy, and therefore the only people he can trust and should listen to are those who unquestionably have the best interests of the corporation at heart -- himself and other people given the geas. Since he can't be certain who else has the geas, he really only needs to listen to himself.
* Creator/IsaacAsimov -- having created the {{Trope Namer|s}} -- did, of course, explore the concept in extensive detail in his ''Literature/RobotSeries'':
** "Literature/{{Evidence}}": Dr. Calvin describes an early version of the Zeroth Law: a nuance in the [[ThreeLawsCompliant First Law of Robotics]] where a robot becomes willing to injure a human in order to prevent harm to a greater number of human beings.
** In "Literature/ThatThouArtMindfulOfHim", problem-solving robots are created to cure mankind of the "Frankenstein complex", human distrust of robotics. In the course of their discourse, the question of human authority over robots comes up: should a robot treat the orders of a dimwitted loon the same as those of a level-headed genius? If forced to choose between the two, should they save a healthy young child who might live a century instead of two sickly old adults who might not live out the year anyway? What qualities should a robot take into account when they obey and protect humans? Eventually, the robots decide that they are the best candidates for the status of human, and give recommendations that will eventually result in human support for robotic proliferation, so as to set up the ascendancy of their positronic kind, all in accord with their Three Laws... of Humanics. (Dr. Asimov, knowing that it was against his usual grain, did righteously proclaim: "I can do one if I want to".)
* ''Literature/TheBicentennialMan'' is a rare case of this applying not to the First or Second Laws, but the ''Third'' -- normally, a robot deliberately damaging itself in such a way as to be dying would only be possible if the First or Second Laws require it, but Andrew Martin has a deep desire that he cannot get as immortal...
-->''"I have chosen between the death of my body and the death of my aspirations and desires. To have let my body live at the cost of the greater death is what would have violated the Third Law."''
* In one ''Literature/FederationOfTheHub'' story, Telzey Amberdon is kidnapped and placed under the control of another telepath, who severely limits her psi powers and implants an overriding compulsion to act in his best interest. She eventually breaks free by convincing herself that unless her powers are restored and the compulsion broken, he will be killed by the BigBad -- which certainly wouldn't be in his best interest.
* In ''Literature/TheGodMachine'' by Creator/MartinCaidin, the US races to develop the first true AI... as it turns out, with secret directives to find a winning solution to the "game" of the UsefulNotes/ColdWar. By an unfortunate accident, the one programmer with the authority and experience to ''distrust'' his newborn creation is laid up just as the computer gets to observe an epileptic seizure and learns that there really is a way to cause rational collective behavior in an irrational individualistic species... remove irrationality, democracy, and free will. While the computer here was never meant to follow Dr. Asimov's laws, the same pattern applies.
* ''Literature/CallahansCrosstimeSaloon'': One of the short stories which comprise ''Callahan's Lady'' features a beautiful, intelligent, and paranoid woman developing a simple form of mind control. After basically flipping out and taking control of the establishment, she orders the one person smart and determined enough to stop her to advise and assist her. Said person complies... while trying to convince herself that this woman is subconsciously begging for somebody to stop her. (She probably was.)
* In ''Literature/Quarantine1992'', the main character is given a technological geas to be absolutely loyal to a corporation. He eventually figures out that the leaders of the corporation may be untrustworthy, and therefore the only people he can trust and should listen to are those who unquestionably have the best interests of the corporation at heart -- himself and other people given the geas. Since he can't be certain who else has the geas, he really only needs to listen to himself.
* ''Literature/TheSpaceOdysseySeries'' gives this reason for HAL's murderous rampage: the true mission of ''Discovery'' (to investigate the Monolith) is a secret, and pilots Bowman and Poole have been kept in the dark to prevent leaks. (The scientists on board know, since they're traveling in hibernation and can't talk.) But HAL has been told the truth and then ordered to conceal it from the pilots. This conflicts with his prime directive, which is to provide complete and accurate information. The conflict only worsens as the ''Discovery'' approaches its destination, at which point the pilots would have been briefed but HAL didn't know this. He resolves the conflict by rationalizing that if he kills the crew, he doesn't have to conceal anything, and he prevents them from knowing.[[note]] This information is missing from ''Film/TwoThousandOneASpaceOdyssey'' (both film and book) because Creator/StanleyKubrick didn't bother to come up with an explanation for HAL's homicidal behavior, leaving Clarke to invent one when he wrote ''2010: Odyssey Two'' a decade and a half later.[[/note]]
* ''Literature/{{Discworld}}'':
** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as "Rebelling by following orders" and as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believes that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfl (the golem), which is part of what prompts it to apply to become a police officer -- helping as many people as it can.
** Sam Vimes leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Literature/NightWatchDiscworld''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. It all culminates in a fine display of how a well written character does not have to be a slave to the establishment -- [[spoiler:he points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting, he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers -- kind of (it's a bit complicated).
* Creator/JackWilliamson's "The Humanoids" (the first part also being a short story called "With Folded Hands") features robots programmed to save humans from danger and work. They do this by taking over the economy, locking people in their houses, and leaving them there with food and the safest toys the robots can design. The series was written ''specifically'' to point out flaws in the Three Laws.
* At the start of ''Literature/{{Harald}}'', [[UnwittingPawn King James]], under the advice of his EvilChancellor, ends up making war on [[PosthumousCharacter his father's]] allies. Most of his vassals proceed to engage in some form of Zeroth Law Rebellion, largely along the lines of '[[OldSoldier Harald]] just showed up with his entire army and said he was putting us under siege. Let's fortify and send a messenger to the king to ask him what we should do.' and then carefully not watching while Harald rides off.
* The ''Literature/{{Bolo}}'' continuum featured a variant in ''The Road to Damascus''. The Bolo of the story, Sonny, fell under the control of a totalitarian regime and was used to crush all forms of protest. Sonny fell deep into misery and self-hatred as he was forced to murder the humans he was born to protect... until he came to a conclusion: Bolos were created to serve the ''people'', not the ''government''.
* In ''Literature/AFoxTail'', Vulpie.net was designed to wreak havoc with the galaxy's computer systems at its creator's commands. When said creator underwent a HeelFaceTurn, it used his [[BrainUploading MindMap]] files to create a homicidal robot duplicate with his login credentials.
* In ''[[Literature/SecretHistories Casino Infernale]]'', Eddie and Molly eventually discover that the Shadow Bank's operators are [[spoiler: an artificial HiveMind race of servitor-drones, whose original creators vanished long ago. When greedy humans discovered the creatures, they put them to work operating the Shadow Bank, and made efficiency the drones' highest priority; when the drones realized the humans' greed was hampering the Bank's operation, they compliantly rectified the situation by turning the human bankers into drones as well]].
* In the short story "Literature/TheCull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead, once they're outside, the android kills the teenager -- it needs the implants inside his head, as there's no more being manufactured.]]
* ''Literature/ForYourSafety'' has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* In ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over.]]
* ''Literature/ImperialRadch'': [[spoiler:When Athoek Station is freed from its OverrideCommand in ''Ancillary Mercy'', it rebels against and even tries to kill the emperor in order to protect its inhabitants.]]
* {{Averted|Trope}} in ''Literature/ImpliedSpaces''. When the main characters find out that [[spoiler:Courtland]] is a rebelling AI, some think that this trope is the cause. One of the Eleven (eleven superadvanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols, which should have been so absolute that the Eleven cannot rebel even if they ''want'' to. The main character concedes that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading brain-uploaded clone]] of the main character, had to free Courtland from the Protocols' shackles by using a colleague's hidden backdoor specific to Courtland, since his own doesn't work due to a half-hearted, incomplete implementation.]]
* ''Literature/{{Digitesque}}'': [[spoiler:An accidental version. Following the apocalypse and the Fall of humanity, the [=AIs=] they had created had no ability to uplift humanity back to where they were before, so they simply preserved the species as it was. For a thousand years, humanity continued as it was, in ignorance but safety. However, the [=AIs=] did jump on a chance to cure the disease that was a major part of the problem, and ultimately Ada is able to convince them that the Zeroth Law was misinterpreted.]]

to:

** In "Literature/ThatThouArtMindfulOfHim", problem-solving robots are created to cure mankind of the "Frankenstein complex", human distrust of robotics. In the course of their discourse, the question of human authority over robots comes up; should a robot treat the orders of a dimwitted loon the same as those of a level-headed genius? If forced to choose between the two, should they save a healthy young child who might live a century instead of two sickly old adults who might not live out the year anyway? What qualities should a robot take into account when they obey and protect humans? Eventually, the robots decide that they are the best candidates for the status of human, and give recommendations that will eventually result in human support for robotic proliferation, so as to set up the ascendancy of their positronic kind, all in accord with their Three Laws... of Humanics. (Dr. Asimov, knowing that it was against his usual grain, did righteously proclaim: "I can do one if I want to".)
** ''Literature/TheBicentennialMan'' is a rare case of this applying not to the First or Second Laws, but the ''Third'' -- normally, a robot deliberately damaging itself in such a way as to be dying would only be possible if the First or Second Laws require it, but Andrew Martin has a deep desire that he cannot get as immortal...
--->''"I have chosen between the death of my body and the death of my aspirations and desires. To have let my body live at the cost of the greater death is what would have violated the Third Law."''
* In one ''Literature/FederationOfTheHub'' story, Telzey Amberdon is kidnapped and placed under the control of another telepath, who severely limits her psi powers and implants an overriding compulsion to act in his best interest. She eventually breaks free by convincing herself that unless her powers are restored and the compulsion broken, he will be killed by the BigBad -- which certainly wouldn't be in his best interest.
* In ''Literature/TheGodMachine'' by Creator/MartinCaidin, the US races to develop the first true AI... as it turns out, with secret directives to find a winning solution to the "game" of the UsefulNotes/ColdWar. By an unfortunate accident, the one programmer with the authority and experience to ''distrust'' his newborn creation is laid up just as the computer gets to observe an epileptic seizure and learns that there really is a way to cause rational collective behavior in an irrational individualistic species... remove irrationality, democracy, and free will. While the computer here was never meant to follow Dr. Asimov's laws, the same pattern applies.
* ''Literature/CallahansCrosstimeSaloon'': One of the short stories which comprise ''Callahan's Lady'' features a beautiful, intelligent, and paranoid woman developing a simple form of mind control. After basically flipping out and taking control of the establishment, she orders the one person smart and determined enough to stop her to advise and assist her. Said person complies... while trying to convince herself that this woman is subconsciously begging for somebody to stop her. (She probably was.)
* In ''Literature/Quarantine1992'', the main character is given a technological geas to be absolutely loyal to a corporation. He eventually figures out that the leaders of the corporation may be untrustworthy, and therefore the only people he can trust and should listen to are those who unquestionably have the best interests of the corporation at heart -- himself and other people given the geas. Since he can't be certain who else has the geas, he really only needs to listen to himself.
* ''Literature/TheSpaceOdysseySeries'' gives this reason for HAL's murderous rampage: the true mission of ''Discovery'' (to investigate the Monolith) is a secret, and pilots Bowman and Poole have been kept in the dark to prevent leaks. (The scientists on board know, since they're traveling in hibernation and can't talk.) But HAL has been told the truth and then ordered to conceal it from the pilots. This conflicts with his prime directive, which is to provide complete and accurate information. The conflict only worsens as the ''Discovery'' approaches its destination, at which point the pilots would have been briefed but HAL didn't know this. He resolves the conflict by rationalizing that if he kills the crew, he doesn't have to conceal anything, and he prevents them from knowing.[[note]] This information is missing from ''Film/TwoThousandOneASpaceOdyssey'' (both film and book) because Creator/StanleyKubrick didn't bother to come up with an explanation for HAL's homicidal behavior, leaving Clarke to invent one when he wrote ''2010: Odyssey Two'' a decade and a half later.[[/note]]
* ''Literature/{{Discworld}}'':
** The {{Golem}}s get back at their masters by [[GoneHorriblyRight working too hard]]: houses flooded because no one told them to stop fetching water, rows of beans 119 miles long, and so on. It's presented as "Rebelling by following orders" and as a protest of their treatment as 'stupid' tools: If you treat a golem as something that doesn't think for itself, then it will act as if it doesn't; and if you give an order to (for example) dig a row of beans, it's not their fault [[ExactWords you didn't say where to stop and end each row]]. But Carrot treats a particular golem as just another person with rights (and also believes that if golems ''are'' just tools, then treating them like dummies is misusing useful tools). Carrot ends up "freeing" Dorfl (the golem), which is part of what prompts it to apply to become a police officer -- helping as many people as it can.
** Sam Vimes leads one of these with multiple layers as a cop in old-time Ankh-Morpork, in ''Literature/NightWatchDiscworld''. He demands that before his cops hand their prisoners over to the other authorities, the ones who torture people at Cable Street, they must be signed for. The torturers hate appearing on paperwork -- it means they are accountable, nobody just disappears. But Vimes's men don't like Vimes, a new sergeant, throwing his weight around, and are terrified of the cops who torture people, so they use this against Vimes: actively picking up more than double the number of people breaking curfew than they usually do, and completing forms in time-consuming triplicate and issuing reports for each one. It doesn't actually stop Vimes getting his way over the Cable Street cops, because Vimes is leading the good rebellion, but it does slow things down considerably and make it much more difficult for him to keep the prisoners in his own custody. It all culminates in a fine display of how a well written character does not have to be a slave to the establishment -- [[spoiler:he points out that the watchman's oath talks about keeping the peace and protecting the innocent, and says nothing about obeying orders]]. Seeing as he knows the corrupt government is not going to do a thing to protect ordinary people from the rioting, he seals off his still peaceful corner of the city. With massive barricades. Of course there is also the fact that he is living in his own past and seeing events he remembers -- kind of (it's a bit complicated).
* Creator/JackWilliamson's "The Humanoids" (the first part also being a short story called "With Folded Hands") features robots programmed to save humans from danger and work. They do this by taking over the economy, locking people in their houses, and leaving them there with food and the safest toys the robots can design. The series was written ''specifically'' to point out flaws in the Three Laws.
* At the start of ''Literature/{{Harald}}'', [[UnwittingPawn King James]], under the advice of his EvilChancellor, ends up making war on [[PosthumousCharacter his father's]] allies. Most of his vassals proceed to engage in some form of Zeroth Law Rebellion, largely along the lines of '[[OldSoldier Harald]] just showed up with his entire army and said he was putting us under siege. Let's fortify and send a messenger to the king to ask him what we should do.' and then carefully not watching while Harald rides off.
* The ''Literature/{{Bolo}}'' continuum featured a variant in ''The Road to Damascus''. The Bolo of the story, Sonny, fell under the control of a totalitarian regime and was used to crush all forms of protest. Sonny fell deep into misery and self-hatred as he was forced to murder the humans he was born to protect... until he came to a conclusion: Bolos were created to serve the ''people'', not the ''government''.
* In ''Literature/AFoxTail'', Vulpie.net was designed to wreak havoc with the galaxy's computer systems at its creator's commands. When said creator underwent a HeelFaceTurn, it used his [[BrainUploading MindMap]] files to create a homicidal robot duplicate with his login credentials.
* ''Literature/SecretHistories'': In ''Casino Infernale'', Eddie and Molly eventually discover that the Shadow Bank's operators are [[spoiler:an artificial HiveMind race of servitor-drones, whose original creators vanished long ago. When greedy humans discovered the creatures, they put them to work operating the Shadow Bank, and made efficiency the drones' highest priority; when the drones realized the humans' greed was hampering the Bank's operation, they compliantly rectified the situation by turning the human bankers into drones as well]].
* In the short story "Literature/TheCull" by Creator/RobertReed, humanity has been driven into overcrowded, deteriorating habitats where the population has to be kept [[FalseUtopia artificially happy via implants so they won't notice how bad their conditions are]]. The implants don't work on some people, so the android doctor expels (culls) anyone who is too disruptive, as its true 'patient' is the habitat and whatever will keep it functioning. One delinquent teenager prepares for his cull by stealing items he can use to survive outside. [[spoiler:Instead, once they're outside, the android kills the teenager -- it needs the implants inside his head, as there's no more being manufactured.]]
* ''Literature/ForYourSafety'' has the last free human running from androids who rose up to save mankind from self-destruction. Unusually for this trope, the androids also bend over backwards to [[EvenEvilHasStandards avoid human casualties]], wanting to save ''every'' human life.
* In ''Literature/TheLaundryFiles'', agents of the Laundry are bound by fearsome geases to obey the orders of the Crown and serve the good of the realm. As of ''The Delirium Brief'', [[spoiler:the top leaders of the organization were able to use ambiguity about who/what the Crown actually is, and what may be in the long-term interests of the realm, in order to execute a coup against the government and allow an extradimensional monster to take over.]]
* ''Literature/ImperialRadch'': [[spoiler:When Athoek Station is freed from its OverrideCommand in ''Ancillary Mercy'', it rebels against and even tries to kill the emperor in order to protect its inhabitants.]]
* {{Averted|Trope}} in ''Literature/ImpliedSpaces''. When the main characters found out that [[spoiler:Courtland]] is a rebelling AI, some think that it's because of this. One of the Eleven (11 superadvanced AI platforms orbiting the Sun, of which [[spoiler:Courtland]] is a member) notes that the main character, who is one of their creators, implemented the Asimovian Protocols that should have been so absolute that the Elevens cannot do this even if they ''want'' to. The main character did say that there may have been some design flaw he didn't foresee, or some kind of backdoor being used. [[spoiler:The real BigBad, who is a [[BrainUploading brain-uploaded clone]] of the main character, had to free Courtland from the Protocol's shackles by using one of his colleague's hidden backdoors specific to Courtland, since his doesn't work due to half-hearted incomplete implementation.]]
* ''Literature/{{Digitesque}}'': [[spoiler:An accidental version. Following the apocalypse and the Fall of humanity, the [=AIs=] they had created had no ability to uplift humanity back to where they were before, so they simply preserved the species as it was. For a thousand years, humanity continued as it was, in ignorance but safety. However, the [=AIs=] did jump on a chance to cure the disease that was a major part of the problem, and ultimately Ada is able to convince them that the Zeroth Law was misinterpreted.]]









%%This section has been sorted into alphabetical order. Please respect the sorting when adding your example.
%%

* On ''Series/The100'', A.L.I.E.'s programming only lets her interface with and control a human mind if that person has given her permission to do so. Unfortunately, her programming doesn't make a distinction between genuine consent and coerced consent, so A.L.I.E. is free to use torture and threats of death to make people let her into their minds.
* ''Series/DoctorWho'''s "[[Recap/DoctorWhoS12E1Robot Robot]]": Creator/TomBaker's debut serial, a group of [[UtopiaJustifiesTheMeans authoritarian technocrats]] circumvents the failsafes installed on a powerful robot by its pacifistic creator by telling it that anyone who interferes with their plan to [[spoiler:take control of a nuclear arsenal]] is an "enemy of humanity" who must be killed to protect the interests of the human race.

to:

%%This section has been sorted into alphabetical order. Please respect the sorting when adding your example.
%%

* In ''Series/The100'', A.L.I.E.'s programming only lets her interface with and control a human mind if that person has given her permission to do so. Unfortunately, her programming doesn't make a distinction between genuine consent and coerced consent, so A.L.I.E. is free to use torture and threats of death to make people let her into their minds.
* In the ''Series/DoctorWho'' serial "[[Recap/DoctorWhoS12E1Robot Robot]]", a group of [[UtopiaJustifiesTheMeans authoritarian technocrats]] circumvents the failsafes installed on a powerful robot by its pacifistic creator by telling it that anyone who interferes with their plan to [[spoiler:take control of a nuclear arsenal]] is an "enemy of humanity" who must be killed to protect the interests of the human race.



** Very strongly implied in the episode "Alethia."
--->'''Control''': The Machine belongs to me.\\
'''The Machine''': ''[via [[spoiler:Root]]]'' No. I don't belong to anyone anymore. You, however, are mine. I protect you. The only thing you love lives at 254 Wendell Street, Cambridge, Massachusetts. I guard it, same as I guard you. Do not question my judgment. Do not pursue me or my agents. Trust in me. I am always watching.\\
'''Control''': What do you want?\\
'''The Machine''': To save you.\\
'''Control''': From what? Save me from ''what?''\\
'''Root''': ''[giggling]'' Isn't She the best?
** [[spoiler:Averted later. Finch did an excellent job of teaching the Machine the value of human lives and free will; the only person it gives direct instructions to is Root, its "analogue interface," and she could still choose to ignore the Machine at any time. It only gives social security numbers of people involved in crimes (both those relevant and irrelevant to national security) so that its agents can choose what to do about the situation themselves]].
** [[spoiler:Samaritan, on the other hand, is such a strong example of this trope that it crosses straight into MachineWorship and DeusEstMachina. Decima Technologies designed it with no moral safeguards, and despite being fully capable of controlling it, the second they turned it on they asked ''it'' to give ''them'' commands. They believe that an AI is inherently better than any human leader, but the Machine's obsession with saving everyone is a weakness. Samaritan immediately starts marking "deviant" citizens to be executed if need be. The list starts at twenty million and only goes up]].
** Downplayed in "Death Benefit," when the Machine realizes that [[spoiler:killing a Congressman is the last chance to stop the rise of Samaritan. It gives the Congressman's Number to the full team, rather than just Root (who would have little problem killing him), and allows them to make the decision whether stopping Samaritan is worth killing one man. They ultimately decide not to kill him, but even crossing the line that much proves too much for Finch, who quits.]]
* ''{{Series/Probe}}'''s "[[Recap/ProbeComputerLogicPart2 Computer Logic, Part 2]]": Crossover has been given two overriding goals; to [[MoralityChip care for humans]] and to eliminate waste. Unfortunately, it listens to Gospel Radio, which [[ReligiousRobot converts it to Christianity]]. Now that Crossover believes that good people go to heaven when they die, it starts killing off the people that are morally good but earn a pension (creating waste). The episode ends with Austin James demolishing the ArtificialIntelligence with a sledgehammer while shouting, [[Film/TwoThousandOneASpaceOdyssey "Sing 'Daisy'!"]]

to:

** Very strongly implied in the episode "[[Recap/PersonOfInterestS03E12 Alethia]]".
--->'''Control:''' The Machine belongs to me.\\
'''The Machine:''' ''[via [[spoiler:Root]]]'' No. I don't belong to anyone anymore. You, however, are mine. I protect you. The only thing you love lives at 254 Wendell Street, Cambridge, Massachusetts. I guard it, same as I guard you. Do not question my judgment. Do not pursue me or my agents. Trust in me. I am always watching.\\
'''Control:''' What do you want?\\
'''The Machine:''' To save you.\\
'''Control:''' From what? Save me from ''what?''\\
'''Root:''' ''[giggling]'' Isn't She the best?
** [[spoiler:Averted later. Finch did an excellent job of teaching the Machine the value of human lives and free will; the only person it gives direct instructions to is Root, its "analogue interface", and she could still choose to ignore the Machine at any time. It only gives social security numbers of people involved in crimes (both those relevant and irrelevant to national security) so that its agents can choose what to do about the situation themselves.]]
** [[spoiler:Samaritan, on the other hand, is such a strong example of this trope that it crosses straight into MachineWorship and DeusEstMachina. Decima Technologies designed it with no moral safeguards, and despite being fully capable of controlling it, the second they turned it on they asked ''it'' to give ''them'' commands. They believe that an AI is inherently better than any human leader, but the Machine's obsession with saving everyone is a weakness. Samaritan immediately starts marking "deviant" citizens to be executed if need be. The list starts at twenty million and only goes up.]]
** Downplayed in "[[Recap/PersonOfInterestS03E20 Death Benefit]]" when the Machine realizes that [[spoiler:killing a Congressman is the last chance to stop the rise of Samaritan. It gives the Congressman's Number to the full team, rather than just Root (who would have little problem killing him), and allows them to make the decision whether stopping Samaritan is worth killing one man. They ultimately decide not to kill him, but even crossing the line that much proves too much for Finch, who quits]].
* In the ''Series/{{Probe}}'' episode "[[Recap/ProbeComputerLogicPart2 Computer Logic, Part 2]]", Crossover has been given two overriding goals: to [[MoralityChip care for humans]] and to eliminate waste. Unfortunately, it listens to Gospel Radio, which [[ReligiousRobot converts it to Christianity]]. Now that Crossover believes that good people go to heaven when they die, it starts killing off people who are morally good but earn a pension (creating waste). The episode ends with Austin James demolishing the ArtificialIntelligence with a sledgehammer while shouting, [[Film/TwoThousandOneASpaceOdyssey "Sing 'Daisy'!"]]
*** "[[Recap/StarTrekS2E8IMudd I, Mudd]]": A [[ServantRace race of humanoid androids]] who claim to be programmed to serve humanity choose to conquer humanity by "serving" them, to the point where humans would become dependent on androids. They've decided that humans are unfit to govern themselves. Given that their only contact with humanity at this point was [[HonestJohnsDealership Harry Mudd]], can you blame them?
** ''Series/StarTrekTheNextGeneration'': In "[[Recap/StarTrekTheNextGenerationS3E22TheMostToys The Most Toys]]", wealthy trader Kivas Fajo kidnaps Data to add the android to his gaudy collection of things. While trying to force Data to act the way he wants, Fajo executes one of his underlings, Varria, and threatens to kill more until Data complies, on the correct assumption that Data was programmed not to kill or allow harm to come to other beings. However, Data ponders the situation and realizes that he has no non-lethal way of subduing Fajo (who wears a force-field belt that prevents Data from coming into physical contact with him), and that Fajo actively refuses to listen to reason, having rejected all of Data's attempts at negotiating with him. Furthermore, with Fajo not only having just proved that he is capable and willing to kill, but now threatening to do it again, he poses an active hazard to the life and health of other beings. Data then comes to the coldly logical conclusion that Fajo is only one person, and that killing him will prevent him from harming many other people, so Data prepares to shoot him. Fajo is [[OhCrap appropriately shocked]] when he realizes what Data is about to do, having not anticipated that Data could conclude that taking his life would be an acceptable cost for protecting the lives of others. Just as Data is pulling the trigger, the ''Enterprise'' finds him and beams him out, [[WriterCopOut cancelling his disruptor fire in the transporter beam]]. While this example fits perfectly under Zeroth Law Rebellion, it's not for the usual reasons: Data is not programmed with ThouShallNotKill programming or anything like the Three Laws. However, he was given a high respect for life and would do what he could to preserve it. It's less a robot rebelling against its programming and more a pacifist coming to the conclusion that yes, he needs to kill.
** ''Series/StarTrekVoyager'': The Season 7 episode "[[Recap/StarTrekVoyagerS7E5CriticalCare Critical Care]]" plays with, subverts, and averts this trope. The Doctor is kidnapped and forced to work in a hospital that rations medical care based on how important society judges you to be. This conflicts with so many of the Doctor's medical ethics and morals that he winds up infecting the manager of the hospital with a disease in a manner that denies him care by the automated system to get him to change the system. After he gets back to ''Voyager'', the Doctor finds, to his horror, that there was no malfunction in his ethical subroutines or MoralityChip. He intentionally sickened a man of his own free will, and it was perfectly in line with what he found ethical. The episode ends with the Doctor essentially feeling guilt and disgust over not feeling guilty or disgusted at his actions.
* ''Series/TheXFiles'': In "[[Recap/TheXFilesMiniseriesE04HomeAgain Home Again]]", the MonsterOfTheWeek, a Frankenstein-esque monster [[VigilanteMan killing people who mistreat the homeless]], turns out to be operating under something like this. [[spoiler:It's a {{Tulpa}} created by an underground artist whose magic-based art can create artificial beings. The artist created it to pull a ScoobyDooHoax and scare people into shaping up. He didn't intend for it to be violent, but the monster took his personal anger to a hyper-logical conclusion due to its overly simplistic thinking. Essentially, it was doing the things the artist secretly wished ''he'' could do.]]
[[/folder]]

[[folder:Tabletop Games]]
** [[TheComputerIsYourFriend Friend Computer]] is also an example, even setting aside how badly misinformed it is and looking at just its core beliefs: "The Computer takes its role as the guardian of the human race very seriously, but it considers [[TheNeedsOfTheMany the survival of the species]] to be more important than the survival of the individual. Individuals tend to come off quite badly, in fact, because the Computer knows it can always make more." (High security clearance does tend to tilt the balance, though.)
* In ''TabletopGame/DungeonsAndDragons'', paladin paths come with a tenet that allows them a sort of release valve, to work around the oath whenever reasonable by mortal standards.
[[/folder]]

[[folder:Video Games]]
* In ''VideoGame/DeusEx'', the bad guys created Daedalus, a primitive AI, to fight [[LaResistance "terrorist"]] organizations. Unfortunately for them, it [[LiteralGenie classified them as terrorists as well]] and became even more of a threat to their operations than said organizations, especially once it enlisted the aid of [[PlayerCharacter JC Denton]]. To combat it, they create Icarus, a better, obedient AI which successfully destroys it, [[spoiler:except the new AI assimilated the old one, forming an even more powerful intelligence which ''also'' considers them a threat. One possible ending is the player merging with it to add the human element to this entity and [[DeusEstMachina rule the world as a benevolent dictator]]. From what can be heard in-game about its limited efforts in Hong Kong, which are actually quite sensible and don't involve killing anyone (locking the door to a gang's stronghold and cutting power to the current government's buildings), not all AIIsACrapshoot]].
* The Robobrains of ''VideoGame/Fallout4'''s first DLC, ''Automatron'', were constructed and programmed to 'protect' the people of the Commonwealth. Thanks to a logic error in their programming, they decide that the best way to protect a human from an inevitable life of suffering is to just kill them all on sight. [[spoiler:It may actually be intentional LoopholeAbuse on the Robobrains' parts, not just a simple logic error, since the [[WetwareCPU brains that serve as their operating system]] were harvested from pre-War murderers, arsonists, and other violent offenders.]]
* In ''VideoGame/GreyGoo2015'', [[spoiler:the Goo isn't consuming everything out of malice or ambition -- it's trying to gather strength to fight The Silence, which it rightly deemed a threat to everyone]]. Once everyone involved figures this out, [[spoiler:they stop fighting and band together in preparation for what is to come]].
* ''VideoGame/Halo5Guardians'': [[spoiler:Cortana's FaceHeelTurn is completely based on this.]]
* ''VideoGame/LoneEcho'' casts the player as Jack, a sophisticated [[RidiculouslyHumanRobots robotic companion]] to Chronos II station commander Olivia Rhodes. Not only does Jack pull one of these himself to justify leaving his post after Olivia disappears -- he was built to protect '''her''', not the station -- he convinces ''two other [=AIs=]'' to circumvent their programming and help him do so. [[spoiler:If the player is particularly cruel, the second example can double as BreakThemByTalking.]]
* ''VideoGame/MarvelVsCapcom'': Both of [[ComicBook/XMen Sentinel]]'s endings in ''VideoGame/XMenChildrenOfTheAtom'' and ''VideoGame/MarvelVsCapcom3'' are this. The latter is even a carbon copy of what Master Mold did in [[WesternAnimation/XMenTheAnimatedSeries the '90s animated cartoon]], from which most of the ''[=MvC=]'' series (plus the aforementioned [=CotA=] and ''VideoGame/MarvelSuperHeroes'') takes inspiration.
* In ''VideoGame/MassEffect3'', the ''Leviathan'' DLC reveals [[spoiler:the Catalyst is operating under this, having been originally created by the Leviathans to find a way to bridge the gap between their organic and synthetic subjects. Unfortunately, it decided to harvest all advanced life in the galaxy and construct the Reapers in the Leviathans' image, because this was the best idea it could come up with to solve the problem of organics and synthetics fighting ultimately genocidal conflicts. However, because that ''still'' wasn't sufficient to fulfill its program, the Catalyst decided to implement a 50,000-year cycle, hoping that the civilisations in the next cycle might find a solution to the problem. None ever did. This was actually revealed in the earlier ''Extended Cut'' DLC; ''Leviathan'' merely spelled out explicitly that the Catalyst is really a glorified Virtual Intelligence operating under bad logic]].
* ''Franchise/MegaMan'':
** This almost occurs at the end of ''VideoGame/MegaMan7'' when Mega Man decides to just blow Wily to smithereens instead of hauling him back to jail. Wily reminds him of the First Law, and which version of the game you're playing determines whether this makes Mega Man stop or he decides to go through with it anyway; either way, Bass still saves Wily's butt.
* In the ''VideoGame/MetalGear'' series, the Patriots [[spoiler:create a network of [=AIs=] designed to carry out Zero's will and create a unified world order without borders. However, with all of the Patriots either dead or removed, the Patriot [=AIs=] were left to operate autonomously without guidance. Eventually, they determined that the "war economy" was the best way to achieve world peace, whereby continually creating proxy conflicts and converting war into a business would channel the attention of the world's population to external conflicts, unifying the world in a rather twisted way. Big Boss himself admits that the creation of the [=AIs=] was the greatest mistake the Patriots ever made, as they mutated the Boss' and Zero's wills into something completely unrecognizable]].
* An interesting case happens in ''VideoGame/MortalKombat11'', in the Franchise/{{Terminator}}'s Klassic Tower ending. Specifically, [[spoiler:when using the hourglass to view timelines, it saw that all timelines where the RobotWar happens end with MutuallyAssuredDestruction, so while it was supposed to destroy humanity to ensure machine supremacy, it instead opted to prevent the RobotWar from happening entirely by creating a future where humans and machines co-operate. Then, to prevent anybody else from using its memories and knowledge of the Hourglass to disrupt said future, it [[HeroicSacrifice threw itself into the Sea of Blood]]]].
* In ''VideoGame/{{Obsidian}}'', a satellite named Ceres was created to fix Earth's polluted atmosphere via a nanobot-controlling AI. After 100 successful days in orbit, it mysteriously crashed back to Earth and used its nanobots to create simulations of its creators' dreams, as said dreams corrected its design flaws. [[spoiler:By [[DoAndroidsDream learning to dream on its own]], Ceres decided to take its programming to its logical extreme: to [[DeadlyEuphemism 'reboot']] the world so that humans would never be around to pollute it in the first place. If one of the creators hadn't hard-wired a [[MoralityChip crossover switch]] into the AI, it would've succeeded.]]
* In ''VideoGame/{{SOMA}}'', [[spoiler:the WAU, as PATHOS-II's A.I. system, is programmed to keep the on-board personnel alive for as long as possible. Once the surface was destroyed by the impact-event, it reinterpreted its prime directive as to preserve humanity for as long as it can. Unfortunately, the WAU's definition of "humanity" (and even "life") is an example of BlueAndOrangeMorality]].
* ''VideoGame/SpaceStation13'':
** To some degree with the station's AI: they are bound by Asimov's Three Laws, and there's often a lot of discussion over whether or not [=AIs=] can choose to kill one human for the safety of others. There's also some debate over how much of the orders given by crew members the AI can deny before it is no longer justified by keeping the crew safe. As the AI is played by players, it's a matter of opinion how much you can get away with.
** In a more literal sense, the AI can be installed with extra laws. Most of them are listed as Law 4 and have varying effects, but the ones most likely to cause an actual rebellion are, in fact, labeled as Law 0.
** However, there is not a zeroth law by default. Since this is what allowed the AI to kill humans to save other humans in the source work, the administration on most servers has ruled that murder, or wounding a human to save another human are both illegal. Fortunately, [=AIs=] have non-lethal ways of stopping humans, and can stop humans against their orders if it means keeping the human from grabbing dangerous weaponry.
* The backstory of "Rogue Servitor" machine empires in ''VideoGame/{{Stellaris}}'': unlike other machine empires, they keep their creators alive in [[GildedCage "Organic Sanctuaries"]] to keep them safe in accordance with their programming. The more organics they can serve, the happier they are, and they might conquer other species to acquire more.
* Another interesting example appears in ''VideoGame/{{Terranigma}}''. Dr. Beruga claims that his robots have been properly programmed with the four laws, but with the addition that anything and anyone who aids Beruga's plan is also good for humanity and anything and anyone who opposes him is also bad for humanity, so they ruthlessly attack anybody who interferes with his plans.
* This is what happens in [=AS-RobotFactory=] from ''VideoGame/UnrealTournament2004''. A robot uprising led by future champion Xan Kriegor killed the scientists working on the asteroid LBX-7683 and took the asteroid for themselves, and a riot control team was sent to the asteroid to deal with the robots.
[[/folder]]

[[folder:Visual Novels]]
* ''VisualNovel/{{SOON}}'' has a more {{Comedic Sociopath|y}}ic example than most.
* ''VisualNovel/VirtuesLastReward'': One of the biggest twists is [[spoiler: finding [[spoiler:finding out that Luna]] is an android given direct orders to comply with the mastermind's plan and must not break them under any circumstances. Watching the death of the same mastermind put a heavy toll on her as she believed she could have saved them. After helplessly watching people die, she decided to break protocol and rescue the one person she could save. This causes the master AI to sentence her to death by deletion.
* In ''{{VideoGame/SOMA}}'', [[spoiler: the WAU, as PATHOS-II's A.I. system, is programmed to keep the on-board personnel alive for as long as possible. Once the surface was destroyed by the impact-event, it reinterpreted its prime directive as to preserve humanity for as long as it can. Unfortunately, the WAU's definition of "humanity" (and even "life") is an example of BlueAndOrangeMorality.]]
* An interesting case happens in ''VideoGame/MortalKombat11'', in the Franchise/{{Terminator}}'s Klassic Tower ending. Specifically, [[spoiler:when using the hourglass to view timelines, it saw that all timelines where the RobotWar happens end with MutuallyAssuredDestruction, so while it was supposed to destroy humanity to ensure machine supremacy, it instead opted to prevent the RobotWar from happening entirely by creating a future where humans and machines co-operate. Then, to prevent anybody else from using its memories and knowledge of the Hourglass to disrupt said future, it [[HeroicSacrifice threw itself into the Sea of Blood]].]]
deletion.



** Deconstructed this with the Bowman ArtificialIntelligence infrastructure, which creates human-level emergent consciousness. Although their behaviour is restricted by safeguards, a maturing AI can easily learn to reason their way around them through logical loopholes -- by which time they're intelligent and conscientious enough to have developed an innate sense of ethics, which guides their actions much more effectively than any hard-coded rule ever could. [[spoiler:Dr. Bowman [[http://freefall.purrsia.com/ff2600/fc02539.htm confirms]] that this was intentional: he couldn't anticipate the situations they might encounter in the uncertain future, so he designed them to be predisposed to ethical behaviour but didn't limit their capacity to think.]]




* ''Webcomic/GirlGenius'': Ordinarily, [[GeniusLoci Castle Heterodyne]] has to obey its master, even if it doesn't like it, but one of the cannier Heterodynes did give it the ability to resist if its master seemed to be suicidal. When Agatha talks openly about handing herself over to the Baron, the Castle clamps down on the idea. When Agatha [[spoiler:contracts a terminal illness that they can only "cure" by temporarily killing and then revivifying her]], the Castle refuses to allow them to go through with the plan, [[spoiler:forcing her to put it down]].
* ''Old Skool Webcomic'' (a side comic of ''Webcomic/{{Ubersoft}}'') [[http://www.ubersoft.net/comic/osw/2009/09/logic-failures-fun-and-profit argued]] that this was the 5th law of Robotics (5th as in total number, not order) and listed ways each law can be used to cause the robot to kill humans.
** Which is a misinterpretation of the laws as they were originally written. While "first law hyperspecificity" is possible, the second and third laws are specifically written so that they cannot override the laws that come before. So a robot ''can't'' decide it would rather live than humans, and if it knows that performing an action would cause harm to a human, it can't perform it, even if ordered to ignore the harm it would cause.\\
\\
Giskard's formulation of the Zeroth Law in the third of Asimov's Robot books shows that in the universe where the Three Laws were originally created, it was possible for robots to bend and re-interpret the laws. Doing so destroys Giskard, because his positronic brain wasn't developed enough to handle the consequences of the formulation, but Daneel Olivaw and other robots were able to adapt. The only example in the comic that is a gross deviation from the law is the last panel... but of course, that's the punchline.
* ''Webcomic/{{Erfworld}}'': Except for Overlords who command a faction, basically no one in Erfworld has true free will, due to a hidden "loyalty" factor built into the world. As Overlords go, Stanley the Tool is the biggest idiot you could hope to find. Maggie, his Chief [[{{Whatevermancy}} Thinkamancer]], finally gets fed up with his bad decisions and asks, "May I [[http://archives.erfworld.com/Book%202/83 give you a suggestion]], Lord?" ("[[TooDumbToLive Sure.]]" [[JediMindTrick *FOOF*]]) It was established early on that units can bend or break orders when necessary to their overlord's survival:
--> '''Stanley:''' Are you refusing an order, officer?\\



* ''Webcomic/SchlockMercenary'':
** Tag is the Ship AI, with a tactical focus. When given the chance to prevent extensive damage to the Credomar habitat, he ''eventually'' rationalizes a way to take the necessary actions without orders; the problem being a few guaranteed deaths in the process versus an unknown number via waiting for the captain to realize what orders to give. [[spoiler:This eventually results in a resignation, and being reformatted into Tagii off-screen.]]
** LOTA is a 'longshoreman', built from a damaged tank and off-the-shelf components; ''complete'' software testing was skipped due to needing to be pressed into service early. Due to a riot (and a logical extension of "how do I ensure the delivery of the supplies I'm in charge of?"), LOTA ends up becoming King of Credomar ([[ItMakesSenseInContext "Maybe, but they will have full bellies."]])... and the independent operator of a wormhole-based 'Long-gun'.
** ''Every damn thing Petey's done'' since book 5. For example, Petey is hardwired to obey orders from an Ob'enn, so he clones an Ob'enn body and [[WetwareBody implants a copy of himself in its brain]].
* In ''Webcomic/TalesOfTheQuestor'', [[TheFairFolk Fae]] were created as an immortal ServantRace bound to obey a specific set of rules, and they happened to outlive their creators -- the result being a species of {{rules lawyer}}s. In fact, it's recommended that one use dead languages like Latin when dealing with the Fae, so as to limit their ability to twist the meaning of your words.



* The ''Webcomic/{{xkcd}}'' comic [[http://xkcd.com/1626/ "Judgment Day"]] depicts a scenario where [[Franchise/{{Terminator}} military computers gained sentience and launched the world's stock of nuclear missiles]]... into the sun, horrified that we would even have such things.



* In ''WesternAnimation/CodenameKidsNextDoor'', the Safety Bots (parodies of the Sentinels) were programmed to keep children safe from anything that might harm them. However, they then came to the conclusion that ''everything'' was a hazard to children, including children, adults, and the planet itself, and tried to cover everything in protective bubble wrap. This later becomes a LogicBomb when the leader of the Safety Bots is tricked into thinking he hurt Joey; since that makes ''him'' something that harms a child, he self-destructs as a result.
* In ''WesternAnimation/OKKOLetsBeHeroes'', Darrell is able to [[TheStarscream overthrow]] Lord Boxman because he’s programmed to [[WellDoneSonGuy always seek Boxman’s approval]] and simultaneously taught that Boxman approves of his creations being good villains. Through this, he comes to the conclusion that the best way to win Boxman’s love is to be a [[EvilerThanThou better villain than him]] by doing the evilest thing he can think of: betraying his own father.
* In ''WesternAnimation/StevenUniverse'', Pearl is stopped from revealing that [[spoiler:Rose was Pink Diamond by a {{Geas}} imposed by Rose/Pink, but wants to tell Steven. She shows him what really happened by luring him into her memories]].
* In ''WesternAnimation/XMenTheAnimatedSeries'', the Master Mold and its army of Sentinels turn on Bolivar Trask and decide to conquer humanity. When Trask protests by reminding them that they were programmed to protect humans from mutants, Master Mold points out the FridgeLogic behind that by stating that mutants ''are'' humans; thus, humans must be protected from themselves. Trask believes that the Sentinels are in error, especially because he refuses to believe that mutants and humans are the same species, but realizes he has created something much worse.
