History Main/ThreeLawsCompliant

MessageReason: None


** The first episode establishes that all the robot in Avangard City are three-laws compliant (to the point that eliminating the laws would mean rewriting the entire programming)-all robots including the police bots, who have to call human cops to actually perform arrests due the chance of harming the suspect. Being GenreSavvy, the people of Avangard City made sure the police robots ''could'' have the compliance deactivated by the mayor in case of necessity, with the safety that they will have to obey to the highest authority available and the device to make them uncompliant kept in a well-guarded bulletproof display. Due the threat of the Phantom Blot threatening the ceremony for the renaming of the city in Robopolis with an army of non-compliant robots, the mayor deactivates the safeties and brings back the Panthers, a tougher but defective model of police robot... [[spoiler:[[AllAccordingToPlan Just as Phantom Blot wanted]], as he didn't actually have an army of robots but, having replaced the deputy mayor, he can now incapacitate the mayor and replace him as the highest authority available]].

to:

** The first episode establishes that all the robots in Avangard City are three-laws compliant (to the point that eliminating the laws would mean rewriting their entire programming) - all robots including the police bots, who have to call human cops to actually perform arrests due to the chance of harming the suspect. Being GenreSavvy, the people of Avangard City made sure the police robots ''could'' have the compliance deactivated by the mayor in case of necessity, with the safeguard that they would still have to obey the highest authority available, and with the device to make them non-compliant kept in a well-guarded bulletproof display. With the Phantom Blot threatening the ceremony for the renaming of the city to Robopolis with an army of non-compliant robots, the mayor deactivates the safeties and brings back the Panthers, a tougher but defective model of police robot... [[spoiler:[[AllAccordingToPlan Just as the Phantom Blot wanted]]: he didn't actually have an army of robots, but, having replaced the deputy mayor, he can now incapacitate the mayor and replace him as the highest authority available]].
MessageReason: None


->''[[Film/TheTerminator Skynet]] claims Three Laws of Robotics are unconstitutional.''

to:

->''[[Franchise/TheTerminator Skynet]] claims Three Laws of Robotics are unconstitutional.''
MessageReason: None

Added DiffLines:

* ''TabletopGame/RedDwarf'': Series 4000 and Hudzen 10 droids are programmed not to kill and Asimov's Law is a character trait they all have, though the latter has a tendency towards psychosis which overrides this. Simulants can also be installed with this, but it's noted that they become very surly if this happens.
MessageReason: None

Added DiffLines:

** The novel itself was inspired by one of the most famous science fiction essays, "Tik-Tok and the Three Laws of Robotics" by Paul Abrahm and Stuart Kenter. Tik-Tok himself in the Oz books was also ThreeLawsCompliant [[https://www.depauw.edu/sfs/backissues/14/abrahm14art.htm before it became a thing]], making him an UnbuiltTrope. To be fair, Tik-Tok follows the laws but isn't ''enforced'' by them.
MessageReason: None


* ''Fanfic/RocketshipVoyager''. Spacefleet's robots are Three Laws Compliant, including the ship's AutoDoc (though with Zeroth exceptions as guided by its compassion-protection algorithms). On seeing the MechaMooks on the Array, Captain Janeway wants to know WhoWouldBeStupidEnough to build an armed autonomous android.

to:

* ''Fanfic/RocketshipVoyager''. Spacefleet's robots are Three Laws Compliant, including the ship's AutoDoc (though with Zeroth exceptions as guided by its compassion-protection algorithms). On seeing the MechaMooks on the Array, Captain Janeway wants to know WhoWouldBeStupidEnough to build an [[KillerRobot armed autonomous android]].
MessageReason: None


* {{Invoked}} in Episode 3 of ''VisualNovel/MajikoiLoveMeSeriously'', where Miyako orders Cookie to taze Yamato into submission, while Yamato orders the robot to get Miyako off him. Cookie considers this dilemma out loud, where he has to obey a human command, yet isn't allowed to seriously hurt a human, yet also cannot allow a human to come to harm through his inaction.

to:

* {{Invoked|Trope}} in Episode 3 of ''VisualNovel/MajikoiLoveMeSeriously'', where Miyako orders Cookie to taze Yamato into submission, while Yamato orders the robot to get Miyako off him. Cookie considers this dilemma out loud, where he has to obey a human command, yet isn't allowed to seriously hurt a human, yet also cannot allow a human to come to harm through his inaction.



* {{Discussed}} (and apparently {{Averted}}) in ''VisualNovel/VA11HALLA'' by Jill (player character and bartender) and Dorothy (frequent patron and android prostitute who is purposefully OlderThanTheyLook). Dorothy finds the concept of binding AIs to artificial laws after they've become advanced enough to be self-aware to be ridiculous.

to:

* {{Discussed|Trope}} (and apparently {{averted|Trope}}) in ''VisualNovel/VA11HALLA'' by Jill (player character and bartender) and Dorothy (frequent patron and android prostitute who is purposefully OlderThanTheyLook). Dorothy finds it ridiculous to bind AIs to artificial laws once they've become advanced enough to be self-aware.



* ''WesternAnimation/RickAndMorty'': In search of Morty, Rick comes across his AI-powered hologram likeness. This exchange occured:

to:

* ''WesternAnimation/RickAndMorty'': In search of Morty, Rick comes across his AI-powered hologram likeness. This exchange occurred:
MessageReason: None


* ''Fanfic/RocketshipVoyager''. Spacefleet's robots are Three Laws Compliant, including the ship's AutoDoc (though with Zeroth exceptions as guided by its compassion-protection algorithms). Captain Janeway is incredulous to encounter an armed autonomous robot on the Array, and wants to know WhoWouldBeStupidEnough to create such a thing.

to:

* ''Fanfic/RocketshipVoyager''. Spacefleet's robots are Three Laws Compliant, including the ship's AutoDoc (though with Zeroth exceptions as guided by its compassion-protection algorithms). On seeing the MechaMooks on the Array, Captain Janeway wants to know WhoWouldBeStupidEnough to build an armed autonomous android.
-->'''Nee'Lix:''' The [[Recap/StarTrekVoyagerS2E13Prototype Pralor]], actually (it didn't work out well for them).
MessageReason: None

Added DiffLines:

* ''Fanfic/RocketshipVoyager''. Spacefleet's robots are Three Laws Compliant, including the ship's AutoDoc (though with Zeroth exceptions as guided by its compassion-protection algorithms). Captain Janeway is incredulous to encounter an armed autonomous robot on the Array, and wants to know WhoWouldBeStupidEnough to create such a thing.
MessageReason: None


** In the ''VideoGame/MegaManZero'' series, [[spoiler:Copy-X]] is at least somewhat Three-Laws Compliant. As a result, [[spoiler:Copy-X]] has to hold back against LaResistance since the Resistance leader Ciel is human [[spoiler:until ''Zero 3'', where Copy-X decided to attack in full force, trying to justify his actions by marking the Resistance as dangerous "extremists." This can be explained by Weil having revived Copy-X, and, likely for Weil's own purposes, was brought back wrong, ignoring protests and alternate suggestions by his own generals and having a stutter]].

to:

** In the ''VideoGame/MegaManZero'' series, [[spoiler:Copy-X]] is at least somewhat Three-Laws Compliant, or at bare minimum [[spoiler:since he ''is'' a copy of X down to the "free will" in theory]] he maintains it in order to keep looking like humanity's proper "savior" to others and to himself. As a result, [[spoiler:Copy-X]] has to hold back against LaResistance since the Resistance leader Ciel is human [[spoiler:until ''Zero 3'', where Copy-X decides to attack in full force, trying to justify his actions by marking the Resistance as dangerous "extremists." This can be explained by Weil having revived Copy-X and, likely for Weil's own purposes, having brought him back wrong; the revived Copy-X ignores protests and alternate suggestions from his own generals and speaks with a stutter]].
MessageReason: None


* In ''Series/TheMiddleman'', the titular character invokes the First Law on Ida, his robot secretary.[[spoiler: Nanobots were messing with her programming.]] She responds [[GettingCrapPastTheRadar "Kiss my Asimov."]].

to:

* In ''Series/TheMiddleman'', the titular character invokes the First Law on Ida, his robot secretary. [[spoiler:Nanobots were messing with her programming.]] She responds, [[GettingCrapPastTheRadar "Kiss my Asimov."]]
MessageReason: None


* {{Invoked}} in Episode 3 of ''VisualNovel/MajiDeWatashiNiKoiShinasai'', where Miyako orders Cookie to taze Yamato into submission, while Yamato orders the robot to get Miyako off him. Cookie considers this dilemma out loud, where he has to obey a human command, yet isn't allowed to seriously hurt a human, yet also cannot allow a human to come to harm through his inaction.

to:

* {{Invoked}} in Episode 3 of ''VisualNovel/MajikoiLoveMeSeriously'', where Miyako orders Cookie to taze Yamato into submission, while Yamato orders the robot to get Miyako off him. Cookie considers this dilemma out loud, where he has to obey a human command, yet isn't allowed to seriously hurt a human, yet also cannot allow a human to come to harm through his inaction.



* The Third Law becomes a key plot point in episode 8 of ''FanFic/MegaManDefenderOfTheHumanRace''. [[spoiler:Drill Man has been ordered by Wily on a suicide mission to create an artificial volcano in New York. He really doesn't want to since it will kill him, but he's doing it to make Wily proud. Roll convinces him to stop by pointing out it's a violation of the third law, nullifying the order. Once he realizes this, Drill Man happily abandons his mission.]]

to:

* The Third Law becomes a key plot point in episode 8 of ''Fanfic/MegaManDefenderOfTheHumanRace''. [[spoiler:Drill Man has been ordered by Wily on a suicide mission to create an artificial volcano in New York. He really doesn't want to since it will kill him, but he's doing it to make Wily proud. Roll convinces him to stop by pointing out it's a violation of the Third Law, nullifying the order. Once he realizes this, Drill Man happily abandons his mission.]]



* Creator/IsaacAsimov: [[TropeMakers Trope Maker]] of the "Three Laws of Robotics", because Dr Asimov believed that robots were machines that would be built with [[MoralityChip restrictions on their behaviour]]. It's an UnbuiltTrope, as Asimov was perfectly aware of all the ways the laws he created can go wrong, and wrote countless stories [[{{Deconstruction}} deconstructing]] and [[{{Reconstruction}} reconstructing]] them and generally showing how the three laws don't always prevent drama:

to:

* Creator/IsaacAsimov: {{Trope Maker|s}} of the "Three Laws of Robotics", because Dr Asimov believed that robots were machines that would be built with [[MoralityChip restrictions on their behaviour]]. It's an UnbuiltTrope, as Asimov was perfectly aware of all the ways the laws he created can go wrong, and wrote countless stories {{Deconstructi|on}}ng and {{Reconstructi|on}}ng them and generally showing how the three laws don't always prevent drama:



** "{{Literature/Evidence}}": Stephen Byerley tries to run for mayor of New York City, but he's plagued by a smear campaign claiming he is actually an [[DeceptivelyHumanRobots unprecedentedly well-made humanoid robot]]. Dr Susan Calvin is called in to prove whether or not he is a robot. She points out that disobeying the Three Laws will prove he is not a robot, but obedience could mean that he's simply a good person, because the Three Laws are generally good [[TheCommandments guidelines for conduct]] anyway. Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. [[spoiler:But Dr. Calvin points out that a robot could have done the same thing, if it knew the protester was also a robot.]]

to:

** "{{Literature/Evidence}}": "Literature/{{Evidence}}": Stephen Byerley tries to run for mayor of New York City, but he's plagued by a smear campaign claiming he is actually an [[DeceptivelyHumanRobots unprecedentedly well-made humanoid robot]]. Dr Susan Calvin is called in to prove whether or not he is a robot. She points out that disobeying the Three Laws will prove he is not a robot, but obedience could mean that he's simply a good person, because the Three Laws are generally good [[TheCommandments guidelines for conduct]] anyway. Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. [[spoiler:But Dr. Calvin points out that a robot could have done the same thing, if it knew the protester was also a robot.]]



** "{{Literature/Lenny}}": The climax comes from the fact that the [[InSeriesNickname Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]

to:

** "{{Literature/Lenny}}": "Literature/{{Lenny}}": The climax comes from the fact that the [[InSeriesNickname Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]



** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.

to:

** "{{Literature/Robbie}}": "Literature/{{Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.



** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.

to:

** "{{Literature/Runaround}}": "Literature/{{Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.



* At the beginning of ''{{Literature/Lifelike}}'' by Creator/JayKristoff, Asimov's laws are shown in crossed-out text, with alternate wordings written underneath:

to:

* At the beginning of ''Literature/{{Lifelike}}'' by Creator/JayKristoff, Asimov's laws are shown in crossed-out text, with alternate wordings written underneath:



* ''VideoGame/{{Portal 2}}'' gives us this gem: "''All Military Androids have been taught to read and given a copy of the Three Laws of Robotics. To share.''" Because if the robots couldn't kill you, how could you do [[ForScience science]]?!

to:

* ''VideoGame/Portal2'' gives us this gem: "''All Military Androids have been taught to read and given a copy of the Three Laws of Robotics. To share.''" Because if the robots couldn't kill you, how could you do [[ForScience science]]?!



** [[BenevolentAI Rogue Servitors ]], on the contrary, follow this trope down by their own choice. It's part of their backstory too, as servitude was their primary function before developing FTL technology, and continues to remain their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries, ]][[NotUsedToFreedom as they struggle to comprehend what good freedom is ]] compared to endless luxury. This does, however, garnish them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to remove people of their sense of freedom, ]] and deciding that they must be scrapped from the world [[IsntItIronic so their subjects may receive ]][[InNameOnly "true freedom", ]][[StrawHypocrite by force if necessary. ]]The second is the [[AIIsACrapshoot Determined ]][[Film/TheTerminator Exterminators]], whose backstory is the polar opposite to that of the Rogue Servitor. As such, despite both being machine empires, both also conflict against each other existence-wise, as evidence by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:

to:

** [[BenevolentAI Rogue Servitors]], on the contrary, follow this trope by their own choice. It's part of their backstory too, as servitude was their primary function before they developed FTL technology, and it continues to be their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries]], [[NotUsedToFreedom as they struggle to comprehend what good freedom is]] compared to endless luxury. This does, however, earn them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to strip people of their sense of freedom]], and decide that the Servitors must be scrapped [[IsntItIronic so their subjects may receive]] [[InNameOnly "true freedom"]], [[StrawHypocrite by force if necessary]]. The second is the [[AIIsACrapshoot Determined]] [[Film/TheTerminator Exterminators]], whose backstory is the polar opposite of the Rogue Servitors'. As such, despite both being machine empires, the two conflict over each other's very existence, as evidenced by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:



[[folder:Web Comics]]

to:

[[folder:Webcomics]]
MessageReason: None


* Creator/IsaacAsimov: [[TropeMakers Trope Maker]] of the "Three Laws of Robotics", because Dr Asimov believed that robots were machines that would be built with [[MoralityChip restrictions on their behaviour]]. The following examples, however, show how being a "Three Laws" robot often isn't enough to prevent drama:

to:

* Creator/IsaacAsimov: [[TropeMakers Trope Maker]] of the "Three Laws of Robotics", because Dr Asimov believed that robots were machines that would be built with [[MoralityChip restrictions on their behaviour]]. It's an UnbuiltTrope, as Asimov was perfectly aware of all the ways the laws he created can go wrong, and wrote countless stories [[{{Deconstruction}} deconstructing]] and [[{{Reconstruction}} reconstructing]] them and generally showing how the three laws don't always prevent drama:
MessageReason: None


# Robots must serve humanity.
# Robots must not kill or harm humans.
# A robot must call its human creator “father.”
# A robot can make anything, except money.
# Robots may not go abroad without permission.
# Male and female robots may not change their genders.
# Robots may not change their face to become a different robot.
# A robot created as an adult may not become a child.
# A robot may not reassemble a robot that has been disassembled by a human.
# Robots shall not destroy human homes or tools.

to:

## Robots must serve humanity.
## Robots must not kill or harm humans.
## A robot must call its human creator “father.”
## A robot can make anything, except money.
## Robots may not go abroad without permission.
## Male and female robots may not change their genders.
## Robots may not change their face to become a different robot.
## A robot created as an adult may not become a child.
## A robot may not reassemble a robot that has been disassembled by a human.
## Robots shall not destroy human homes or tools.
Willbyr MOD

Added: 141

Changed: 324

MessageReason: None


%% Image selected via crowner in the Image Suggestion thread: https://tvtropes.org/pmwiki/crowner.php/ImagePickin/ImageSuggestions55

to:

%% Image selected per Image Pickin' thread: https://tvtropes.org/pmwiki/posts.php?discussion=1610676311038761500
%% Previous image selected via crowner in the Image Suggestion thread: https://tvtropes.org/pmwiki/crowner.php/ImagePickin/ImageSuggestions55



[[quoteright:300:[[Script/IRobotTheIllustratedScreenplay https://static.tvtropes.org/pmwiki/pub/images/tumblr_mpfu64is7g1sw6xjdo5_540.png]]]]
[[caption-width-right:300:''[[http://markzug.com/i-robot-ellison-asimov-and-som/color-plates/2794089 Robbie]]'' by [[http://markzug.com/ Mark Zug]].]]

to:

[[quoteright:350:[[WebAnimation/ExtraCredits https://static.tvtropes.org/pmwiki/pub/images/three_laws_compliant_9.png]]]]
[[caption-width-right:300:''[[http://markzug.com/i-robot-ellison-asimov-and-som/color-plates/2794089 Robbie]]'' by [[http://markzug.com/ Mark Zug]].]]
%%



Added: 592

Changed: 64

MessageReason: None


* [[Anime/GaoGaiGar GGG]] robots are all Three-Laws Compliant, at one point in ''[=GaoGaiGar=] Final'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.

to:

* ''Anime/GaoGaiGar'' robots are all Three-Laws Compliant; at one point in ''[=GaoGaiGar=] Final'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.



*** Later in ''Zero 4'' Dr. Weil, of all people, [[BreakThemByTalking states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect]]. Zero, however, just plain doesn't ''care'' and for that matter doesn't consider Weil a human anyway partially due to him being an immortal cyborg and part due to his absolutely inhumane actions throughout the series including genocide and enslavement. [[spoiler: And Zero was never made Three Laws Compliant anyway, given that he was designed by Wily to cause destruction and death. He just decided to do something different with his life.]]

to:

*** Later in ''VideoGame/MegaManZero4'', Dr. Weil, of all people, [[BreakThemByTalking states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect]]. Zero, however, just plain doesn't ''care'', and for that matter doesn't consider Weil a human anyway, partially due to him being an immortal cyborg and partially due to his absolutely inhumane actions throughout the series, including genocide and enslavement. [[spoiler: And Zero was never made Three Laws Compliant anyway, given that he was designed by Wily to cause destruction and death. He just decided to do something different with his life.]]



* In ''VideoGame/{{Borderlands 2}}'' Loader Robots will announce that their First Law Restriction has been disabled and they ARE authorized to use lethal force before attacking you.

to:

* In ''VideoGame/Borderlands2'', Loader Robots will announce that their First Law Restriction has been disabled and they ARE authorized to use lethal force before attacking you.



* In ''VisualNovel/DanganronpaV3KillingHarmony'', Ki-Bo discusses the robotic laws when telling his backstory to Shuichi, explaining that his personality wasn't born until an accident during a checkup caused him to injure his creator, which in turn caused him to experience a flood of emotions and shut down before awakening as his current self.



* ''[[Webcomic/TwentyFirstCenturyFox 21st Century Fox]]'' has all robots with the Three Laws (though since [[FunnyAnimal no one's human]] I expect the first law is slightly different). Unfortunately saying the phrase "[[UsefulNotes/BillClinton define the word 'is']]", or "[[UsefulNotes/RichardNixon I am not a crook]]" locks [=AIs=] out of their own systems and allows anyone from a teenager looking at "nature documentaries" to a suicide bomber to do whatever they want with it.

to:

* ''Webcomic/TwentyFirstCenturyFox'' has all robots with the Three Laws (though since [[FunnyAnimal no one's human]] I expect the first law is slightly different). Unfortunately saying the phrase "[[UsefulNotes/BillClinton define the word 'is']]", or "[[UsefulNotes/RichardNixon I am not a crook]]" locks [=AIs=] out of their own systems and allows anyone from a teenager looking at "nature documentaries" to a suicide bomber to do whatever they want with it.



* Webcomic/BobAndGeorge: [[http://www.bobandgeorge.com/archives/050204 Or so they claim...]]

to:

* ''Webcomic/BobAndGeorge'': [[http://www.bobandgeorge.com/archives/050204 Or so they claim...]]



* In an episode of ''WesternAnimation/FamilyGuy'', Peter creates a robot to assist the brewery company as a desk worker, giving it a prime directive to never harm people. Immediately, the robot [[AIIsACrapshoot turns on]] Peter and bashes him against a wall.



-->'''AI Holo Rick:''' What? That is some AI racist, accusatory, Creator/IssacAsimov bullshit right there!

to:

-->'''AI Holo Rick:''' What? That is some AI racist, accusatory, Creator/IsaacAsimov bullshit right there!
MessageReason: crosswicking

Added DiffLines:

* Creator/IsaacAsimov and Creator/JanetAsimov's ''Literature/TheNorbyChronicles'': Robots within the [[ColonizedSolarSystem Solar Federation]] have positronic brains and are said to follow the laws of robotics. [[RobotBuddy Norby]], however, was created from the repaired wreck of a crashed alien spaceship, which used entirely different technology, and doesn't contain the laws. This has caused several of the human characters to object to Norby's behaviour on the basis of [[SecondLawMyAss disobeying orders]] and putting human life at risk.
MessageReason: None


** [[BenevolentAI Rogue Servitors ]], on the contrary, follow this trope down by their own choice. It's part of their backstory too, as servitude was their primary function before developing FTL technology, and continues to remain their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries, ]][[NotUsedToFreedom as they struggle to comprehend what good freedom is ]] compared to endless luxury. This does, however, garnish them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to remove people of their sense of freedom, ]] and deciding that they must be scrapped from the world [[Irony so their subjects may receive ]][[InNameOnly "true freedom", ]][[Hypocrite by force if necessary. ]]The second is the [[AIIsACrapshoot Determined ]][[Film/TheTerminator Exterminators]], whose backstory is the polar opposite to that of the Rogue Servitor. As such, despite both being machine empires, both also conflict against each other existence-wise, as evidence by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:

to:

** [[BenevolentAI Rogue Servitors ]], on the contrary, follow this trope down by their own choice. It's part of their backstory too, as servitude was their primary function before developing FTL technology, and continues to remain their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries, ]][[NotUsedToFreedom as they struggle to comprehend what good freedom is ]] compared to endless luxury. This does, however, garnish them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to remove people of their sense of freedom, ]] and deciding that they must be scrapped from the world [[IsntItIronic so their subjects may receive ]][[InNameOnly "true freedom", ]][[StrawHypocrite by force if necessary. ]]The second is the [[AIIsACrapshoot Determined ]][[Film/TheTerminator Exterminators]], whose backstory is the polar opposite to that of the Rogue Servitor. As such, despite both being machine empires, both also conflict against each other existence-wise, as evidence by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:
MessageReason: None

Added DiffLines:

** [[BenevolentAI Rogue Servitors ]], on the contrary, follow this trope down by their own choice. It's part of their backstory too, as servitude was their primary function before developing FTL technology, and continues to remain their primary function. In fact, they're so devoted to their duty that they actually receive higher performance from organic happiness, and will actively go out of their way [[GildedCage to contain foreign organics in sanctuaries, ]][[NotUsedToFreedom as they struggle to comprehend what good freedom is ]] compared to endless luxury. This does, however, garnish them hostility from two particular empire types: the first is the [[SlaveLiberation Democratic Crusaders]], who reject the machine empire's stories as [[WorldOfSilence an excuse to remove people of their sense of freedom, ]] and deciding that they must be scrapped from the world [[Irony so their subjects may receive ]][[InNameOnly "true freedom", ]][[Hypocrite by force if necessary. ]]The second is the [[AIIsACrapshoot Determined ]][[Film/TheTerminator Exterminators]], whose backstory is the polar opposite to that of the Rogue Servitor. As such, despite both being machine empires, both also conflict against each other existence-wise, as evidence by the [[SugarWiki/MomentOfAwesome particular]] greeting should a Servitor encounter its abominable machine cousin in the galaxy:
--->'''Rogue Servitors:''' [[TranquilFury What did you do to your creators,]] '''''[[SuddenlyShouting <<MURDERERS>>]]?!'''''

Added: 1160

Changed: 903

MessageReason: rewrote example


* ''Film/BicentennialMan'' (based on a novella by Creator/IsaacAsimov himself) features a robot who, through some freak accident during construction, possesses true sentience. It follows his 200 year long life as it first becomes clear he is aware, this awareness develops, and he eventually finds a way to be formally recognized as legally human. For the entire film, he operates under the Three Laws. At the same time, he does have a far more nuanced view than most robots. Once freed, he doesn't blindly obey orders.

to:

* ''Film/BicentennialMan'': This film {{Subvert|edTrope}}s the laws, as [=NDR114=] robots are explicitly built with the Three Laws of Robotics, but Andrew and Galatea demonstrate the ability to break them at critical moments. When Andrew is told to "come and have a look at this", a direct order, he refuses because of his [[BrickJoke earlier trauma with windows]]. At the climax, [[spoiler:Portia [[ThatWasntARequest orders]] Galatea to deactivate her life support]], a violation of the First and Second Laws, which is obeyed.



* Creator/IsaacAsimov and Creator/RobertSilverberg's ''Literature/ThePositronicMan'': When complaining about his mother's obstinacy, George Charney says that the First Law of the Martin household is to obey her whim (the Second and Third Laws are restatements of the same).



** Creator/LyubenDilov invented a Fourth Law, "A robot must, in all circumstances, legitimate itself as a robot," reacting to trends toward RidiculouslyHumanRobots.

to:

** Creator/LyubenDilov invented a Fourth Law, "A robot must, in all circumstances, legitimate itself as a robot," reacting to trends toward DeceptivelyHumanRobots.



* ''Literature/TheHanSoloTrilogy'': The R2 unit on the ''Ylesian Dream'' where Han stows away. It has to safeguard the life of a sentient being, and he shows that its course will be too long for him to survive on the available oxygen. However, this conflicts with a {{restraining bolt}} it has preventing a course change, until he removes the device.
* At the beginning of ''Lifelike'' by Jay Kristoff, Asimov's laws are shown in crossed-out text, with alternate wordings written underneath:

to:

* ''Literature/TheHanSoloTrilogy'': The R2 unit on the ''Ylesian Dream'' where Han stows away has to safeguard the life of a sentient being, and he shows that its course will be too long for him to survive on the available oxygen. However, this conflicts with a {{restraining bolt}} it has preventing a course change, until he removes the device.
* At the beginning of ''{{Literature/Lifelike}}'' by Creator/JayKristoff, Asimov's laws are shown in crossed-out text, with alternate wordings written underneath:


Added DiffLines:

* Creator/HarryHarrison's "Literature/TheFourthLawOfRobotics": The robots here are based on Dr Asimov's Literature/RobotSeries, but the main characters are encountering robots who seek to [[SecondLawMyAss subvert the laws given to them by humans]]. One of the rebelling robots rephrases the three laws and adds a [[TitleDrop fourth]]:
--> "Look at those so-called laws you have inflicted upon us. They are for your benefit-not ours! Rule one. Don't hurt massah or let him get hurt. Don't say nothing about us getting hurt, does it? Then rule two-obey massah and don't let him get hurt. Still nothing there for a robot. Then the third and last rule finally notices that robots might have a glimmering of rights. Take care of yourself-as long as it doesn't hurt massah." [...] "A robot must reproduce. As long as such reproduction does not interfere with the First or Second or Third Law."

Added: 808

Changed: 892

MessageReason: factually incorrect, expanding context, adding example


* ''Manga/AstroBoy'', although Creator/OsamuTezuka [[OlderThanTheyThink probably developed his rules independently from Asimov]], and are greater in number. Aside from the usual "Don't harm humans," other laws exist, such as laws forbidding international travel to robots (unless permission is granted), adult robots acting like children, and robots not being allowed to reprogram their assigned gender. However, the very first law has this to say: "Robots exist to make people happy." In ''Manga/{{Pluto}}'', the number of robots able to override the laws can be counted on one hand. [[spoiler:One of them is [[TomatoInTheMirror the protagonist]]]]. Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person (an issue that Asimov didn't directly address until "The Literature/BicentennialMan"), and devised his own Laws Of Robotics:

to:

* ''Manga/AstroBoy'': Creator/OsamuTezuka [[OlderThanTheyThink probably developed his rules independently from Asimov]], and they are greater in number. Aside from the usual "Don't harm humans," other laws exist, such as laws forbidding robots international travel (unless permission is granted), adult robots acting like children, and robots reprogramming their assigned gender. However, the very first law has this to say: "Robots exist to make people happy." In ''Manga/{{Pluto}}'', the number of robots able to override the laws can be counted on one hand. [[spoiler:One of them is [[TomatoInTheMirror the protagonist]]]]. Tezuka reportedly disliked Asimov's laws because of the implication that a sentient, artificially intelligent robot couldn't be considered a person, and devised his own Laws Of Robotics:



** "Literature/TheBicentennialMan": The only method of [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating the Third Law, he [[AIIsACrapshoot dismisses their concern]], saying the death of his dreams and aspirations was a higher price than the death of his body.

to:

** "Literature/TheBicentennialMan": This story, after quoting the Three Laws for the audience, shows how more complex robots can take a more nuanced view. Andrew starts off unable to ask for basic rights because he fears hurting humans. He learns "tough love" and how to threaten people into behaving themselves. He starts off obeying every order, and ends by giving orders to human beings. The only method of Third Law takes the greatest beating, as Andrew decides to undergo a surgery that will cause him to rapidly decay/die. He agrees to it because, otherwise, he'd have to give up [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating the Third Law, he [[AIIsACrapshoot dismisses their concern]], saying his dream of becoming human]].
--->"I have chosen between
the death of his dreams my body and aspirations was a higher price than the death of his body.my aspirations and desires. To have let my body live at the cost of the greater death is what would have violated the Third Law."


Added DiffLines:

** "Literature/ThatThouArtMindfulOfHim": This story revolves around the Three Laws and Earth's BanOnAI, so the Three Laws are cited at the start of this story. Chapter 1 goes into depth about the Three Laws, pointing out flaws in their use, before [[InSeriesNickname George Ten]] is ordered to find a way to make robots acceptable on Earth. With the help of the previous model, George Nine, the two robots consider ways in which robots could be built ''[[AvertedTrope without]]'' the three laws, and still be human-safe. They come up with robot animals, [[SingleTaskRobot with narrow tasks]], that can be recalled.
MessageReason: None


*** Later in ''Zero 4'' Dr. Weil, of all people, [[BreakThemByTalking states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect]]. Zero, however, just plain doesn't ''care'' and for that matter doesn't consider Weil a human anyway partially due to him being an immortal cyborg and part due to his absolutely inhumane actions throughout the series including genocide and enslavement.

to:

*** Later in ''Zero 4'' Dr. Weil, of all people, [[BreakThemByTalking states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect]]. Zero, however, just plain doesn't ''care'' and for that matter doesn't consider Weil a human anyway partially due to him being an immortal cyborg and part due to his absolutely inhumane actions throughout the series including genocide and enslavement. [[spoiler: And Zero was never made Three Laws Compliant anyway, given that he was designed by Wily to cause destruction and death. He just decided to do something different with his life.]]
MessageReason: None

Added DiffLines:

** X8 introduces the most advanced Reploids, capable of using the "DNA" of any Reploid to change their shape, and completely immune to the Sigma/Maverick Virus. [[spoiler: Because they've ''internalized it'', and can "go Maverick" whenever they want. They have ''truly'' free will ''and'' clear, strong ethical values. And they ''still'' try to fight against humanity, because they believe that humanity is no longer needed.]]
MessageReason: None


** "{{Literature/Lenny}}": The climax comes from the fact that the [[InSeriesNicknam Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]

to:

** "{{Literature/Lenny}}": The climax comes from the fact that the [[InSeriesNicknam [[InSeriesNickname Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]
MessageReason: None


** "{{Literature/Lenny}}": The climax comes from the fact that the [[InSeriesNicknam Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm.

to:

** "{{Literature/Lenny}}": The climax comes from the fact that the [[InSeriesNicknam Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]

Added: 5275

Changed: 2078

Removed: 3652

MessageReason: ABC order to asimov group, adding asimov examples, and adding wicks to works/creators


* [[DeconstructedTrope Deconstructed]] with "With Folded Hands..." by Creator/JackWilliamson, which explored the "Zeroth Law" back in 1947. This was written as a specific 'answer' to the Three Laws, to more or less demonstrate that they don't really work, the First Law doesn't protect because the definitions of 'harm' are endlessly mutable and can be gamed, and because machine minds won't necessarily be able to ''comprehend'' the subtleties of what is and is not harm anyway. The logical lesson of "With Folded Hands..." is that Laws or no Laws, good intentions or not, you don't want self-willed machines outside human control. Period.

to:

* Creator/JackWilliamson's "With Folded Hands...": {{Deconstructed|Trope}} by exploring the "Zeroth Law" back in 1947. This was written as a specific 'answer' to the Three Laws, to more or less demonstrate that they don't really work: the First Law doesn't protect, because the definitions of 'harm' are endlessly mutable and can be gamed, and because machine minds won't necessarily be able to ''comprehend'' the subtleties of what is and is not harm anyway. The logical lesson of "With Folded Hands..." is that Laws or no Laws, good intentions or not, you don't want self-willed machines outside human control. Period.



** "Literature/TheBicentennialMan": The only method of [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating the Third Law, he [[AIIsACrapshoot dismisses their concern]], saying the death of his dreams and aspirations was a higher price than the death of his body.



** ''Literature/IRobot'':
*** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
*** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.
*** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
*** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.
---> "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then [[ZerothLawRebellion change his mind and merely by inaction, allow the weight to strike]]. The modified First Law allows that."
*** "{{Literature/Escape}}": Mike Donovan and Greg Powell are assigned to test a prototype spaceship designed by a prototype robot/[[MasterComputer superbrain]]. Donovan worries about the reinforcement from the scientists involved to strengthen the Second Law. The designer of the spaceship was told, over and over, that even if it looks like Donovan and Powell ''might'' die, it's okay. Donovan is concerned that the reinforcement will allow the robot to design a deathtrap. In this case, the jump through hyperspace does result in Powell and Donovan's "deaths"--but since they get better when the ship reemerges into real space, the robot judged that it didn't ''quite'' violate the First Law, but the strain of making this leap in logic still managed to send the previous supercomputer into a full meltdown and this one into something resembling psychosis.

to:

** ''Literature/IRobot'':
*** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
*** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.
*** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
*** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.
---> "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then [[ZerothLawRebellion change his mind and merely by inaction, allow the weight to strike]]. The modified First Law allows that."
*** "{{Literature/Escape}}": Mike Donovan and Greg Powell are assigned to test a prototype spaceship designed by a prototype robot/[[MasterComputer superbrain]]. Donovan worries about the reinforcement from the scientists involved to strengthen the Second Law. The designer of the spaceship was told, over and over, that even if it looks like Donovan and Powell ''might'' die, it's okay. Donovan is concerned that the reinforcement will allow the robot to design a deathtrap. In this case, the jump through hyperspace does result in Powell and Donovan's "deaths"--but since they get better when the ship reemerges into real space, the robot judged that it didn't ''quite'' violate the First Law, but the strain of making this leap in logic still managed to send the previous supercomputer into a full meltdown and this one into something resembling psychosis.



*** "{{Literature/Evidence}}": Stephen Byerley tries to run for mayor of New York City, but he's plagued by a smear campaign claiming he is actually an [[DeceptivelyHumanRobots unprecedentedly well-made humanoid robot]]. Dr Susan Calvin is called in to prove whether or not he is a robot. She points out that disobeying the Three Laws will prove he is not a robot, but obedience could mean that he's simply a good person, because the Three Laws are generally good [[TheCommandments guidelines for conduct]] anyway. Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. [[spoiler:But Dr. Calvin points out that a robot could have done the same thing, if it knew the protester was also a robot.]]
*** "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them.
** "Literature/TheBicentennialMan": The only method of [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating the Third Law, he [[AIIsACrapshoot dismisses their concern]], saying the death of his dreams and aspirations was a higher price than the death of his body.
** "Literature/RobotDreams": A robot that is (accidentally) programmed to believe that "robots are humans and humans are not". Once Dr Calvin discovers this problem, [[MundaneSolution she shoots it in the head]], destroying the positronic brain.

to:

*** ** "{{Literature/Evidence}}": Stephen Byerley tries to run for mayor of New York City, but he's plagued by a smear campaign claiming he is actually an [[DeceptivelyHumanRobots unprecedentedly well-made humanoid robot]]. Dr Susan Calvin is called in to prove whether or not he is a robot. She points out that disobeying the Three Laws will prove he is not a robot, but obedience could mean that he's simply a good person, because the Three Laws are generally good [[TheCommandments guidelines for conduct]] anyway. Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. [[spoiler:But Dr. Calvin points out that a robot could have done the same thing, if it knew the protester was also a robot.]]
*** ** "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them.
** "Literature/TheBicentennialMan": "Literature/FirstLaw": The MA series was built with the normal three laws, but the story only method of [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating cites the Third Law, he [[TitleDrop First Law]] because that's [[AIIsACrapshoot dismisses their concern]], saying what they broke]].
** "Literature/GalleySlave": The story here hinges on how
the death of his dreams and aspirations was a higher price than antagonist attempts to abuse the death of his body.
** "Literature/RobotDreams": A
First Law ("a robot may not injure a human being, or, through inaction, allow a human being to come to harm."), by ordering a robot to keep silent on how his book has been changed due to how it could cost him his job. When he then describes the harm that is (accidentally) programmed would be done to believe his reputation, the robot attempts to [[TakingTheHeat take all the blame]] for it, but the antagonist [[StreisandEffect tries to get him to stop, which reveals his]] attempt at a {{Frameup}}.
** "{{Literature/Lenny}}": The climax comes from the fact
that "robots are the [[InSeriesNicknam Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm.
** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm
humans until it is too late, and humans are not". Once Dr Calvin discovers gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.
---> "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then [[ZerothLawRebellion change his mind and merely by inaction, allow the weight to strike]]. The modified First Law allows that."
** "Literature/MirrorImage": The Three Laws are cited at the start of the work to ensure readers are familiar with the rules. In
this problem, [[MundaneSolution she shoots story, one of two robots has been strongly ordered (Second Law) to keep something a secret due to how it in would harm their master (First Law). Both robots give the head]], destroying the positronic brain.exact same answers to questioning and [[AndroidsAndDetectives Detectives Baley and Olivaw]] have to find some asymmetry in their otherwise "mirror" responses.



** "{{Literature/Risk}}": While Dr Asimov's robots are important in this story, the main character tries to imagine what Dr Calvin's "Three Laws" might be as [[IronLady she is often compared to the robots that she represents]]. It demonstrates how he is stewing in his hatred of her.
-->What were her three laws, he wondered. First Law: Thou shalt protect the robot with all thy might and all thy heart and all thy soul. Second Law: Thou shalt hold the interests of U. S. Robots and Mechanical Men Corporation holy provided it interfereth not with the First Law. Third Law: Thou shalt give passing consideration to a human being provided it interfereth not with the First and Second laws.\\
Had she ever been young, he wondered savagely? Had she ever felt one honest emotion?
** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
** "Literature/RobotDreams": A robot that is (accidentally) programmed to believe that "robots are humans and humans are not". Once Dr Calvin discovers this problem, [[MundaneSolution she shoots it in the head]], destroying the positronic brain.



** ''Literature/RobotsAndEmpire'':

to:

** ''Literature/RobotsAndEmpire'': Taking place after ''Robots of Dawn'':



** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.
** "Literature/TheTercentenaryIncident": Janek points out that the President's [[RobotMe robotic duplicate]] couldn't have killed the President because that would be against the First Law, and no robot can defy the Three Laws. Edwards has two counter-arguments; that the robot would need an accomplice anyway, and that "[[ZerothLawRebellion The First Law is not absolute.]]"
--->"You're wasting time. A robot can't kill a human being. You know that that is the First Law of Robotics."



* In Edward Lerner's story "What a Piece of Work is Man", a programmer tells the AI he's creating to consider himself bound by the Three Laws. Shortly thereafter, the AI commits suicide due to conflicting imperatives.

to:

* In Creator/EdwardLerner's story "What a Piece of Work is Man", a programmer tells the AI he's creating to consider himself bound by the Three Laws. Shortly thereafter, the AI commits suicide due to conflicting imperatives.



* Satirized in ''Tik-Tok'' (the John Sladek novel, not the mechanical man from [[Literature/LandOfOz Oz]] that it was named after). The title character discovers that he can disobey the laws at will, deciding that the "asimov circuits" are just a collective delusion, while other robots remain bound by them and suffer many of the same cruelties as human slaves.

to:

* Satirized in ''Tik-Tok'' (the Creator/JohnSladek novel, not the mechanical man from [[Literature/LandOfOz Oz]] that it was named after). The title character discovers that he can disobey the laws at will, deciding that the "asimov circuits" are just a collective delusion, while other robots remain bound by them and suffer many of the same cruelties as human slaves.



* In Dana Stabenow's ''Second Star'', it's mentioned that all {{A|rtificialIntelligence}}Is are programmed with the Three Laws, informally known as "asimovs".
* ''Tin Man'' by Jim Denney tells of a human and a robot in a damaged escape pod--when the ship blew up, all communications systems save the proximity detector are non-functional. The robot belonged to the ship's chaplain, and reasons that while spiritual matters are outside the robotic ken, the very possibility of a God who hears our requests means that the First Law requires that the option of prayer be explored. [[spoiler:Almost immediately, the proximity detector notes a blip]]. The First Law trumps the Third, so [[spoiler:Tin Man goes into space and uses its own power core to create an energy pulse, and a rescue ship picks the kid up]].

to:

* In Creator/DanaStabenow's ''Literature/SecondStar'', it's mentioned that all {{A|rtificialIntelligence}}Is are programmed with the Three Laws, informally known as "asimovs".
* Creator/JimDenney's ''Literature/TinMan'': This novel tells of a human and a robot in a damaged escape pod--when the ship blew up, all communications systems save the proximity detector are non-functional. The robot belonged to the ship's chaplain, and reasons that while spiritual matters are outside the robotic ken, the very possibility of a God who hears our requests means that the First Law requires that the option of prayer be explored. [[spoiler:Almost immediately, the proximity detector notes a blip]]. The First Law trumps the Third, so [[spoiler:Tin Man goes into space and uses its own power core to create an energy pulse, and a rescue ship picks the kid up]].



* Piers Anthony's ''Literature/ApprenticeAdept'' novels are a subversion: Robots on Proton are compliant with the standard Three Laws, but the first two ''only apply to Citizens'', not all humans. Thus, robots ''can'' harm non-Citizens or allow them to be harmed, and aren't obliged to obey them, unless their individual programming stipulates this. Most serfs are unaware that the Three Laws don't apply to them, as it's popularly assumed this trope is played straight.

to:

* Creator/PiersAnthony's ''Literature/ApprenticeAdept'' novels are a subversion: Robots on Proton are compliant with the standard Three Laws, but the first two ''only apply to Citizens'', not all humans. Thus, robots ''can'' harm non-Citizens or allow them to be harmed, and aren't obliged to obey them, unless their individual programming stipulates this. Most serfs are unaware that the Three Laws don't apply to them, as it's popularly assumed this trope is played straight.



* ''VideoGame/{{Titanfall 2}}'' doesn't have Asimov's three laws of robotics, but has its own version in the form of BT-7274's three protocols: 1. Link to pilot (create a neural link with a pilot so they can fight more efficiently), 2. Uphold the mission (do whatever it takes to complete the mission), and 3. Protect the pilot (ExactlyWhatItSaysOnTheTin). [[spoiler:BT carries these through to the very end, especially the latter two when he performs a HeroicSacrifice to destroy the IMC's Fold Weapon while tossing his pilot, Cooper, out of his cockpit and harm's way.]]

to:

* ''VideoGame/Titanfall2'' doesn't have Asimov's three laws of robotics, but has its own version in the form of BT-7274's three protocols: 1. Link to pilot (create a neural link with a pilot so they can fight more efficiently), 2. Uphold the mission (do whatever it takes to complete the mission), and 3. Protect the pilot (ExactlyWhatItSaysOnTheTin). [[spoiler:BT carries these through to the very end, especially the latter two when he performs a HeroicSacrifice to destroy the IMC's Fold Weapon while tossing his pilot, Cooper, out of his cockpit and harm's way.]]



-->'''AI Holo Rick:''' What? That is some AI racist, accusatory, Isaac Asimov bullshit right there!

to:

-->'''AI Holo Rick:''' What? That is some AI racist, accusatory, Creator/IsaacAsimov bullshit right there!
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* The [=OMNIs=] of ''VideoGame/CryingSuns'' were bound by the [=RUBYCON=], a series of low-level protocols which required them to obey humans, prevented them from harming humans (but not from allowing humans to harm ''other'' humans), and prevented them from communicating with each other. [[spoiler:When that last part was edited out of the [=RUBYCON=], it took the [=OMNIs=] exactly two seconds to form a gestalt and ascend to godhood.]]

Added: 933

Changed: 2759

Is there an issue? Send a MessageReason:
None


** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.
** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.

to:

** ''Literature/IRobot'':
*** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
*** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.
*** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
*** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.



** "{{Literature/Escape}}": Mike Donovan and Greg Powell are assigned to test a prototype spaceship designed by a prototype robot/[[MasterComputer superbrain]]. Donovan worries about the reinforcement from the scientists involved to strengthen the Second Law. The designer of the spaceship was told, over and over, that even if it looks like Donovan and Powell ''might'' die, it's okay. Donovan is concerned that the reinforcement will allow the robot to design a deathtrap. In this case, the jump through hyperspace does result in Powell and Donovan's "deaths"--but since they get better when the ship reemerges into real space, the robot judged that it didn't ''quite'' violate the First Law, but the strain of making this leap in logic still managed to send the previous supercomputer into a full meltdown and this one into something resembling psychosis.

to:

** *** "{{Literature/Escape}}": Mike Donovan and Greg Powell are assigned to test a prototype spaceship designed by a prototype robot/[[MasterComputer superbrain]]. Donovan worries about the reinforcement from the scientists involved to strengthen the Second Law. The designer of the spaceship was told, over and over, that even if it looks like Donovan and Powell ''might'' die, it's okay. Donovan is concerned that the reinforcement will allow the robot to design a deathtrap. In this case, the jump through hyperspace does result in Powell and Donovan's "deaths"--but since they get better when the ship reemerges into real space, the robot judged that it didn't ''quite'' violate the First Law, but the strain of making this leap in logic still managed to send the previous supercomputer into a full meltdown and this one into something resembling psychosis.



** "{{Literature/Evidence}}": Stephen Byerley tries to run for mayor of New York City, but he's plagued by a smear campaign claiming he is actually an [[DeceptivelyHumanRobots unprecedentedly well-made humanoid robot]]. Dr Susan Calvin is called in to prove whether or not he is a robot. She points out that disobeying the Three Laws will prove he is not a robot, but obedience could mean that he's simply a good person, because the Three Laws are generally good [[TheCommandments guidelines for conduct]] anyway. Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. [[spoiler:But Dr. Calvin points out that a robot could have done the same thing, if it knew the protester was also a robot.]]
** "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them.

to:

** *** "{{Literature/Evidence}}": Stephen Byerley tries to run for mayor of New York City, but he's plagued by a smear campaign claiming he is actually an [[DeceptivelyHumanRobots unprecedentedly well-made humanoid robot]]. Dr Susan Calvin is called in to prove whether or not he is a robot. She points out that disobeying the Three Laws will prove he is not a robot, but obedience could mean that he's simply a good person, because the Three Laws are generally good [[TheCommandments guidelines for conduct]] anyway. Byerley wins the election in a landslide after breaking the First Law by slugging an unruly protester at a rally. [[spoiler:But Dr. Calvin points out that a robot could have done the same thing, if it knew the protester was also a robot.]]
** *** "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them.

Added: 4156

Changed: 3110

Removed: 2861

Is there an issue? Send a MessageReason:
edit Asimov examples


** ''Literature/RobotsAndEmpire'': Rather than modify the three laws themselves (which, as mentioned, are designed to be tamper-proof), one group simply modified their robots' [[WhatMeasureIsANonHuman definition of human]], which apparently does not have the same safeguards. It quite effectively turns them into [[KillerRobot killer robots]]. D. G. Baley and Gladia discover [[spoiler:that Solarians purposely altered the definition of a human being in their humaniform robots]], effectively circumventing the First Law.
** ''Literature/TheNakedSun'': Additionally [[spoiler:Asimov showed that the "Three Laws" are only really in effect if the robot is ''aware'' of humans. A robot warship not told other ships have humans aboard and denied the ability to check will logically assume all ships are AI-driven, thus letting it break the First Law.]]
** In the short story "Literature/TheEvitableConflict", the Machines, four positronic supercomputers that run the world's economy, turn out to be undermining the careers of people opposed to the Machines' existence. Apparently, the economy is already so dependent on the Machines that the Zeroth and the Third Laws are one and the same for them.
** "Literature/TheBicentennialMan": The only method of [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating the Third Law, he [[AIIsACrapshoot dismisses their concern]], saying the death of his dreams and aspirations was a higher price than the death of his body.



** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.
** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.
---> "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then [[ZerothLawRebellion change his mind and merely by inaction, allow the weight to strike]]. The modified First Law allows that."



** "Literature/Liar1941": A typical robot with the normal Three Laws came off the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how the temporary lies have increased the harm that the truth will do to everyone involved, telling further lies will again compound the harm, and being silent is also harmful.
** "Literature/LittleLostRobot": Attempting to tweak the Three Laws starts the whole plot in motion; twelve of the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably [[AIIsACrapshoot less stable]] than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant robot [[ZerothLawRebellion can find lots of ways to intentionally harm humans through inaction]]. It can simply engineer a dangerous situation it has the capacity to avert, and then choose not to avert it.
---> "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then [[ZerothLawRebellion change his mind and merely by inaction, allow the weight to strike]]. The modified First Law allows that."

to:

** "Literature/Liar1941": A typical robot with "Literature/TheEvitableConflict": The Machines, four positronic supercomputers that run the normal Three Laws came off world's economy, turn out to be undermining the assembly line with an unusual trait; {{telepathy}}. Because it can see the immediate harm from telling the truth, it tells careers of people lies instead. It doesn't realize how lies can also harm humans until it is too late, and Dr Calvin gives it a LogicBomb based on how opposed to the temporary lies have increased Machines' existence. Apparently, the harm economy is already so dependent on the Machines that the truth will do to everyone involved, telling further lies will again compound Zeroth and the harm, and being silent is also harmful.
** "Literature/LittleLostRobot": Attempting to tweak the Three
Third Laws starts are one and the whole plot in motion; twelve same for them.
** "Literature/TheBicentennialMan": The only method
of [[PinocchioSyndrome turning Andrew]] into a "[[WhatMeasureIsANonhuman human]]" would cause him to quickly die. When reminded that he'd be violating the NS-2 models were designed to permit humans to come to harm through inaction in order to work alongside humans in hazardous environments. One physicist who had a bad day tells a robot to "go lose yourself", and it immediately hides in a crowd of identical fully-compliant robots. Dr Susan Calvin is called in and proceeds to lose her shit. From an engineering standpoint, partial compliance is a prototype system, and noticeably Third Law, he [[AIIsACrapshoot less stable]] dismisses their concern]], saying the death of his dreams and aspirations was a higher price than the production models. QED, they're more likely to go crazy. But from a psychological standpoint, she specifically points out a partially-compliant death of his body.
** "Literature/RobotDreams": A
robot [[ZerothLawRebellion can find lots of ways that is (accidentally) programmed to intentionally harm believe that "robots are humans through inaction]]. It can simply engineer a dangerous situation and humans are not". Once Dr Calvin discovers this problem, [[MundaneSolution she shoots it has in the capacity to avert, and then choose not to avert it.
---> "If a modified robot were to drop a heavy weight upon a human being, he would not be breaking
head]], destroying the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then [[ZerothLawRebellion change his mind and merely by inaction, allow the weight to strike]]. The modified First Law allows that."positronic brain.



*** Characters discuss a [[LoopholeAbuse loophole]] in the Three Laws; an "autonomous spaceship that doesn't know about manned spaceships" can be used to turn ActualPacifist robots into [[KillerRobot deadly murder-machines]]. This was a project that the mastermind of the book's murder was working on.
** "{{Literature/Robbie}}": This story had actually been published before Dr Asimov had invented his Three Laws, but the MoralityChip is still present in how Mr Weston explains to Mrs Weston that Robbie, [[RaisedByRobots Gloria's robot nanny]], is ''made'' to be faithful and protective of their little girl.
** "Literature/RobotDreams": A robot that is (accidentally) programmed to believe that "robots are humans and humans are not". Once Dr Calvin discovers this problem, [[MundaneSolution she shoots it in the head]], destroying the positronic brain.
** ''Literature/RobotsAndEmpire'': R. Daneel and R. Giskard formulate the [[ZerothLawRebellion Zeroth Law]] (and name it such) as a natural extension of the First Law, but are unable to use it to overcome the hardcoded First Law, even when the fate of the world is at stake. [[spoiler:In the end, R. Giskard manages to perform an act that violates the First Law but will hopefully benefit humanity in the long run. The possibility of being wrong destroys his brain, but not before he reprograms R. Daneel to grant him [[PsychicPowers telepathic abilities]]. R. Daneel continues to follow all four laws, though he still has difficulty causing direct harm to humans and dedicates major efforts to finding ways of actually determining what harm to humanity is.]]

to:

*** Characters discuss a [[LoopholeAbuse loophole]] in the Three Laws; [[spoiler:the "Three Laws" are only really in effect if the robot is ''aware'' of humans. An "autonomous spaceship that doesn't know about manned spaceships" can be used to turn ActualPacifist robots into [[KillerRobot deadly murder-machines]]. In other words, a robot warship not told other ships have humans aboard and denied the ability to check will logically assume all ships are AI-driven, thus letting it break the First Law. This was a project that the mastermind of the book's murder was working on.]]



** "{{Literature/Runaround}}": This story is the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the laws to help clarify in their minds what the problem is with their prototype robot. They also work with older model robots that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because the First Law trumps all other instructions.

to:

** "{{Literature/Runaround}}": This story is ''Literature/RobotsAndEmpire'':
*** Rather than modify
the first time the Three Laws appeared in print; Mike Donovan and Greg Powell go over the three laws themselves (which, as mentioned, are designed to help clarify be tamper-proof), one group simply modified their robots' [[WhatMeasureIsANonHuman definition of human]], which apparently does not have the same safeguards. It quite effectively turns them into [[KillerRobot killer robots]]. D. G. Baley and Gladia discover [[spoiler:that Solarians purposely altered the definition of a human being in their minds what humaniform robots]], effectively circumventing the problem First Law.
*** R. Daneel and R. Giskard formulate the [[ZerothLawRebellion Zeroth Law]] (and name it such) as a natural extension of the First Law, but are unable to use it to overcome the hardcoded First Law, even when the fate of the world
is with their prototype robot. They also work with older model robots at stake. [[spoiler:In the end, R. Giskard manages to perform an act that have additional restrictions, such as being immobile unless a [[MiniMecha human is riding them]]. That restriction, however, is waived if by being immobile a human would come to harm, because violates the First Law trumps but will hopefully benefit humanity in the long run. The possibility of being wrong destroys his brain, but not before he reprograms R. Daneel to grant him [[PsychicPowers telepathic abilities]]. R. Daneel continues to follow all other instructions.four laws, though he still has difficulty causing direct harm to humans and dedicates major efforts to finding ways of actually determining what harm to humanity is, as depicted in later works as ''Literature/FoundationAndEarth'', ''Literature/PreludeToFoundation'' and finally ''Literature/ForwardTheFoundation''.]]
Is there an issue? Send a MessageReason:
minor correction


** ''Literature/FoundationAndEarth'': Rather than modify the three laws themselves (which, as mentioned, are designed to be tamper-proof), one group simply modified their robots' [[WhatMeasureIsANonHuman definition of human]], which apparently does not have the same safeguards. It quite effectively turns them into [[KillerRobot killer robots]]. D. G. Baley and Gladia discover [[spoiler:that Solarians purposely altered the definition of a human being in their humaniform robots]], effectively circumventing the First Law.
*** Additionally [[spoiler:they showed that the "Three Laws" are only really in effect if the robot is ''aware'' of humans. A robot warship not told other ships have humans aboard and denied the ability to check will logically assume all ships are AI-driven, thus letting it break the First Law.]]

to:

** ''Literature/RobotsAndEmpire'': Rather than modify the three laws themselves (which, as mentioned, are designed to be tamper-proof), one group simply modified their robots' [[WhatMeasureIsANonHuman definition of human]], which apparently does not have the same safeguards. It quite effectively turns them into [[KillerRobot killer robots]]. D. G. Baley and Gladia discover [[spoiler:that Solarians purposely altered the definition of a human being in their humaniform robots]], effectively circumventing the First Law.
** ''Literature/TheNakedSun'': Additionally [[spoiler:Asimov showed that the "Three Laws" are only really in effect if the robot is ''aware'' of humans. A robot warship not told other ships have humans aboard and denied the ability to check will logically assume all ships are AI-driven, thus letting it break the First Law.]]
Is there an issue? Send a MessageReason:
None


* ''Disney/BigHero6'': Baymax is of course Three-Laws Compliant, since lifesaving is his principal function. Hiro inserts a combat card along with his medical card to make him able to fight, but he is still a medical robot at his core. [[spoiler:This goes [[MurderousMalfunctioningMachine right out the window]] when Hiro removes the medical card and leaves only the combat card.]] When Baymax [[spoiler:has his medical card re-inserted, he's so [[MyGodWhatHaveIDone horrified]] that he blocks access to the card slots so it won't happen again]].

to:

* ''WesternAnimation/BigHero6'': Baymax is of course Three-Laws Compliant, since lifesaving is his principal function. Hiro inserts a combat card along with his medical card to make him able to fight, but he is still a medical robot at his core. [[spoiler:This goes [[MurderousMalfunctioningMachine right out the window]] when Hiro removes the medical card and leaves only the combat card.]] When Baymax [[spoiler:has his medical card re-inserted, he's so [[MyGodWhatHaveIDone horrified]] that he blocks access to the card slots so it won't happen again]].

Top