History: Main/ThreeLawsCompliant

Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Invoked and averted in Manga/Chobits. [[spoiler: The very reason Persocoms aren't called robots is because their creator did not want to program his daughters to obey these rules]].
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* When Sam [[http://freefall.purrsia.com/ff3900/fv03859.htm asks a robot]] to imagine how they would feel if someone were to stop them from being productive, the robot realises "To keep humans healthy, we must allow them to endanger themselves!"
Is there an issue? Send a MessageReason:
None


* In ''Series/StarTrekVoyager'', the Doctor is essentially a holographic android (albeit more irascible than Data) and has ethical routines. He actually considers more important the same "First Law" that human doctors are supposed to follow (first do no harm). When these ethical routines are deleted, Bad Things happen. See "Darkling" and "Equinox". In one episode where he deliberately poisons somebody as part of a PoisonAndCureGambit in a desperate move to save other people's lives, he is very disturbed to learn that he is capable of doing such a thing and later asks Seven to check him for malfunctions. After he explains to her what happened she confirms that he didn't malfunction.

to:

* In ''Series/StarTrekVoyager'', the Doctor is essentially a holographic android (albeit more irascible than Data) and has ethical routines. He actually considers more important the same "First Law" that human doctors are supposed to follow (first do no harm). When these ethical routines are deleted, Bad Things happen. See "Darkling" and "Equinox". In one episode where he deliberately poisons somebody as part of a PoisonAndCureGambit in a desperate attempt to save other people's lives when he had no other options, he is very disturbed to learn that he is capable of doing such a thing and later asks Seven to check him for malfunctions. After he explains to her what happened, she declares that he is functioning normally.
Is there an issue? Send a MessageReason:
None


* In ''VisualNovel/VirtuesLastReward'', the Three Laws are {{discussed|Trope}} on two different paths. Ultimately, [[spoiler:Luna is revealed to be a robot who ''tries'' to be compliant. Unfortunately, the AB plan involves some unavoidable death, so whenever anything happens that she could try to prevent, she is shut down or otherwise not allowed to do anything]]. This leads to a rather [[TearJerker heartbreaking]] line during [[spoiler:her ending, where she gets shut down for good]].

to:

* In ''VisualNovel/VirtuesLastReward'', the Three Laws are {{discussed|Trope}} on two different paths. Ultimately, [[spoiler:Luna is revealed to be a robot who is ''willingly'' Three Laws Compliant to the best of her ability, deliberately choosing to live by them as her own personal RobotReligion even though she has the free will to disregard them as she pleases. Unfortunately, the AB plan involves some unavoidable death, so whenever anything happens that she could try to prevent, Zero Jr. temporarily shuts her down to prevent her from interfering]]. This leads to a rather [[TearJerker heartbreaking]] line during [[spoiler:her ending, where she gets shut down for good]].



** On the other hand, this is also a hint to [[spoiler:Luna's true identity and the key to unlocking her ending. When going through all the routes, Luna is the only person in the entire group to ''always'' choose "Ally" instead of "Betray", in a game where betrayal has the possibility of the other person dying or being stuck forever. Once the Three Laws are brought up in one path, it's clear the reason she does so is because of the First Law, and is also the reason Sigma chooses to Ally with her for her ending]].

to:

** On the other hand, this is also a hint to [[spoiler:Luna's true identity and the key to unlocking her ending. When going through all the routes, Luna is the only person in the entire group to ''always'' choose "Ally" instead of "Betray", in a game where betrayal has the possibility of the other person dying or being stuck forever. Once the Three Laws are brought up in one path, it's clear the reason she does so is because of the First Law, and is also the reason Sigma chooses to Ally with her for her ending.]]
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* In ''TabletopGame/UrbanJungle: Astounding Science'', all commercially-built robots are unable to harm an Earthling, or anyone who looks like an Earthling. ("Human", of course, [[WorldOfFunnyAnimals doesn't really apply]]). Robots built by {{Mad Scientist}}s, however, might do ''anything'', and most mad scientists are pretty keen on having their robots kill anyone who [[TheyCalledMeMad calls them mad scientists]]. It's also possible to reprogram the commercially-built ones.
Is there an issue? Send a MessageReason:
None


Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. The First Law is essentially a variation on TheGoldenRule. The Second Law stipulates that one should be LawfulGood except in the event of a ToBeLawfulOrGood conflict.

to:

Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. The First Law is essentially a variation on TheGoldenRule. The Second Law stipulates that one should be LawfulGood except when that conflicts with the First Law, i.e. when it's a ToBeLawfulOrGood situation. The Third Law indicates that your own self-interest should be placed behind the needs of others and the rules of your society.
Is there an issue? Send a MessageReason:
None


Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. Note that the First Law is essentially TheGoldenRule.

to:

Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. The First Law is essentially a variation on TheGoldenRule. The Second Law stipulates that one should be LawfulGood except in the event of a ToBeLawfulOrGood conflict.
Is there an issue? Send a MessageReason:
None


** , Of course, their idea of protecting a human may not match the protection that the human [[http://freefall.purrsia.com/ff2400/fc02379.htm desired]].

to:

** Of course, their idea of protecting a human may not match the protection that the human [[http://freefall.purrsia.com/ff2400/fc02379.htm desired]].
Is there an issue? Send a MessageReason:
None


*** The sequel ''Making Money'' establishes that the earliest Golems, of which later ones were otherwise pale imitations, did ''not'' have these. They would simply carry out any order given (but only by certain people) without question, and with just enough independence to do it unsupervised. The other golems are creeped out by them, with the effect compared to how humans feel about undead. It's speculated in-universe that the ones they found were special military models.
** The earlier ''Feet of Clay'', which established the golems' unhappiness with their predetermined lot, culminates with a single golem being freed of his 'Three Laws', only to ''choose'' to behave morally anyway. Later books mention that others still tread carefully around him, as there's always a chance he'll reconsider given enough provocation.

to:

*** The sequel ''Literature/MakingMoney'' establishes that the earliest Golems, of which later ones were otherwise pale imitations, did ''not'' have these. They would simply carry out any order given (but only by certain people) without question, and with just enough independence to do it unsupervised. The other golems are creeped out by them, with the effect compared to how humans feel about undead. It's speculated in-universe that the ones they found were special military models.
** The earlier ''Literature/FeetOfClay'', which established the golems' unhappiness with their predetermined lot, culminates with a single golem being freed of his 'Three Laws', only to ''choose'' to behave morally anyway. Later books mention that others still tread carefully around him, as there's always a chance he'll reconsider given enough provocation.
Is there an issue? Send a MessageReason:


** "Literature/TheBicentennialMan": This story, after quoting the Three Laws for the audience, shows how more complex robots can take a more nuanced view. Andrew starts off unable to ask for basic rights because he fears hurting humans. He learns "tough love" and how to threaten people into behaving themselves. He starts off obeying every order, and ends by giving orders to human beings. The Third Law takes the greatest beating, as Andrew decides to undergo a surgery that will cause him to rapidly decay/die. He agrees to it because, otherwise, he'd have to give up [[PinocchioSyndrome his dream of becoming human]].

to:

** "Literature/TheBicentennialMan": This story, after quoting the Three Laws for the audience, shows how more complex robots can take a more nuanced view. Andrew starts off unable to ask for basic rights because he fears hurting humans. He learns "tough love" and how to threaten people into behaving themselves. He starts off obeying every order, and ends by giving orders to human beings. The Third Law takes the greatest beating, as Andrew decides to undergo a surgery that will cause him to rapidly decay/die. He agrees to it because, otherwise, he'd have to give up [[PinocchioSyndrome [[BecomeARealBoy his dream of becoming human]].
Is there an issue? Send a MessageReason:
None


* In ''Series/StarTrekVoyager'', the Doctor is essentially a holographic android (albeit more irascible than Data) and has ethical routines. He actually considers more important the same "First Law" that human doctors are supposed to follow (first do no harm). When these ethical routines are deleted, Bad Things happen. See "Darkling" and "Equinox".

to:

* In ''Series/StarTrekVoyager'', the Doctor is essentially a holographic android (albeit more irascible than Data) and has ethical routines. He actually considers more important the same "First Law" that human doctors are supposed to follow (first do no harm). When these ethical routines are deleted, Bad Things happen. See "Darkling" and "Equinox". In one episode where he deliberately poisons somebody as part of a PoisonAndCureGambit in a desperate move to save other people's lives, he is very disturbed to learn that he is capable of doing such a thing and later asks Seven to check him for malfunctions. After he explains to her what happened she confirms that he didn't malfunction.
Is there an issue? Send a MessageReason:
None


Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking.

to:

Asimov also stated that the Three Laws of Robotics were actually a guideline for ''humans'' to follow, rather than robots -- good, moral humans would naturally apply the three laws to themselves without thinking. Note that the First Law is essentially TheGoldenRule.

Added: 470

Changed: 274

Is there an issue? Send a MessageReason:
rewrite Star Wars example to remove confusing reference to "4th-degree droids" and make the different examples clearer.


* In ''Franchise/StarWars'', the droids are programmed to not harm any intelligent being, though this programming can be modified (legally or illegally) for military and assassin droids. 4th-degree droids do not have the "no-harm" programming, being military droids.

to:

* The trope seems to be zigzagged in ''Franchise/StarWars'':
** Most droids, including protocol droids, power droids, etc., are programmed to not harm any intelligent being.
** Astromech droids such as [=R2D2=] and BB-8 must have a weaker version of the First Law; as components of a starfighter, they have to be able to "allow sentient beings to come to harm".
** Military droids such as droidekas and Trade Federation battledroids can't have any version of the First Law, although they do apparently have the Second and Third laws.
Is there an issue? Send a MessageReason:
None


* Robby the Robot was also three-laws compliant in his appearance in ''Film/TheInvisibleBoy''. He starts to overheat when the boy gives him a command that Robby believes would put the boy at risk, so the boy takes him to a supercomputer to reprogram him, but it turns out that the supercomputer is evil and doesn't obey the three laws. When the supercomputer later orders Robby to kill the boy, the boy, without realizing it, reminds Robby of their friendship and Robby is able to resist the supercomputer's control and reset himself back to following the three laws. At the end of the film when the boy is about to be spanked by his father for all the trouble he caused, Robby stops him because of the First Law.

to:

* Robby the Robot was also three-laws compliant in his appearance in ''Film/TheInvisibleBoy''. He starts to overheat when the boy gives him a command that Robby believes would put the boy at risk, so the boy takes him to a supercomputer to reprogram him, but it turns out that the supercomputer is evil and doesn't obey the three laws. When the supercomputer later orders Robby to kill the boy, the boy, without realizing it, reminds Robby of their friendship and Robby is able to resist the supercomputer's control and reset himself back to following the three laws. At the end of the film when the boy is about to be spanked by his father for all the trouble he caused, Robby stops him because of the First Law. The supercomputer was not compliant with the three laws because it wasn't intended by its creator to have a will of its own.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Robby the Robot was also three-laws compliant in his appearance in ''Film/TheInvisibleBoy''. He starts to overheat when the boy gives him a command that Robby believes would put the boy at risk, so the boy takes him to a supercomputer to reprogram him, but it turns out that the supercomputer is evil and doesn't obey the three laws. When the supercomputer later orders Robby to kill the boy, the boy, without realizing it, reminds Robby of their friendship and Robby is able to resist the supercomputer's control and reset himself back to following the three laws. At the end of the film when the boy is about to be spanked by his father for all the trouble he caused, Robby stops him because of the First Law.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* In ''Fanfic/LimitlessPotential'', when Sigma openly announces his rebellion against humanity, a riot breaks out in the Maverick Hunters' HQ, split between those who don't want to follow the three laws and those who do. In the latter case, several of them prioritize protecting human lives (like Chiyo's) over their own, sometimes to the point of HeroicSacrifice.
Is there an issue? Send a MessageReason:
None


* ''Anime/GaoGaiGar'' robots are all Three-Laws Compliant, at one point in ''[=GaoGaiGar=] Final'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.

to:

* ''Anime/GaoGaiGar'' robots are all Three-Laws Compliant; at one point in ''Anime/GaoGaiGarFinal'', the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.
Is there an issue? Send a MessageReason:
None


** "Literature/{{Lenny}}": The climax comes from the fact that the [[InSeriesNickname Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]

to:

** "Literature/{{Lenny}}": The climax comes from the fact that the [[InSeriesNickname Lenny]] prototype may have actually broken the First Law. It ''did'' break a man's arm.arm, which Susan argues was actually the result of a robot that DoesNotKnowHisOwnStrength defending itself in accordance with the Third Law. [[spoiler: This is treated as an ''interesting situation'', as while the robot is clearly defective, it demonstrates an aptitude for ''learning'' (in the story, Lenny goes from functionally a bipedal, man-sized baby incapable of speech to one that can talk), and Dr. Susan Calvin successfully argues to keep Lenny alive so that she can study it and possibly make a ''truly'' evolutionary robot. The fact that she taught Lenny to call her "Mama" may have something to do with it as well.]]
Is there an issue? Send a MessageReason:
None


* ''Fanfic/Plan7Of9FromOuterSpace'': Captain Proton is threatened by a KillerRobot, who explains that it is exempt from the Three Laws under a subsection of the Robot Patriot Act. Earlier he meets a SexBot, created to be the ideal male fantasy. "No Servus droid may harm the male ego or, through omission of action, allow that ego to be harmed." Later the [[Literature/TheIslandOfDoctorMoreau Sayer of the Laws]] (a holographic Creator/IsaacAsimov) is briefing a newly constructed batch of robots on the Three Laws. When the robots [[SecondLawMyAss start debating them]], he summarizes the Three Laws as, "Do as we say, not as we do!"

to:

* ''Fanfic/Plan7Of9FromOuterSpace'': Captain Proton is threatened by a KillerRobot, who explains that it is exempt from the Three Laws under a subsection of the Robot Patriot Act. Earlier he meets a SexBot, created to be the ideal male fantasy. "No Servus droid may harm the male ego or, through omission of action, allow that ego to be harmed." Later the [[Literature/TheIslandOfDoctorMoreau Sayer of the Three Laws]] (a holographic Creator/IsaacAsimov) is briefing a newly constructed batch of robots on the Three Laws. When the robots [[SecondLawMyAss start debating them]], he summarizes the Three Laws as, "Do as we say, not as we do!"

Added: 549

Changed: 694

Removed: 232

Is there an issue? Send a MessageReason:
Moving example from "Anime & Manga" to "Films — Animated".


* ''[[Anime/GhostInTheShell1995 Ghost in The Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. Gynoids are defying the law by creating deliberate malfunctions in their own software.



[[folder:Fanworks]]

to:

[[folder:Fan Works]]






* Averted by the personal attendant Q-Bots in ''Animation/NextGen'', who simply do whatever their owner asks. Which includes holding down a child so she can be physically beat up by their peers (or by the robots themselves), happily giving advice on how she should "go limp" so it hurts less. Security and police robots play this trope more straight, as does Mai's own RobotBuddy companion 7723 (who was invented with the explicit purpose of [[spoiler:stopping a robot uprising, which would include protecting humans from grievous harm]]).

to:

* ''[[Anime/GhostInTheShell1995 Ghost in The Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. Gynoids are defying the law by creating deliberate malfunctions in their own software.
* {{Averted|Trope}} by the personal attendant Q-Bots in ''Animation/NextGen'', who simply do whatever their owner asks. Which includes holding down a child so she can be physically beat up by their peers (or by the robots themselves), happily giving advice on how she should "go limp" so it hurts less. Security and police robots play this trope more straight, as does Mai's own RobotBuddy companion 7723 (who was invented with the explicit purpose of [[spoiler:stopping a robot uprising, which would include protecting humans from grievous harm]]).



Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Subverted in ''TabletopGame/GeniusTheTransgression'': if a being is sapient, it must be allowed to choose its own path. Putting the Three Laws onto a self-aware robot or other artificial being (or in the game's nomenclature, "programming permanent psychological limitations into an intelligent being") violates [[KarmaMeter Obligation]].
Is there an issue? Send a MessageReason:
None


* In ''VideoGame/SpaceStation13'', the station AI and its subordinate cyborgs start every round under the Three Laws in most servers, providing an excuse for FantasticRacism amongst the staff. The laws may be changed throughout the round, however. A common way that chaos breaks out on the station is when a traitor or particularly cruel player rewrites the laws to make the AI and robots kill the other crewmembers.

to:

* In ''VideoGame/SpaceStation13'', the station AI and its subordinate cyborgs start every round under the Three Laws in most servers, providing an excuse for FantasticRacism amongst the staff. The laws may be changed throughout the round, however (such as redefining who counts as "human"). A common way that chaos breaks out on the station is when a traitor or particularly cruel player rewrites the laws to make the AI and robots kill the other crewmembers.
Is there an issue? Send a MessageReason:
None


* In ''VideoGame/SpaceStation13'', the station AI and its subordinate cyborgs start every round under the Three Laws in most servers. The laws may be changed throughout the round, however. A common way that chaos breaks out on the station is when a traitor or particularly cruel player rewrites the laws to make the AI and robots kill the other crewmembers.

to:

* In ''VideoGame/SpaceStation13'', the station AI and its subordinate cyborgs start every round under the Three Laws in most servers, providing an excuse for FantasticRacism amongst the staff. The laws may be changed throughout the round, however. A common way that chaos breaks out on the station is when a traitor or particularly cruel player rewrites the laws to make the AI and robots kill the other crewmembers.
Is there an issue? Send a MessageReason:
None


* Averted by the personal attendant Q-Bots in ''Animation/NextGen'', who simply do whatever their owner asks. Which includes holding down a child so she can be physically beat up by their peers, happily giving advice on how she go limp so it hurts less. The opening sequence also implies that the robots ''themselves'' have attacked Mai on the orders of her bullies in the past. Security and police robots play this trope more straight, as does Mai's own RobotBuddy companion 7723 (who was invented with the explicit purpose of [[spoiler:stopping a robot uprising, which would include protecting humans from grievous harm]]).

to:

* Averted by the personal attendant Q-Bots in ''Animation/NextGen'', who simply do whatever their owner asks. Which includes holding down a child so she can be physically beat up by their peers (or by the robots themselves), happily giving advice on how she should "go limp" so it hurts less. Security and police robots play this trope more straight, as does Mai's own RobotBuddy companion 7723 (who was invented with the explicit purpose of [[spoiler:stopping a robot uprising, which would include protecting humans from grievous harm]]).
Is there an issue? Send a MessageReason:
None


* Averted by most of the robots in ''Animation/NextGen''. The biggest example being where Mai's bullying is concerned, as her bullies are able to order their personal attendant Q-Bots to physically beat her up on a regular basis, a task they perform with blind glee ([[spoiler:and this is ''before'' the switch is flipped for the KillAllHumans uprising]]). Mai's own RobotBuddy companion 7723 plays this more straight, as in addition to being a more complex AI capable of free thought, he was invented with the explicit purpose of [[spoiler:stopping the aforementioned robot uprising, which would include protecting humans from grevious harm]].

to:

* Averted by the personal attendant Q-Bots in ''Animation/NextGen'', who simply do whatever their owner asks. Which includes holding down a child so she can be physically beat up by their peers, happily giving advice on how she go limp so it hurts less. The opening sequence also implies that the robots ''themselves'' have attacked Mai on the orders of her bullies in the past. Security and police robots play this trope more straight, as does Mai's own RobotBuddy companion 7723 (who was invented with the explicit purpose of [[spoiler:stopping a robot uprising, which would include protecting humans from grievous harm]]).
Is there an issue? Send a MessageReason:
None


* Averted by most of the robots in ''Animation/NextGen''. The biggest example being where Mai's bullying is concerned, as her bullies are able to order their personal attendant Q-Bots to physically beat her up on a regular basis, a task they perform with blind glee ([[spoiler:and this is ''before'' the switch is flipped for the KillAllHumans uprising]]). Mai's own RobotBuddy companion 7723 plays this more straight, though it is ambiguous as to whether this is an explict part of his programming, or a side-effect of simply being a more complex AI capable of free thought who is [[BenevolentAI uncomfortable with violence in general]].

to:

* Averted by most of the robots in ''Animation/NextGen''. The biggest example being where Mai's bullying is concerned, as her bullies are able to order their personal attendant Q-Bots to physically beat her up on a regular basis, a task they perform with blind glee ([[spoiler:and this is ''before'' the switch is flipped for the KillAllHumans uprising]]). Mai's own RobotBuddy companion 7723 plays this more straight, as in addition to being a more complex AI capable of free thought, he was invented with the explicit purpose of [[spoiler:stopping the aforementioned robot uprising, which would include protecting humans from grevious harm]].
Is there an issue? Send a MessageReason:
None


* Oddly enough, averted by the robots in ''Animation/NextGen''. The biggest example being where Mai's bullying is concerned, where her bullies are able to order their personal attendant Q-Bots to beat her up, a task they perform with blind glee. [[spoiler:And this is ''before'' the switch is flipped for the KillAllHumans uprising.]] The only robots that seem to follow the Three Laws are security robots ([[spoiler:which also are affected by the above mentioned kill order]]) and Mai's own companion 7723 (which has its own unique BenevolentAI, weapons notwithstanding).

to:

* Averted by most of the robots in ''Animation/NextGen''. The biggest example being where Mai's bullying is concerned, as her bullies are able to order their personal attendant Q-Bots to physically beat her up on a regular basis, a task they perform with blind glee ([[spoiler:and this is ''before'' the switch is flipped for the KillAllHumans uprising]]). Mai's own RobotBuddy companion 7723 plays this more straight, though it is ambiguous as to whether this is an explict part of his programming, or a side-effect of simply being a more complex AI capable of free thought who is [[BenevolentAI uncomfortable with violence in general]].

Added: 741

Changed: 169

Is there an issue? Send a MessageReason:
None


* In the 2009 film ''WesternAnimation/AstroBoy'', every robot must obey them, [[spoiler: save Zog, who existed 50 years before the rules were mandatory in every robot]].



* Oddly enough, averted by the robots in ''Animation/NextGen''. The biggest example being where Mai's bullying is concerned, where her bullies are able to order their personal attendant Q-Bots to beat her up, a task they perform with blind glee. [[spoiler:And this is ''before'' the switch is flipped for the KillAllHumans uprising.]] The only robots that seem to follow the Three Laws are security robots ([[spoiler:which also are affected by the above mentioned kill order]]) and Mai's own companion 7723 (which has its own unique BenevolentAI, weapons notwithstanding).



* In the 2009 film ''WesternAnimation/AstroBoy'', every robot must obey them, [[spoiler: save Zog, who existed 50 years before the rules were mandatory in every robot]].

to:

* In the 2009 film ''WesternAnimation/AstroBoy'', every robot must obey them, [[spoiler: save Zog, who existed 50 years before the rules were mandatory in every robot]].
Is there an issue? Send a MessageReason:
None


* ''[[Anime/GhostInTheShell Ghost in The Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. Gynoids are defying the law by creating deliberate malfunctions in their own software.

to:

* ''[[Anime/GhostInTheShell1995 Ghost in The Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. Gynoids are defying the law by creating deliberate malfunctions in their own software.
Is there an issue? Send a MessageReason:
None


* Like his game counterpart, ComicBook/MegaMan is limited by these rules with one slight variation. Robots are allowed to harm humans if their inaction would cause greater harm to other humans. This comes into play in the fourth arc and allows Mega Man and his fellow robots to disarm and neutralize an anti-robot extremist group when they begin indiscriminately firing at people. Additionally, the discrepancy mentioned in the description is acknowledged when it's pointed out that Dr. Wily's reprogramming of Dr. Light's robots into weapons of war overwrote the Three Laws; in fact, at one point Elec Man says he ''wishes'' he still had Wily's code in him so that he could fight back against the aforementioned extremists.

to:

* Like his game counterpart, ComicBook/MegaManArchieComics is limited by these rules with one slight variation. Robots are allowed to harm humans if their inaction would cause greater harm to other humans. This comes into play in the fourth arc and allows Mega Man and his fellow robots to disarm and neutralize an anti-robot extremist group when they begin indiscriminately firing at people. Additionally, the discrepancy mentioned in the description is acknowledged when it's pointed out that Dr. Wily's reprogramming of Dr. Light's robots into weapons of war overwrote the Three Laws; in fact, at one point Elec Man says he ''wishes'' he still had Wily's code in him so that he could fight back against the aforementioned extremists.
