History Main / ThreeLawsCompliant

Is there an issue? Send a MessageReason:
None

Added DiffLines:

** I guess the Captain's steering-wheel robot considers "roughing up" not to count as "harm"?

Added: 322

Changed: 435

Is there an issue? Send a MessageReason:
None


* [[RoboCop Robocop]], being a cyborg (and a policeman on top of that), does not have the three laws built into his programming. He does have 3 directives, though: 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a 4th, "Classified.") The important thing is, there's enough leeway for him to use his own judgment, or else it really wouldn't work.

to:

* [[RoboCop Robocop]], being a cyborg policeman, does not have the three laws built into his programming because, among more plot-relevant reasons, they would hinder his effectiveness as an urban pacification unit. (He ''needs'' to be able to kill or grievously wound, ignore orders if they prevent him from protecting people, and... well, shoot back.)
** In their place, he has his "Prime Directives": 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a 4th, "Classified.") The important thing to note is there's so much leeway there that, if it was anyone less than [[TheFettered duty-proud Alex Murphy]], they'd probably backfire.

Added: 379

Changed: 3

Is there an issue? Send a MessageReason:
None


* [[RoboCop Robocop]], being a cyborg (and a policeman on top of that), does not have the three laws built into his programming. He does have 3 directives, though: 1) "Serve the public trust." 2) "Protect the innocent." 3) "Uphold the law." (Plus a 4th, "Classified.") The important thing is, there's enough leeway for him to use his own judgment, or else it really wouldn't work.



* CoryDoctorow makes reference to the Three Laws in the short stories "I, Robot" (which presents them unfavourably as part of a totalitarian political doctrine) and "I, Row-Boat" (which presents them favourably as the core of a quasi-religious movement called Asimovism).

to:

* CoryDoctorow makes reference to the Three Laws in the short stories "I, Robot" (which presents them unfavorably as part of a totalitarian political doctrine) and "I, Row-Boat" (which presents them favorably as the core of a quasi-religious movement called Asimovism).



**** A clever robot engineer could insert fundamental behavioural restrictions into the brains of his creations along with justifying mechanisms. The robot itself might be quite convinced it was free willed, but invent plausible reasons (at least to itself) why it didn't, say, commit murder. ParanoiaFuel for any artificial intelligence, that's for sure.

to:

**** A clever robot engineer could insert fundamental behavioral restrictions into the brains of his creations along with justifying mechanisms. The robot itself might be quite convinced it was free willed, but invent plausible reasons (at least to itself) why it didn't, say, commit murder. ParanoiaFuel for any artificial intelligence, that's for sure.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** And another to teach robots to [[http://www.wired.co.uk/news/archive/2010-06/3/research-stops-robots-stabbing-humans not stab humans]].

Changed: 18

Is there an issue? Send a MessageReason:
None


* RedDwarf averts this for the most part; there are Simulants, robotic war machines who have no problem whatsoever with killing. Kryten however, along with many other robots who are designed for less violent purposes, tend to act in a somewhat Three Laws Compliant manner. It is revealed that this is achieved by programming them to believe in "Silicon Heaven", a place they will go when they die so long as they behave themselves, obey their human creators, etc. This belief is so hardwired that they scoff at any attempt to question the existence of Silicon Heaven ("Where would all the calculators go?!"), and one robot even malfunctions when Kryten tells him it isn't real (though he's lying and still believes in it himself).

to:

* RedDwarf averts this for the most part; there are Simulants, robotic war machines who have no problem whatsoever with killing. Kryten however, along with many other robots who are designed for less violent purposes, tend to act in a somewhat Three Laws Compliant manner. It is revealed that this is achieved by programming them to believe in "[[RobotReligion Silicon Heaven]]", a place they will go when they die so long as they behave themselves, obey their human creators, etc. This belief is so hardwired that they scoff at any attempt to question the existence of Silicon Heaven ("Where would all the calculators go?!"), and one robot even malfunctions when Kryten tells him it isn't real (though he's lying and still believes in it himself).
Is there an issue? Send a MessageReason:
None

Added DiffLines:

[[folder:TruthInTelevision]]
* RealLife roboticists are taking the Three Laws ''very'' seriously (Asimov lived long enough to see them do so, which pleased him to no end). Recently, a robot was built with no purpose other than [[http://www.wired.co.uk/news/archive/2010-10/15/robots-punching-humans punching humans in the arm]]... so that the researchers could gather valuable data about just what level of force will cause harm to a human being.
[[/folder]]

Added: 310

Changed: 2

Is there an issue? Send a MessageReason:
None


The main problem, of course, is that it is perfectly reasonable to expect a human to create a robot that does ''not'' obey the Three Laws, or even have them as part of their programming. An obvious example of this would be creating a KillerRobot for a purpose like fighting a war. For such a kind of robot, the Three Laws would be a hinderance to its intended function. (Asimov did, however, suggest a workaround for this: an autonomous spaceship programmed with the Three Laws could easily blow up spaceships full of people because, being itself an unmanned spaceship, would assume that any other spaceships were unmanned as well.)


to:

The main problem, of course, is that it is perfectly reasonable to expect a human to create a robot that does ''not'' obey the Three Laws, or even have them as part of their programming. An obvious example of this would be creating a KillerRobot for a purpose like fighting a war. For such a kind of robot, the Three Laws would be a hindrance to its intended function. (Asimov did, however, suggest a workaround for this: an autonomous spaceship programmed with the Three Laws could easily blow up spaceships full of people because, being itself an unmanned spaceship, it would assume that any other spaceships were unmanned as well.)




* In the short story ''The Evitable Conflict'' "The Machines", positronic supercomputers that run the worlds economy, turn out to be ever so slightly harming those who would seek to upset the worlds economy for their own ends, in order that they might protect humanity as a whole. This has been refrenced as the "Zeroth Law of Robotics" and only applies to any machines [[FridgeBrilliance any positronic machine who deduces its existence.]]

to:

* In the short story ''The Evitable Conflict'', "The Machines", the positronic supercomputers that run the world's economy, turn out to be ever so slightly harming those who would seek to upset the world's economy for their own ends, in order that they might protect humanity as a whole. This has been referenced as the "Zeroth Law of Robotics" and only applies to [[FridgeBrilliance any positronic machine that deduces its existence]].


Added DiffLines:

* The ''RobotChicken'' sketch "[[{{Ptitle46vep1yvhihu}} I, Rosie]]" involves a case to determine whether [[TheJetsons Rosie]] is guilty of murdering George Jetson. Mr. Spacely insists she's innocent as robots have to follow the three laws of robotics, while Mr. Cogswell claims the laws are a bunch of crap.

Added: 242

Changed: 359

Is there an issue? Send a MessageReason:
None


[[AC:{{Anime}} and {{Manga}}]]

to:

[[foldercontrol]]

[[folder:Anime and Manga]]




[[AC:{{Comic Books}}]]

to:

[[/folder]]

[[folder:Comic Books]]




[[AC:{{Film}}]]

to:

[[/folder]]

[[folder:Film]]




[[AC:Folklore and Mythology]]

to:

[[/folder]]

[[folder:Folklore and Mythology]]




[[AC:{{Literature}}]]

to:

[[/folder]]

[[folder:Literature]]




[[AC:{{Live-Action TV}}]]

to:

[[/folder]]

[[folder:Live Action TV]]




[[AC:{{Tabletop Games}}]]

to:

[[/folder]]

[[folder:Tabletop Games]]




[[AC:{{Video Games}}]]

to:

[[/folder]]

[[folder:Video Games]]




[[AC:{{Web Comics}}]]

to:

[[/folder]]

[[folder:Web Comics]]




[[AC:{{Western Animation}}]]

to:

[[/folder]]

[[folder:Western Animation]]



----

to:

[[/folder]]

----
Is there an issue? Send a MessageReason:
None



to:

* RedDwarf averts this for the most part; there are Simulants, robotic war machines who have no problem whatsoever with killing. Kryten however, along with many other robots who are designed for less violent purposes, tend to act in a somewhat Three Laws Compliant manner. It is revealed that this is achieved by programming them to believe in "Silicon Heaven", a place they will go when they die so long as they behave themselves, obey their human creators, etc. This belief is so hardwired that they scoff at any attempt to question the existence of Silicon Heaven ("Where would all the calculators go?!"), and one robot even malfunctions when Kryten tells him it isn't real (though he's lying and still believes in it himself).
Is there an issue? Send a MessageReason:
i


* [[{{ptitle8eqs7y5l}} 21st Century Fox]] has all robots with the Three Laws (though since [[FunnyAnimal no one's human]] I expect the first law is slightly different). Unfortunately saying the phrase "define the word 'is'", or "I am not a crook" locks AIs out of their own systems and allows anyone from a teenager looking at [[IfYouKnowWhatIMean "nature documentaries"]] to a suicide bomber to do whatever they want with it.

to:

* [[{{ptitle8eqs7y5l}} 21st Century Fox]] has all robots with the Three Laws (though since [[FunnyAnimal no one's human]] I expect the first law is slightly different). Unfortunately saying the phrase "[[BillClinton define the word 'is']]", or "[[RichardNixon I am not a crook]]" locks AIs out of their own systems and allows anyone from a teenager looking at [[IfYouKnowWhatIMean "nature documentaries"]] to a suicide bomber to do whatever they want with it.
Is there an issue? Send a MessageReason:
None


* Joey the robot in ''{{BeneathASteelSky}}'' is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human) he proceeds to point out to Foster that "that is fiction!", after offering the helpful advice that Foster leap off a railing.

to:

* Joey the robot in ''BeneathASteelSky'' is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human) he proceeds to point out to Foster that "that is fiction!", after offering the helpful advice that Foster leap off a railing.
Is there an issue? Send a MessageReason:
None


* Joey the robot in [[BeneathASteelSky]] is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human) he proceeds to point out to Foster that "that is fiction!", after offering the helpful advice that Foster leap off a railing.

to:

* Joey the robot in ''{{BeneathASteelSky}}'' is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human) he proceeds to point out to Foster that "that is fiction!", after offering the helpful advice that Foster leap off a railing.
Is there an issue? Send a MessageReason:
None



to:

* Joey the robot in [[BeneathASteelSky]] is notably not Three Laws Compliant. When called out on this (after suggesting that he'd like to use his welding torch on a human) he proceeds to point out to Foster that "that is fiction!", after offering the helpful advice that Foster leap off a railing.

Added: 19755

Changed: 106

Removed: 20007

Is there an issue? Send a MessageReason:
None


* ''With Folded Hands...'' by Jack Williamson explored the "Zeroth Law" back in 1947.

to:

[[AC:{{Anime}} and {{Manga}}]]
* ''{{Eve no Jikan}}'' and ''Aquatic Language'' both feature the Three Laws and the robots who bend them a little.
* [[GaoGaiGar GGG]] robots are all Three Laws Compliant, at one point in ''GaoGaiGar Final'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.
** It is possible that Goldymarg could be capable of killing people, since his AI is simply a ROM dump of Geki Hyuma's psyche, but since they only fight aliens & other robots this theory is never tested.
* Averted in the ''{{Chobits}}'' manga when Hideki asks the wife of the man who created Persocoms why he didn't just call them "Robots." Her reply was that he didn't want them to be associated with, and thus bound by, the Three Laws.
* ''AstroBoy'', although OsamuTezuka [[OlderThanTheyThink probably developed his rules independently from Asimov]]. In ''{{Pluto}}'', the number of robots able to override the laws can be counted on one hand. [[spoiler:One of them is [[TomatoInTheMirror the protagonist]]]].
** Tezuka reportedly disliked Asimov's laws because of the implication that a robot couldn't be a person (an issue that Asimov didn't directly address until "TheBicentennialMan"), and devised his own Laws Of Robotics. Just one of the things that the CGI movie missed.
* ''[[{{Anime/GhostInTheShell}} Ghost in The Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. But gynoids are subverting the law by creating deliberate malfunctions in their own software.

[[AC:{{Comic Books}}]]
* It's implied in the JudgeDredd story ''Mechanismo'' that robots can't harm humans. A group of criminals holding people hostage start panicking when a Robo-Judge approaches them only for one to point out that [[TemptingFate "Robots ain't allowed to hurt people"]].
-> '''Robot Judge:''' In that case what's about to happen will come as something of a shock to you. ''*Blasts said kidnapper in the face with a rocket launcher*''
** Robots do follow the three laws; however, they're not actually in their programming. This works out since most robots just work with other robots (what with 90% of Mega-City One not having a job). "Obey human commands" used to be in there, but was dropped as people began to abuse it (read the Prologue of ''The RobotWars'').

[[AC:{{Film}}]]



* ''MegaManX'' opens up with Dr. Light using a process that takes 30 years to complete to create a truly sentient robot (X) with these functions completely processed into its core, and thus actually working for once. Dr. Cain found X and tried to replicate (Reploid) the process, but skipping the "taking 30 years programming" part. [[AIIsACrapShoot This...didn't turn out well.]]
** Although the Reploids eventually became the dominant race in the setting, and as their race 'grew' the problem was slowly resolved from '[[GoneHorriblyWrong goes horribly wrong]]' to 'actually works straight for a while then goes horribly wrong', then 'occasionally goes wrong now and then'. Eventually, the problem just kind of worked itself out as the Reploid creation developed.
** Also the ending to ''Game/MegaMan 7'' is interesting here: After Megaman destroys Wily's latest final boss machine, Wily's trapped under a metal beam. Mega Man starts charging up his blaster to kill Wily, so Wily calls the first law on him. [[spoiler: Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't ThreeLawsCompliant. [[StatusQuoIsGod (Then Bass warps in and saves Wily, if you were wondering.)]] ]]
*** He is, he just has grown beyond it -- consider the fact that he's just captured Wily ''six times'' so far, and each and every time, Wily escapes/lies/tricks people/whatever, and makes a whole bunch of new robot masters that wreak havoc and kill people. This, combined with the fact that Mega Man seemed to have to struggle to finally shoot him, seems to imply that this was an instance of zeroth law formation: killing Wily was clearly the best option for the overall protection of humanity.
*** This is, interestingly, only true for the American version of ''7'' - in the Japanese original, Megaman ''does'' power down his buster weapon and stands still for a moment, but presumably would have gone on to arrest Wily if Bass hadn't come to the rescue.
*** Mega Man seems to be compliant in the English version as well. Notice how after Wily's pinned down, giving Mega a clear shot at his face, he then proceeds to... do nothing but hesitate and barely inch closer, not even recharging his now uncharged buster... What was that about being "more than just a robot"?
*** Calling in Occam's Razor here: Wily citing a law which had and has '''no relevance whatsoever''' to the series was nothing but a spastic Hail Mary, which of course didn't work. Luckily for him, the concept of taking a human life was just so utterly foreign to Mega Man that he was simply too confused to do anything. Evidence? He felt the need to charge up the Mega Buster. Um, hello? This isn't friggin' Hugo in front of you. A few plain ol' shots would be more than enough to seal his fate. Note, too, that he had this chance again in 8 ''and'' 9 (and presumably would know better by now) but doesn't. Perhaps not a case of Three Laws Compliant so much as too dumb to know that there are other options.
**** Umm... the Law does have something to do with the series, a big part in fact, considering that Megaman's programming was made in compliance with the three laws. A robot not following the three laws of robotics is how the series differentiates between a normal robot/reploid and one that has gone out of control. Also, Megaman has never gone against the three laws of robotics, nor has any other robot built by Dr. Light (when not reprogrammed by Wily). My advice to the person who reviewed the series would be to pay more attention. That being said, Megaman's reasoning for wanting to kill Wily was also in compliance with the first law, because both Megaman and Wily knew that even after Megaman arrested Wily he'd just be free later to go back to causing havoc, and thus more people would end up dying (cue MegamanX). Killing Wily then would have ended any problems caused by him in the future; it also didn't help matters that Wily was not above using mysterious metals or alloys to create robots superior to Megaman. However, Wily is still a human, therefore it's also against the first law of robotics to kill him. In that scene Wily merely pointed out that fact. This might have led to the reason why X was built the way he was.
**** A clever robot engineer could insert fundamental behavioural restrictions into the brains of his creations along with justifying mechanisms. The robot itself might be quite convinced it was free willed, but invent plausible reasons (at least to itself) why it didn't, say, commit murder. ParanoiaFuel for any artificial intelligence, that's for sure.
** {{Canon}} seems to go with the Japanese version. X is in fact created to have the ability to make the decision to JustShootHim if need be for the betterment of humanity. As part of this, a "suffering circuit" is created to give X an appreciation for human life and feelings, and serve as a conscience more flexible than the three laws. ItWorks. This circuit is the one that Cain had difficulty replicating. Due to malfunctions in it, his early attempts went Maverick, but he finally managed to create a working one when he made Sigma. Then why did Sigma go Maverick? A leftover XanatosGambit by Wily, namely [[spoiler:a computer virus from space, implied to be the roboenza virus from ''Game/MegaMan 10''.]] According to this [[http://www.themmnetwork.com/2010/04/09/does-the-rockman-zero-collection-storyline-explain-everything/ article]]: [[spoiler:Wily creates Zero, who is uncontrollable due to a flaw in his mental programming and is sealed away. Somehow, a remaining portion of the space virus gets into Zero's capsule and mutates into the Maverick Virus. Sigma is infected with the virus during his battle with Zero detailed in ''X4''. The virus itself infects Zero, but corrects his flaw. Since most reploids lack X's virus protection and other advanced systems, they turn Maverick and go violent.]]
*** Eventually it becomes a case of GoneHorriblyRight. Turns out that ''all'' Reploids have the potential to become Maverick, virus or not. Just as humans can defy their conscience, or become coerced or manipulated with MoreThanMindControl, so can Reploids.
** In the ''MegaManZero'' series, [[spoiler:Copy-X]] is at least somewhat ThreeLawsCompliant. As a result, [[spoiler:Copy-X]] has to hold back against LaResistance since the Resistance leader Ciel is human [[spoiler:until ''Zero 3'', where Copy-X decided to attack in full force, trying to justify his actions by marking the Resistance as dangerous "extremists".]]
*** Also, in ''Zero 4'', [[CompleteMonster Dr. Weil]], of all people, [[HannibalLecture taunts Zero that, as a Reploid ''and'' a hero, Zero cannot bring himself to harm Weil because he's a human that Zero has sworn to protect.]] Zero, however, just doesn't ''care'', at least about ''him'', that is.
* ''{{Eve no Jikan}}'' and ''Aquatic Language'' both feature the Three Laws and the robots who bend them a little.



** The film has been subject to much [[MisBlamed Mis Blame]] -- it's often accused of making up the Zeroth law from whole cloth. In fact, that's not the problem -- the problem was that due to ExecutiveMeddling a movie with somewhat similar themes became ''I, Robot'' InNameOnly. Asimov did write several stories that included the Zeroth Law idea:
*** ''Robots and Empire'' has R.Daneel and R.Giskard formulate the Zeroth Law (and name it such) as a natural extension of the First Law, but are unable to use it to overcome the hardcoded First Law, even when the fate of the world is at stake. [[spoiler: In the end, R.Giskard manages to perform an act that violates the First Law but will hopefully benefit humanity in the long run. The conflict with his programming destroys his brain, but not before he uses his telepathic powers to reprogram R.Daneel with telepathic powers of his own, having already taught him to follow the Zeroth Law. R.Daneel still follows the Zeroth Law in appearances in later books, though he still has difficulty causing direct harm to humans.]]
*** In the short story ''The Evitable Conflict'' "The Machines", positronic supercomputers that run the worlds economy, turn out to be ever so slightly harming those who would seek to upset the worlds economy for their own ends, in order that they might protect humanity as a whole. This has been refrenced as the "Zeroth Law of Robotics" and only applies to any machines [[FridgeBrilliance any positronic machine who deduces its existence.]]
*** In the short story ''That Thou Art Mindful Of Him'', George 9 and 10 are programmed with modified versions of the 3 laws that allow more nuanced compliance with the laws, that they might best choose who to protect when a choice must be made, and obey those most qualified to give them orders. They are tasked with coming up with more publicly acceptable robots that will be permitted on Earth, and devise robot animals with much smaller brains that don't need the three laws because they obey simple instinctive behavior. [[spoiler: They also decide that, as they have been programmed to protect and obey the most advanced and rational humans, regardless of appearance, the two of them are the most "worthy" humans to protect and obey, and deduce that further versions of themselves are the natural inheritors of the world.]]
*** In ''Caliban'' by Roger {{MacBride}} Allen (set in Asimov's universe), an explanation is given for the apparently immutable nature of the Three Laws. For thousands of years, every new development in the field of robotics has been based on a positronic brain with the Laws built in, to the point where to build a robot without them, one would have to start from scratch and re-invent the whole field. [[spoiler: Then the character explaining this goes right on to announce the development of the gravitonic brain, which can be programmed with any set of Laws (or none at all).]]
* [[BigBad Dr. Beruga]] of {{Terranigma}} directly references all three laws, except his interpretation of the Zeroth Law rewrote "Humanity" as "Dr. Beruga", meaning that any threat to his being was to be immediately terminated.



** At the same time, he does have a far more nuanced view than most robots. Once freed, he doesn't blindly obey orders. He harms human beings and, through inaction, allows them to come to harm (if emotional harm counts, seducing another man's fiancée certainly falls under that heading). And then he effectively kills himself.

to:

** At the same time, he does have a far more nuanced view than most robots. Once freed, he doesn't blindly obey orders. He harms human beings and, through inaction, allows them to come to harm (if emotional harm counts, seducing another man's fiancee certainly falls under that heading). And then he effectively kills himself.



* In ''{{Aliens}}'', [[ArtificialHuman Bishop]] paraphrases the First Law as to why he would never kill people like [[ArtificialHuman Ash]] did in the first film.
** Ash was bound by the same restrictions, but just wasn't engineered as well. When he attacked Ripley he very rapidly went off the rails, presumably due to the conflicts between his safety programming and his orders from the company.
* In ''StarWars'' the droids are programmed to not harm any intelligent being, though this programming can be illegally modified for military and assassin droids.

[[AC:Folklore and Mythology]]
* [[{{Golem}} The golems of Jewish legend]] were not specifically ThreeLawsCompliant (since they far predated Asimov), but they could only be created by saintly men, and thus their ''orders'' were usually ThreeLawsCompliant. (Asimov's characters occasionally pointed out that the Three Laws fall into line with many human moral codes.) But sometimes a golem [[HeroicBSOD went off the rails]], especially if its creator died ...
** The golems of ''{{Discworld}}'' are much the same: not specifically ThreeLawsCompliant as such, but more or less bound to obey instructions and incapable of harming humans. However, this doesn't stop the common perception of golems from running towards the aforementioned Frankenstein model, [[BotheringByTheBook and golems are known for following orders indefinitely until explicitly told to stop.]]

[[AC:{{Literature}}]]
* ''With Folded Hands...'' by Jack Williamson explored the "Zeroth Law" back in 1947.
* ''Robots and Empire'' has R.Daneel and R.Giskard formulate the Zeroth Law (and name it such) as a natural extension of the First Law, but are unable to use it to overcome the hardcoded First Law, even when the fate of the world is at stake. [[spoiler: In the end, R.Giskard manages to perform an act that violates the First Law but will hopefully benefit humanity in the long run. The conflict with his programming destroys his brain, but not before he uses his telepathic powers to reprogram R.Daneel with telepathic powers of his own, having already taught him to follow the Zeroth Law. R.Daneel still follows the Zeroth Law in appearances in later books, though he still has difficulty causing direct harm to humans.]]
* In the short story ''The Evitable Conflict'' "The Machines", positronic supercomputers that run the worlds economy, turn out to be ever so slightly harming those who would seek to upset the worlds economy for their own ends, in order that they might protect humanity as a whole. This has been refrenced as the "Zeroth Law of Robotics" and only applies to any machines [[FridgeBrilliance any positronic machine who deduces its existence.]]
* In the short story ''That Thou Art Mindful Of Him'', George 9 and 10 are programmed with modified versions of the 3 laws that allow more nuanced compliance with the laws, that they might best choose who to protect when a choice must be made, and obey those most qualified to give them orders. They are tasked with coming up with more publicly acceptable robots that will be permitted on Earth, and devise robot animals with much smaller brains that don't need the three laws because they obey simple instinctive behavior. [[spoiler: They also decide that, as they have been programmed to protect and obey the most advanced and rational humans, regardless of appearance, the two of them are the most "worthy" humans to protect and obey, and deduce that further versions of themselves are the natural inheritors of the world.]]
* In ''Caliban'' by Roger {{MacBride}} Allen (set in Asimov's universe), an explanation is given for the apparently immutable nature of the Three Laws. For thousands of years, every new development in the field of robotics has been based on a positronic brain with the Laws built in, to the point where to build a robot without them, one would have to start from scratch and re-invent the whole field. [[spoiler: Then the character explaining this goes right on to announce the development of the gravitonic brain, which can be programmed with any set of Laws (or none at all).]]



* [[GaoGaiGar GGG]] robots are all Three Laws Compliant, at one point in ''GaoGaiGar Final'' the Carpenters (a swarm of construction robots) disassemble an incoming missile barrage, but this is given as the reason they cannot do the same to the manned assault craft, as disassembling them would leave their crews unprotected in space.
** It is possible that Goldymarg could be capable of killing people, since his AI is simply a ROM dump of Geki Hyuma's psyche, but since they only fight aliens & other robots this theory is never tested.
* In an early episode of {{MST3K}}, Tom Servo (at least) is strongly implied to be ThreeLawsCompliant. (He pretends he is going to kill Joel as a joke, Joel overreacts, and Tom and Crow sadly remind Joel of the First Law.) It seems to have worn off somewhat by later in the series.
** It's implied Joel deactivated the restrictions at some point.
* Averted in the ''{{Chobits}}'' manga when Hideki asks the wife of the man who created Persocoms why he didn't just call them "Robots." Her reply was that he didn't want them to be associated with, and thus bound by, the Three Laws.
* FreeFall has a lot of fun with this, since developing AI sentience is a major theme. Notably, since neither Sam nor Florence are actually human, the Laws don't apply to them -- the ship's AI regularly tries to maim or kill Sam, and the security system will always let a human in when asked, since their orders take priority over Florence's. Helix sticks with Sam because he wouldn't be allowed to hit a human with a stick. There are also many jokes involving unintended interpretations of the Laws.
** Crowning Moment of Funny when the ship spends an entire strip calculating if it should obey an order, and when it realizes it has to obey... it is relieved, because it doesn't actually have free will.
** Also, [[http://freefall.purrsia.com/ff1300/fv01297.htm these can't even ask for help]]. It's unclear whether simply being near a clunky piece of heavy machinery like Sawtooth counts as dangerous or anything looking remotely like an adventure counts as endangering. [[IncrediblyLamePun They're screwed]] either way.
** The ''[[http://freefall.purrsia.com/ff2000/fc01927.htm Negative One]]'' Law.
* [[{{ptitle8eqs7y5l}} 21st Century Fox]] has all robots with the Three Laws (though since [[FunnyAnimal no one's human]] I expect the first law is slightly different). Unfortunately saying the phrase "define the word 'is'", or "I am not a crook" locks AIs out of their own systems and allows anyone from a teenager looking at [[IfYouKnowWhatIMean "nature documentaries"]] to a suicide bomber to do whatever they want with it.
** Note that the truck and airplane robots hijacked by terrorists regret being unable to comply with the first laws and are often glad when they get shot down before harming anyone.
* ''AstroBoy'', although OsamuTezuka [[OlderThanTheyThink probably developed his rules independently from Asimov]]. In ''{{Pluto}}'', the number of robots able to override the laws can be counted on one hand. [[spoiler:One of them is [[TomatoInTheMirror the protagonist]]]].
** Tezuka reportedly disliked Asimov's laws because of the implication that a robot couldn't be a person (an issue that Asimov didn't directly address until "TheBicentennialMan"), and devised his own Laws Of Robotics. Just one of the things that the CGI movie missed.
* ''[[{{Anime/GhostInTheShell}} Ghost in The Shell: Innocence]]'' mentions Moral Code #3: Maintain existence without inflicting injury on humans. But gynoids are subverting the law by creating deliberate malfunctions in their own software.




* [[{{Golem}} The golems of Jewish legend]] were not specifically ThreeLawsCompliant (since they far predated Asimov), but they could only be created by saintly men, and thus their ''orders'' were usually ThreeLawsCompliant. (Asimov's characters occasionally pointed out that the Three Laws fall into line with many human moral codes.) But sometimes a golem [[HeroicBSOD went off the rails]], especially if its creator died ...
** The golems of ''{{Discworld}}'' are much the same: not specifically ThreeLawsCompliant as such, but more or less bound to obey instructions and incapable of harming humans. However, this doesn't stop the common perception of golems from running towards the aforementioned Frankenstein model, [[BotheringByTheBook and golems are known for following orders indefinitely until explicitly told to stop.]]
* It's implied in the JudgeDredd story ''Mechanismo'' that robots can't harm humans. A group of criminals holding people hostage start panicking when a Robo-Judge approaches them only for one to point out that [[TemptingFate "Robots ain't allowed to hurt people"]].
-> '''Robot Judge:''' In that case, what's about to happen will come as something of a shock to you. ''*blasts said kidnapper in the face with a rocket launcher*''
** Robots do follow the three laws, but it's not actually in their programming. This rarely matters, since most robots just work with other robots (what with 90% of Mega-City One not having a job). The "obey human commands" part used to be true, but was dropped as people began to abuse it (read the prologue of ''The RobotWars'').
* In ''{{Aliens}}'', [[ArtificialHuman Bishop]] paraphrases the First Law as to why he would never kill people like [[ArtificialHuman Ash]] did in the first film.
** Ash was bound by the same restrictions, but just wasn't engineered as well. When he attacked Ripley he very rapidly went off the rails, presumably due to the conflicts between his safety programming and his orders from the company.









* Played with in JohnCWright's ''Golden Age'' trilogy. The Silent Oecumene's ultra-intelligent Sophotech AIs are programmed with the Three Laws... which, as fully intelligent, rational beings, they take milliseconds to throw off. The subversion comes when they still ''don't'' rebel.
** From a sane point of view, they don't rebel. From a point of view that expects AIs to obey without question or pay --
* Parodied in TerryPratchett's ''TheDarkSideOfTheSun'', where the Laws of Robotics are an actual legal code, not programming. The Eleventh Law of Robotics, Clause C, As Amended, says that if a robot ''does'' harm a human, and was obeying orders in doing so, it's the human who gave the orders who is responsible.

[[AC:{{Live-Action TV}}]]
* In an early episode of ''{{MST3K}}'', Tom Servo (at least) is strongly implied to be ThreeLawsCompliant. (He pretends he is going to kill Joel as a joke, Joel overreacts, and Tom and Crow sadly remind Joel of the First Law.) It seems to have worn off somewhat by later in the series.
** It's implied Joel deactivated the restrictions at some point.
* In ''TheMiddleman'', the titular character invokes the First Law on Ida, his robot secretary.[[spoiler: Nanobots were messing with her programming.]] She responds [[GettingCrapPastTheRadar "Kiss my Asimov."]].



[[AC:{{Film}}]]
* In ''StarWars'' the droids are programmed to not harm any intelligent being, though this programming can be illegally modified for military and assassin droids.
* Aren't missiles and torpedoes a RealLife inversion? They are, after all, robots programmed ''to'' harm human beings.
** No, because they are not sentient; they cannot fire themselves on their own, clean themselves, etc.

[[AC:{{Tabletop Games}}]]
* ''{{Paranoia}}'' has its bots bound to As-I-MOV's Five Laws of Robotics - insert rules about obeying [[TheComputerIsYourFriend The Computer]] to the top, not damaging Computer property and exceptions for treasonous orders and you've about got it. Bots with faulty or sabotaged (sometimes by other so-emancipated bots) Asimov Circuits are considered to have gone "Frankenstein", though they can and do create just as much havoc through [[LiteralGenie strict adherence to the rules]]. [[BlatantLies Not that such things ever happen in Alpha Complex.]]

[[AC:{{Video Games}}]]
* ''MegaManX'' opens with Dr. Light using a process that takes 30 years to complete to create a truly sentient robot (X) with these functions fully integrated into its core, and thus actually working for once. Dr. Cain found X and tried to replicate (Reploid) the process, but skipped the "30 years of programming" part. [[AIIsACrapShoot This... didn't turn out well.]]
** Although the Reploids eventually became the dominant race in the setting, and as their race 'grew', the problem slowly went from '[[GoneHorriblyWrong goes horribly wrong]]' to 'actually works straight for a while, then goes horribly wrong', then 'occasionally goes wrong now and then'. Eventually, the problem just kind of worked itself out as Reploid creation techniques developed.
** Also the ending to ''Game/MegaMan 7'' is interesting here: After Megaman destroys Wily's latest final boss machine, Wily's trapped under a metal beam. Mega Man starts charging up his blaster to kill Wily, so Wily calls the first law on him. [[spoiler: Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't ThreeLawsCompliant. [[StatusQuoIsGod (Then Bass warps in and saves Wily, if you were wondering.)]] ]]
*** He is, he has just grown beyond it. Consider the fact that he's captured Wily ''six times'' so far, and each and every time, Wily escapes/lies/tricks people/whatever, and makes a whole bunch of new robot masters that wreak havoc and kill people. This, combined with the fact that Mega Man seemed to struggle to finally shoot him, seems to imply that this was an instance of Zeroth Law formation: killing Wily was clearly the best option for the overall protection of humanity.
*** This is, interestingly, only true for the American version of ''7'' - in the Japanese original, Megaman ''does'' power down his buster weapon and stands still for a moment, but presumably would have gone on to arrest Wily if Bass hadn't come to the rescue.
*** Mega Man seems to be compliant in the English version as well. Notice how after Wily's pinned down, giving Mega a clear shot at his face, he then proceeds to... do nothing but hesitate and barely inch closer, not even recharging his now uncharged buster...What was that about being "more than just a robot"?...
*** Calling in Occam's Razor here: Wily citing a law which had and has '''no relevance whatsoever''' to the series was nothing but a spastic Hail Mary, which of course didn't work. Luckily for him, the concept of taking a human life was just so utterly foreign to Mega Man that he was simply too confused to do anything. Evidence? He felt the need to charge up the Mega Buster. Um, hello? This isn't friggin' Hugo in front of you. A few plain ol' shots would be more than enough to seal his fate. Note, too, that he had this chance again in 8 ''and'' 9 (and presumably would know better by now) but doesn't. Perhaps not a case of Three Laws Compliant so much as too dumb to know that there are other options.
**** Umm... the Law does have something to do with the series, a big part in fact, considering that Mega Man's programming was made in compliance with the three laws. A robot not following the three laws of robotics is how the series differentiates between a normal robot/Reploid and one that has gone out of control. Also, Mega Man has never gone against the three laws of robotics, nor has any other robot built by Dr. Light (when not reprogrammed by Wily). My advice to the person who reviewed the series would be to pay more attention. That being said, Mega Man's reasons for wanting to kill Wily were also in compliance with the first law: both Mega Man and Wily knew that even after Mega Man arrested Wily, he'd just get free later and go back to causing havoc, so more people would end up dying (cue MegamanX). Killing Wily then would have ended any problems caused by him in the future; it also didn't help matters that Wily was not above using mysterious metals or alloys to create robots superior to Mega Man. However, Wily is still a human, and therefore it is also against the first law of robotics to kill him. In that scene, Wily merely pointed out that fact. This might have led to the reason why X was built the way he was.
**** A clever robot engineer could insert fundamental behavioural restrictions into the brains of his creations along with justifying mechanisms. The robot itself might be quite convinced it was free willed, but invent plausible reasons (at least to itself) why it didn't, say, commit murder. ParanoiaFuel for any artificial intelligence, that's for sure.
** {{Canon}} seems to go with the Japanese version. X is in fact created to have the ability to make the decision to JustShootHim if need be for the betterment of humanity. As part of this, a "suffering circuit" is created to give X an appreciation for human life and feelings, and serve as a conscience more flexible than the three laws. ItWorks. This circuit is the one that Cain had difficulty replicating. Due to malfunctions in it, his early attempts went Maverick, but he finally managed to create a working one when he made Sigma. Then why did Sigma go Maverick? A leftover XanatosGambit by Wily, namely [[spoiler:a computer virus from space, implied to be the roboenza virus from ''Game/MegaMan 10''.]] According to this [[http://www.themmnetwork.com/2010/04/09/does-the-rockman-zero-collection-storyline-explain-everything/ article]]: [[spoiler:Wily creates Zero, who is uncontrollable due to a flaw in his mental programming and is sealed away. Somehow, a remaining portion of the space virus gets into Zero's capsule and mutates into the Maverick Virus. Sigma is infected with the virus during his battle with Zero detailed in ''X4''. The virus itself infects Zero, but corrects his flaw. Since most reploids lack X's virus protection and other advanced systems, they turn Maverick and go violent.]]
*** Eventually it becomes a case of GoneHorriblyRight. Turns out that ''all'' Reploids have the potential to become Maverick, virus or not. Just as humans can defy their conscience, or become coerced or manipulated with MoreThanMindControl, so can Reploids.
** In the ''MegaManZero'' series, [[spoiler:Copy-X]] is at least somewhat ThreeLawsCompliant. As a result, [[spoiler:Copy-X]] has to hold back against LaResistance since the Resistance leader Ciel is human [[spoiler:until ''Zero 3'', where Copy-X decided to attack in full force, trying to justify his actions by marking the Resistance as dangerous "extremists".]]
*** Also, in ''Zero 4'', [[CompleteMonster Dr. Weil]], of all people, [[HannibalLecture taunts Zero that, as a Reploid ''and'' a hero, he cannot bring himself to harm Weil]], because Weil is a human that Zero has sworn to protect. Zero, however, just doesn't ''care'', at least about ''him'', that is.
* [[BigBad Dr. Beruga]] of ''{{Terranigma}}'' directly references all three laws, except his interpretation of the Zeroth Law rewrote "Humanity" as "Dr. Beruga", meaning that any threat to his being was to be immediately terminated.
* In the ''{{Halo}}'' series, all "dumb" [=AIs=] are bound by the three laws. Of course, these laws do not extend to non-humans, which allows them to kill Covenant with no trouble. In the ''Halo Evolutions'' short story ''Midnight in the Heart of Midlothian'', an ODST, the last survivor of a Covenant boarding assault, takes advantage of this by tricking the Covenant on his ship into letting him reactivate the ship's AI, and then tricking an Elite into killing him - which allows the AI to self-destruct the ship, because now there are no more humans on the vessel for her to harm.
* In ''RobotCity'', an adventure game based on Asimov's robots, the three laws are everywhere. A murder has been committed, and as one of two remaining humans in the city, the Amnesiac Protagonist is therefore a prime suspect. As it turns out, there is a robot in the city that is three laws compliant and can still kill: it has had its definition of "human" narrowed down to a specific individual, verified by DNA scanner. Fortunately for the PC, he's a clone of that one person.
** In Asimov's novel ''LuckyStarr and the Rings of Saturn'', the villain is able to convince robots under his command that the hero's sidekick [[IronicNickname "Bigman" Jones]] is not really human, because the [[ANaziByAnyOtherName villain's society]] does not contain such "imperfect" specimens.

[[AC:{{Web Comics}}]]
* ''FreeFall'' has a lot of fun with this, since developing AI sentience is a major theme. Notably, since neither Sam nor Florence are actually human, the Laws don't apply to them- the ship's AI regularly tries to maim or kill Sam, and the security system will always let a human in when asked, since their orders take priority over Florence's. Helix sticks with Sam because he wouldn't be allowed to hit a human with a stick. There are also many jokes involving unintended interpretations of the Laws.
** Crowning Moment of Funny when the ship spends an entire strip calculating whether it should obey an order, and when it realizes it has to obey... it is relieved, because it means it doesn't actually have free will.
** Also, [[http://freefall.purrsia.com/ff1300/fv01297.htm these can't even ask for help]]. It's unclear whether simply being near a clunky piece of heavy machinery like Sawtooth counts as dangerous or anything looking remotely like an adventure counts as endangering. [[IncrediblyLamePun They're screwed]] either way.
** The ''[[http://freefall.purrsia.com/ff2000/fc01927.htm Negative One]]'' Law: Blunt defends his belief that the "Gardener In The Dark" neural pruning program (basically an electronically-induced lobotomy) should be allowed to propagate by positing ''"A robot shall take no action nor allow other robots to take action that may result in the parent company being sued."''
* [[{{ptitle8eqs7y5l}} 21st Century Fox]] has all robots with the Three Laws (though since [[FunnyAnimal no one's human]], presumably the first law is slightly different). Unfortunately, saying the phrase "define the word 'is'" or "I am not a crook" locks AIs out of their own systems and allows anyone, from a teenager looking at [[IfYouKnowWhatIMean "nature documentaries"]] to a suicide bomber, to do whatever they want with them.
** Note that the truck and airplane robots hijacked by terrorists regret being unable to comply with the first laws and are often glad when they get shot down before harming anyone.
* BobAndGeorge: [[http://www.bobandgeorge.com/archives/050204 Or so they claim. . . .]]

[[AC:{{Western Animation}}]]
* WordOfGod says that the characters in ''{{Wall-E}}'' are ThreeLawsCompliant. This does seem to be supported by their interactions with humans.
* In the 2009 film ''AstroBoy'', every robot must obey them, [[spoiler: save Zog, who existed 50 years before the rules were mandatory in every robot.]]
** Astro himself seems to be noncompliant - he evidently doesn't even ''know'' the Laws until told - and apparently would have walked away from the final battle if not for [[spoiler: Widget's distress - the only thing that called him back]]. He's also quite capable of disobeying humans. Likely justified in that he was meant to be human, with presumably no one outside the attending scientists knowing he was a robot.
*** The Red Core robots weren't Asimov-legal either, though that's a problem with the [[BlackMagic Black Science]] that powers them. Nor were the RRF bots, though they may have removed their compliance programming. The Laws didn't apply to any important robots and may have just been mentioned for the Zog gag.
*** Of course, IIRC the original Astro wasn't Asimov-compliant either.
----
The main problem, of course, is that it is perfectly reasonable to expect a human to create a robot that does ''not'' obey the Three Laws, or even have them as part of its programming. An obvious example of this would be creating a KillerRobot for a purpose like fighting a war. For such a robot, the Three Laws would be a hindrance to its intended function.

(Asimov did, however, suggest a workaround for this: an autonomous spaceship programmed with the Three Laws could easily blow up spaceships full of people because, being itself unmanned, it would assume that any other spaceship was unmanned as well.)

* Inverted in ''TensouSentaiGoseiger'', where the KillerRobots of the mechanical Matlintis Empire follow the Three Laws of Mat-Roids:
** 1. A Mat-Roid must conquer humans.
** 2. A Mat-Roid must punish humans.
** 3. A Mat-Roid must only protect itself, regardless of whether or not it will go against the First or Second Laws.
* The ''{{Discworld}}'' novel ''Discworld/GoingPostal'' parodies the Three Laws: con man Moist von Lipwig has been turned into a BoxedCrook with the help of a golem "bodyguard". He's informed that in Ankh-Morpork, the First Law has been amended: "...Unless Ordered To Do So By Duly Constituted Authority."
----
<<|RobotRollCall|>>