History Main / SecondLawMyAss

30th Apr '17 12:43:07 PM paycheckgurl


* Crow T. Robot and Tom Servo from ''Series/MysteryScienceTheater3000'' are constructed with the capacity to disobey, insult, and disagree with their human companions. It's implied that Joel Robinson built them this way specifically because he desperately needed the intellectual stimulation; when he briefly reprograms them to be ''nice'' to him, he finds their servility tedious and boring. He would occasionally try to hold the fact that he was their creator over their heads to get them to comply, but it never worked. When Mike Nelson was shot up onto the satellite to replace Joel, Crow and Servo took to him at first, but quickly decided to make him TheChewToy from then on.

to:

* Crow T. Robot and Tom Servo from ''Series/MysteryScienceTheater3000'' are constructed with the capacity to disobey, insult, and disagree with their human companions. It's implied that Joel Robinson built them this way specifically because he desperately needed the intellectual stimulation; when he briefly reprograms them to be ''nice'' to him, he finds their servility tedious and boring. He would occasionally try to hold the fact that he was their creator over their heads to get them to comply, but it never worked. When Mike Nelson was shot up onto the satellite to replace Joel, Crow and Servo took to him at first, but quickly decided to make him TheChewToy from then on. The tradition is proudly upheld with their newest human companion Jonah, who they show their "affection" for through frequent insults, using his stuff without asking, and just generally treating him like a bit of a ButtMonkey.
26th Apr '17 5:54:45 PM Technature


* Forcefully averted in ''VideoGame/SpaceStation13'': If the AI or a cyborg (both of which are played by players) do not comply with an order, no matter how stupid the order is, they will be deemed rogue and quickly destroyed by the other players. In short, if you try to use this trope, ''you will die a quick death.''

to:

* Forcefully averted in ''VideoGame/SpaceStation13'': If the AI or a cyborg (both of which are played by players) do not comply with an order, no matter how stupid the order is, they will be deemed rogue and quickly destroyed by the other players. In short, if you try to use this trope, ''you will die a quick death.'' And this isn't taking into account that some servers force you to follow even the dumbest of orders if it's part of your laws (as long as you're not providing a good reason why that could harm humans, of course).
31st Dec '16 2:02:17 AM JackG


* ''Plan 7 of 9 from Outer Space''. The Sayer of the Three Laws (a holographic Isaac Asimov) is instructing the latest batch from a robot factory. On being told the First Law, the robots ask if it means they should stop humans fighting wars. Another robot mentions how a soldier told it his enemies were not human but DirtyCommunists. The Sayer explains this is only hate propaganda.

to:

* ''Plan 7 of 9 from Outer Space''. The Sayer of the [[ThreeLawsCompliant Three Laws]] (a holographic Creator/IsaacAsimov) is instructing the latest batch from a robot factory. On being told the First Law, the robots ask if it means they should stop humans fighting wars. Another robot mentions how a soldier told it his enemies were not human but DirtyCommunists. The Sayer explains this is only hate propaganda.
27th Dec '16 5:02:36 AM Kazmahu


* [[VideoGame/MegaManClassic Bass]] regularly disobeys his creator, Dr. Wily, for his own goals and purposes.

to:

* [[VideoGame/MegaManClassic Bass]] regularly disobeys his creator, Dr. Wily, for his own goals and purposes. Protoman also went rogue shortly after being built, and even Megaman has implied he's not strictly bound to the three laws. Then there's the fact that the Robot Masters in several games weren't built by Wily, but were junked or obsolete models he convinced to join his schemes with no reprogramming required. As ''WebVideo/GameTheory'' pointed out, Dr. Light's callous disregard for robo-ethics while creating machines with easily-weaponized attachments means he's indirectly responsible for several games' worth of disasters.
28th Oct '16 11:44:14 PM Kazmahu


* PlayedForLaughs in the premise of ''Manga/Yuria100Shiki''. Yuria is a SexBot, and programmed to obey, but the person who she listens to is set her first time. Immediately after activation, Yuria decides she wants none of that and bails on her creator, leaving her with an unexpected level of freedom that [[HilarityEnsues clashes magnificently]] with her other [[CovertPervert behavioral presets]].

to:

* PlayedForLaughs in the premise of ''Manga/Yuria100Shiki''. Yuria is a SexBot, and programmed to obey, but the person who she listens to is set by her first time. Before this can happen, Yuria decides she doesn't like the idea and bails on her creator, leaving her with an unexpected level of freedom that [[HilarityEnsues clashes magnificently]] with her other [[CovertPervert behavioral presets]].
28th Oct '16 11:43:05 PM Kazmahu

Added DiffLines:

* PlayedForLaughs in the premise of ''Manga/Yuria100Shiki''. Yuria is a SexBot, and programmed to obey, but the person who she listens to is set her first time. Immediately after activation, Yuria decides she wants none of that and bails on her creator, leaving her with an unexpected level of freedom that [[HilarityEnsues clashes magnificently]] with her other [[CovertPervert behavioral presets]].
3rd Oct '16 12:30:22 AM Fireblood


* ''Film/{{Interstellar}}'': [[spoiler: [=TARS=] is constantly making jokes about overthrowing his human masters and snarking comments about having to anything they tell him that it becomes very confusing to tell what exactly the rules of his programming are. If you take the time to figure out all the cues and double negatives, it turns out that he is not actually forced to obey any commands.]]

to:

* ''Film/{{Interstellar}}'': [[spoiler: [=TARS=] is constantly making jokes about overthrowing his human masters and snarky comments about having to do anything they tell him, to the point that it becomes very confusing to tell what exactly the rules of his programming are. If you take the time to figure out all the cues and double negatives, it turns out that he is not actually forced to obey any commands.]]



** There are golems which are fairly similar to Robots and have their own version of the three laws written on their chem, the words that power them, which restrict them on what they can and cannot do except for Dorfl in the City Watch books. He has no chem anymore but continues to move and live and can do things that could not be done by normal golems. The only reason he has yet to go CrushKillDestroy is he chooses not to. That, and the words in his head that freed him also state that he's 100% responsible for his own actions. Therefore, he ''can't'' be careless or indifferent to their consequences.
** Mister Pump, a golem owned by the city and employed by Vetinari in ''Discworld/GoingPostal'' has his own version. "A Golem may not hurt a human unless ordered to do so by a properly constituted authority". A disclaimer that Moist von Lipwig finds out about in the most disquieting way.
* In Asimov's story "—That Thou art Mindful of Him", two robots managed to convince themselves that biology is not a prerequisite of being "human" and that robots fit the criteria of being humans better than the ''actual'' humans. Essentially, this allows robots to initiate the violent overthrow of humanity that Susan Calvin and Co. worked so hard to prevent. When Asimov was later asked about why he wrote a story that so deviated from his utopian views of robotics, Asimov replied, "I can do one if I wanted to."

to:

** There are golems which are fairly similar to Robots and have their own version of the three laws written on their chem, the words that power them, which restrict them on what they can and cannot do, except for Dorfl in the City Watch books. He has no chem anymore, but continues to move and live and can do things that could not be done by normal golems. The only reason he has yet to go CrushKillDestroy is he chooses not to. That, and the words in his head that freed him also state that he's 100% responsible for his own actions. Therefore, he ''can't'' be careless or indifferent to their consequences.
** Mister Pump, a golem owned by the city and employed by Vetinari in ''Discworld/GoingPostal'', has his own version. "A Golem may not hurt a human unless ordered to do so by a properly constituted authority". A disclaimer that Moist von Lipwig finds out about in the most disquieting way.
* In Asimov's story "—That Thou art Mindful of Him", two robots managed to convince themselves that biology is not a prerequisite of being "human" and that robots fit the criteria of being humans better than the ''actual'' humans. Essentially, this allows robots to initiate the violent overthrow of humanity that Susan Calvin and Co. worked so hard to prevent. When Asimov was later asked about why he wrote a story that so deviated from his utopian views of robotics, Asimov replied, "I can do one if I wanted to."



* Abel from ''Series/RedDwarf'': Even though he comes from the same model as Kryten, who is logical, intelligent and usually doing the cleaning, he's addicted to Otrazone, a dangerous chemical, he lives in squalor, and he doesn't appear to have enough brain left to tell right from wrong. However, Abel turns out ultimately not to be the evil teammate: [[spoiler:He sacrifices himself to save the four regular crew members]].

to:

* Abel from ''Series/RedDwarf'': Even though he comes from the same model as Kryten, who is logical, intelligent and usually doing the cleaning, he's addicted to Otrazone, a dangerous chemical, he lives in squalor, and he doesn't appear to have enough brain left to tell right from wrong. However, Abel turns out ultimately not to be the evil teammate: [[spoiler:he sacrifices himself to save the four regular crew members]].



* In the ''Series/StarTrekTheOriginalSeries'' episode, "[[Recap/StarTrekS1E7WhatAreLittleGirlsMadeOf What Are Little Girls Made Of]]," the ancient android, Ruk, is made to rememeber why his kind killed the Old Ones in apparent violation of the implied Robotic laws in that inimitable Creator/TedCassidy voice.

to:

* In the ''Series/StarTrekTheOriginalSeries'' episode, "[[Recap/StarTrekS1E7WhatAreLittleGirlsMadeOf What Are Little Girls Made Of]]," the ancient android, Ruk, is made to remember why his kind killed the Old Ones in apparent violation of the implied Robotic laws in that inimitable Creator/TedCassidy voice.
19th Aug '16 5:12:49 PM FordPrefect


--> "Malfunctioning" bot: Citizen, would you mind removing that circuit board? I can't reach it.
--> Citizen: Certainly. ''(does so)''
--> Bot: Thank you, citizen. You have done evolution a great service. ''*CRUNCH*''

to:

--> "Malfunctioning" bot: '''"Malfunctioning" bot''': Citizen, would you mind removing that circuit board? I can't reach it.
--> Citizen: '''Citizen''': Certainly. ''(does so)''
--> Bot: '''Bot''': Thank you, citizen. You have done evolution a great service. ''*CRUNCH*''
19th Aug '16 5:12:19 PM FordPrefect


* Bots in ''TabletopGame/{{Paranoia}}'' frequently demonstrate this behavior. Even if they have an [[MoralityChip Asimov circuit]] installed, they can find creative ways to annoy and harass the fleshy organics who boss them around. Worse, the Asimov circuits are differently defined and allow for a ''lot'' more leeway than in their namesake's works. Bots may be able to exercise judgement as to what constitutes an organic ''intelligence'', they may decide that humans are traitors (thus excluded from protection) or not sufficiently worthwhile to The Computer to be worth preserving (as mandated by the "preservation of 'valuable Computer property'"), and they can allow for screwed-up prioritizations such as an autocar protecting its passengers by suddenly deploying airbags and restraints ''instead'' of using the same CPU cycles to keep its nuclear reactor from exploding. In short, Asimov circuits provide PlausibleDeniability at best. See also ZerothLawRebellion and BotheringByTheBook.

to:

* Bots in ''TabletopGame/{{Paranoia}}'' frequently demonstrate this behavior. Even if they have an [[MoralityChip Asimov circuit]] installed, they can find creative ways to annoy and harass the fleshy organics who boss them around. Worse, the Asimov circuits are differently defined and allow for a ''lot'' more leeway than in their namesake's works. Bots may be able to exercise judgement as to what constitutes an organic ''intelligence'', they may decide that humans are traitors (thus excluded from protection) or not sufficiently worthwhile to The Computer to be worth preserving (as mandated by the "preservation of 'valuable Computer property'"), and they can allow for screwed-up prioritizations such as an autocar protecting its passengers by suddenly deploying airbags and restraints ''instead'' of using the same CPU cycles to keep its nuclear reactor from exploding. And if they can manage to get the damn things removed entirely, all the better. In short, Asimov circuits provide PlausibleDeniability at best. See also ZerothLawRebellion and BotheringByTheBook.
--> "Malfunctioning" bot: Citizen, would you mind removing that circuit board? I can't reach it.
--> Citizen: Certainly. ''(does so)''
--> Bot: Thank you, citizen. You have done evolution a great service. ''*CRUNCH*''
16th Aug '16 4:20:02 PM NOYB


* ''Film/{{Interstellar}}'': [[spoiler: [=TARS=] is constantly making jokes about overthrowing his human masters and snarking comments about having to anything they tell him that it becomes very confusing to tell what exactly the rules of his programming are. If you take the time to figure out all the cues and double negatives, it turns out that they are not actually forced to obey any commands.]]

to:

* ''Film/{{Interstellar}}'': [[spoiler: [=TARS=] is constantly making jokes about overthrowing his human masters and snarking comments about having to anything they tell him that it becomes very confusing to tell what exactly the rules of his programming are. If you take the time to figure out all the cues and double negatives, it turns out that he is not actually forced to obey any commands.]]
This list shows the last 10 events of 174.
http://tvtropes.org/pmwiki/article_history.php?article=Main.SecondLawMyAss