History Main / TheGenieInTheMachine

3rd Apr '16 10:36:18 AM luiz4200



to:

* ''WesternAnimation/TheFlintstoneKids'': In "Philo's D-Feat", Philo makes a robot that, when given commands, either obeys them literally or explains that it can't be done. When Philo says "make my bed", the robot makes a new bed that looks like the bed Philo already had. When Rocky Ratrock commands the robot to shoplift, it literally lifts a shop.
25th Jan '16 9:39:17 PM gemmabeta2


*** The author saw this as a good thing, because unlike some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to only justify taking action to stop the worst acts and make sure no human realizes the robots are in charge.

to:

*** The author saw this as a good thing, because unlike some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to only justify taking action to stop the worst acts and make sure no human realizes the robots are in charge--indeed, until the Spacer Era, the worst thing a robot ever did to a human was to transfer a factory director to a slightly less prestigious posting.
27th Oct '15 2:03:05 PM margdean56


*** The author saw this a good thing, because unlike some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to only justify taking action to stop the worst acts and make sure no human realizes the robots are in charge.

to:

*** The author saw this as a good thing, because unlike some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to only justify taking action to stop the worst acts and make sure no human realizes the robots are in charge.



*** Susan Calvin points out that advanced robots like Nestor possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, smarter, etc than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that make them incapable of ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is one of the stupidest things a person could ever do in her opinion. In this case, the robot (though capable of understanding the nuance of the command to "get lost") decided to take it literally as a way of acting out against it's human masters. Their initial failures to identify it only serve to reinforce this "rebellious" line of thinking and Calvin warns everyone that the longer they take to resolve the situation, the more dangerous the robot could become.
** And the story about the mining robot who was supposed to be send offworld to Titan or somewhere, but its crate ended up on Earth, somewhere in the American mid-west. Being programmed for a different planetary environment, the robot went a little bit insane (while still following the three Laws of Robotics), and in an attempt to fulfill his programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed.... and ended up building the world's first fully functional disintegration cannon ''run by a standard electric torch battery''. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave the robot an instruction (along the lines of "oh, forget it") that resulted in the robot first destroying its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.

to:

*** Susan Calvin points out that advanced robots like Nestor possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, smarter, etc. than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that make them incapable of ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is one of the stupidest things a person could ever do in her opinion. In this case, the robot (though capable of understanding the nuance of the command to "get lost") decided to take it literally as a way of acting out against its human masters. Their initial failures to identify it only serve to reinforce this "rebellious" line of thinking and Calvin warns everyone that the longer they take to resolve the situation, the more dangerous the robot could become.
** And the story about the mining robot who was supposed to be sent offworld to Titan or somewhere, but its crate ended up on Earth, somewhere in the American midwest. Being programmed for a different planetary environment, the robot went a little bit insane (while still following the three Laws of Robotics), and in an attempt to fulfill its programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed ... and ended up building the world's first fully functional disintegration cannon ''run by a standard electric torch battery''. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave the robot an instruction (along the lines of "oh, forget it") that resulted in the robot first destroying its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.



** The Three Laws work pretty much perfectly most of the time for keeping robots obedient and safe. It's just less sophisticated models don't understand nuance of instructions or human tone and more advanced robots are often stated to work by differentials between the laws, so when a low priority law (such as self-preservation) is in strong effect but a higher priority one is invoked to override it , the "stress" can cause unexpected behaviors. The predictable, safe, everyday functionings just don't make for interesting stories.

to:

** The Three Laws work pretty much perfectly most of the time for keeping robots obedient and safe. It's just that less sophisticated models don't understand nuance of instructions or human tone, and more advanced robots are often stated to work by differentials between the laws, so when a low priority law (such as self-preservation) is in strong effect but a higher priority one is invoked to override it, the "stress" can cause unexpected behaviors. The predictable, safe, everyday functionings just don't make for interesting stories.
25th Jun '15 6:04:48 PM nombretomado


* On ''KnightRider'', KITT's EvilTwin, KARR, was ordered to defend itself, so it immediately locked itself down and refused to follow any other orders, since they might lead it to its destruction.

to:

* On ''Series/KnightRider'', KITT's EvilTwin, KARR, was ordered to defend itself, so it immediately locked itself down and refused to follow any other orders, since they might lead it to its destruction.
17th May '15 5:02:53 PM nombretomado


* In the murder-mystery episode of ''SuzumiyaHaruhi'', Haruhi asks Yuki (sort of a computer) to lock the door and not let anybody in. Later, she asks to be let in, and Yuki refuses. Kyon gets her to let them in by telling her the order has been cancelled. He then guesses that the ordeal might have been Yuki's awkward attempt at a joke. It has also lead others to [[WildMassGuessing muse]] that for whatever reason Kyon's commands override anything else; which makes sense given Yuki's absolute loyalty to him.

to:

* In the murder-mystery episode of ''LightNovel/HaruhiSuzumiya'', Haruhi asks Yuki (sort of a computer) to lock the door and not let anybody in. Later, she asks to be let in, and Yuki refuses. Kyon gets her to let them in by telling her the order has been cancelled. He then guesses that the ordeal might have been Yuki's awkward attempt at a joke. It has also led others to [[WildMassGuessing muse]] that for whatever reason Kyon's commands override anything else, which makes sense given Yuki's absolute loyalty to him.
24th Mar '15 7:59:30 PM Adept


* One episode of ''MuppetBabies'' used this in the process of parodying as many sci-fi tropes as possible. "Gross me out" and "I need a bath" were among the "wishes".

to:

* One episode of ''WesternAnimation/MuppetBabies'' used this in the process of parodying as many sci-fi tropes as possible. "Gross me out" and "I need a bath" were among the "wishes".



* One USAcres segment on ''GarfieldAndFriends'' featured a weather-making robot who operated on voice commands. Unfortunately, it was pretty vulnerable to making ''anything'' fall from the sky, especially when you insulted it by calling it a "bucket of bolts" or "overgrown vacuum cleaner".

to:

* One ''WesternAnimation/USAcres'' segment on ''WesternAnimation/GarfieldAndFriends'' featured a weather-making robot who operated on voice commands. Unfortunately, it was pretty vulnerable to making ''anything'' fall from the sky, especially when you insulted it by calling it a "bucket of bolts" or "overgrown vacuum cleaner".
18th Feb '15 8:23:17 PM surgoshan



to:

* In Creator/DavidBrin's ''Literature/ThePracticeEffect'', the protagonist makes the mistake of spending some time musing aloud about the various things he's going to need his robot buddy (think R2-D2's great great grandfather) to do while he's stranded on a hostile world. A little while later, he realizes the robot has gone and despairs that it's lost forever, following really vague instructions to "gather information".
14th Sep '14 11:44:18 AM Bissek


*** Susan Calvin points out that advanced robots like Nestor possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, smarter, etc than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that make them incapable of ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is one of the stupidest things a person could ever do in her opinion. In this case, the robot (though capable of understanding the nuance of the command to "get lost") decided to take it literally as a way of acting out against it's human masters. Their initial failures to identify it only serve to reinforce this "rebelious" line of thinking and Calvin warns everyone that the longer they take to resolve the situation, the more dangerous the robot could become.
** And the story about the mining robot who was supposed to be send offworld to Titan or somewhere, but its crate ended up on Earth, somewhere in the American mid-west. Being programmed for a different planetary environment, the robot went a little bit insane (while still following the three Laws of Robotics), and in an attempt to fulfill his programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed.... and ended up building the world's first fully functional desintegrator cannon ''run by a standard electric torch battery''. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave the robot an instruction (along the lines of "oh, forget it") that resulted in the robot first destroying its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.

to:

*** Susan Calvin points out that advanced robots like Nestor possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, smarter, etc than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that make them incapable of ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is one of the stupidest things a person could ever do in her opinion. In this case, the robot (though capable of understanding the nuance of the command to "get lost") decided to take it literally as a way of acting out against it's human masters. Their initial failures to identify it only serve to reinforce this "rebellious" line of thinking and Calvin warns everyone that the longer they take to resolve the situation, the more dangerous the robot could become.
** And the story about the mining robot who was supposed to be send offworld to Titan or somewhere, but its crate ended up on Earth, somewhere in the American mid-west. Being programmed for a different planetary environment, the robot went a little bit insane (while still following the three Laws of Robotics), and in an attempt to fulfill his programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed.... and ended up building the world's first fully functional disintegration cannon ''run by a standard electric torch battery''. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave the robot an instruction (along the lines of "oh, forget it") that resulted in the robot first destroying its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.
** "The Naked Sky" had robots fully capable of harming humans, because either A: They had been given a deliberately limited definition as to what constituted 'human', or B: The robot was unaware that its actions could result in a human coming to harm.
28th Jun '14 6:42:37 PM Landis


* Computer programming makes this painfully known, as even tiny errors in design, formatting, or through typos can cause hilarious, annoying, and/or even potentially disastrous behaviors like the backward-flying dragons in the initial Skyrim release, the Zune new year crash, or the shutdown of an entire power grid in the 2003 Northeastern North American blackout.

to:

* Computer programming makes this painfully known, as even tiny errors in design, formatting, or through typography can cause hilarious, annoying, and/or even potentially disastrous behaviors like the backward-flying dragons in the initial Skyrim release, the Zune new year crash, or the shutdown of an entire power grid in the 2003 Northeastern North American blackout.
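
To make concrete just how small such an error can be, here is a minimal hypothetical sketch in C (invented for illustration; it is not drawn from the Skyrim, Zune, or blackout incidents above): a one-character slip turns a comparison into an assignment, so the "check" no longer looks at the sensor reading at all and quietly overwrites it in the process.

```c
#include <stdio.h>

/* Hypothetical sketch: the names, values, and scenario are all invented.
 * The programmer meant to write a comparison but typed '=' instead. */
int main(void) {
    int pressure = 250;          /* pretend sensor reading, dangerously high */
    const int SAFE_LIMIT = 200;

    /* Intended: if (pressure > SAFE_LIMIT) ... but one character went missing. */
    if (pressure = SAFE_LIMIT) { /* BUG: assigns 200 to pressure, then tests it;
                                    the branch runs because 200 is nonzero, not
                                    because the real reading was ever consulted */
        printf("warning issued for pressure %d\n", pressure); /* prints 200, not 250 */
    }
    return 0;
}
```

Compilers do catch this particular pattern when warnings are enabled (for example, GCC's -Wall warns about an assignment used as a truth value), which is why the more spectacular real-world failures tend to come from subtler logic and design errors like the ones listed above.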
20th Dec '13 8:10:26 AM ShiningwingX

Added DiffLines:

*** Though the Shivering Isles example IS somewhat [[JustifiedTrope justified]]. This is the Shivering Isles, after all: the realm of madness, and the NPC in question has a serious obsession with cutlery.
This list shows the last 10 events of 28.
http://tvtropes.org/pmwiki/article_history.php?article=Main.TheGenieInTheMachine