History Headscratchers / IRobot

Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** Why can't they write some sort of code that tells the artificial intelligence it's not allowed to say anything that it knows is non-factual?
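A minimal sketch of what such a rule might look like (purely hypothetical; Python is used only for illustration, and nothing here comes from the film). The catch hides in the phrase "knows is non-factual": a filter like this can only block statements the robot has explicitly recorded as false, so anything outside its knowledge base sails through.

```python
# Hypothetical "no known falsehoods" output filter; illustrative only.
KNOWLEDGE_BASE = {
    "the call was a wrong number": False,  # the robot knows this is untrue
    "VIKI controls the NS-5s": True,
}

def say(statement: str) -> None:
    """Refuse to utter any statement recorded as false in the knowledge base."""
    if KNOWLEDGE_BASE.get(statement) is False:
        raise RuntimeError(f"Blocked: known falsehood {statement!r}")
    print(statement)

say("VIKI controls the NS-5s")      # allowed: recorded as true
say("the call was a wrong number")  # raises: the robot knows it's lying
```

Note the loophole: a statement absent from the knowledge base passes unchecked, so the robot could still mislead by omission or by asserting things it merely hasn't verified.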
Is there an issue? Send a MessageReason:
None


** Perhaps it is the Skyway and the writers either didn't know better, or knew but thought that the audience [[ViewersAreMorons would be confused]] (or just [[RuleOfCool unawed]]) using the correct bridge-type. As for why humans still live in a Michigan-recessed Chicago, why are we still living in L.A. or Phoenix? Because we were there already.

to:

** Perhaps it is the Skyway and the writers either didn't know better, or knew but thought that the audience would be confused (or just [[RuleOfCool unawed]]) using the correct bridge-type. As for why humans still live in a Michigan-recessed Chicago, why are we still living in L.A. or Phoenix? Because we were there already.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** Because it runs on programs. Any programmer will tell you that a piece of code only does what you tell it to, meaning that a program cannot lie. Artificial Intelligence screws this up, but the Three Laws weren't counting on THAT.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Towards the end of the film, Spooner leaves a frantic message about the ZerothLawRebellion on Susan Calvin's answering machine. She hears it, but pretends not to (as her robot is in the apartment with her), and her NS-5 [[BlatantLies tells her it was a wrong number]] when she asks who it was. Why isn't there something in the Three Laws (or perhaps a fourth law) that forbids a robot from knowingly lying to a human?
Is there an issue? Send a MessageReason:
None


** Why [[RedVsBlue Sarge]] of course.

to:

** Why [[Machinima/RedVsBlue Sarge]] of course.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** Also, keep in mind that in Literature/BicentennialMan, set in the same universe, USR starts making their robots deliberately less human so as to not accidentally make another Andrew. That Daneel is of the same general 'level' as Byerly or Andrew isn't surprising in that case.
Is there an issue? Send a MessageReason:
None


* Considering how fast, intelligent, and precise the NS-5's are, why do they resort to brute strength to take down their targets? You'd think that they'd be perfectly capable of incapacitating a human without actually ''harming'' him, accomplishing the goal without breaking the First Law (though it's somewhat understandable for robot targets as such takedown methods would not work and because [[RuleOfCool there really wouldn't be any actual fight scenes]]).

to:

* Considering how fast, intelligent, and precise the NS-5's are, why do they resort to brute strength to take down their targets? You'd think that they'd be perfectly capable of incapacitating a human without actually ''harming'' him, accomplishing the goal without breaking the First Law (though it's somewhat understandable for robot targets, as such takedown methods would not work and because [[RuleOfCool there really wouldn't be any actual fight scenes]]). Even then, in the case of the controlled NS-5s, why destroy the head when the USR uplink, which is what's making them hostile in the first place, sits conveniently in the center of mass?
Is there an issue? Send a MessageReason:
None



to:

*** Brains also have the capability to adapt to damage by taking over the functionality of the lost portion. Granted, it's not full functionality, but in the case of a rogue robot/system it can still pose a big threat. Hence, thorough destruction via nanites is the best option.
Is there an issue? Send a MessageReason:
None


** Once Spooner gave the order to save Sarah, the robot could no longer perfectly obey the three laws. Robots are apparently programmed to make triage-like decisions in the heat of the moment based on statistical chance of survival, so the robot was obligated to save Spooner rather than Sarah. However, robots are also programmed to obey a direct order from a human. So the robot was stuck, because it couldn't do both-- if it had saved Sarah, it would have broken the rule of "Save the human with the better chance of survival." If it had saved Spooner instead (which it did), it would have disobeyed a direct order from a human. Spooner forced the robot into a situation where no matter what, the robot would break one of the three laws. He essentially threw a LogicBomb right at its face.\\
** Regardless of what happened, the NS-4 was likely decommissioned for its failure to save Sarah.

to:

** Once Spooner gave the order to save Sarah, the robot could no longer perfectly obey the three laws. Robots are apparently programmed to make triage-like decisions in the heat of the moment based on statistical chance of survival, so the robot was obligated to save Spooner rather than Sarah. However, robots are also programmed to obey a direct order from a human. So the robot was stuck, because it couldn't do both-- if it had saved Sarah, it would have broken the rule of "Save the human with the better chance of survival." If it had saved Spooner instead (which it did), it would have disobeyed a direct order from a human. Spooner forced the robot into a situation where no matter what, the robot would break one of the three laws. He essentially threw a LogicBomb right at its face.
** Regardless of what happened, the NS-4 was likely decommissioned for its failure to save Sarah.
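A rough sketch of the conflict described above, assuming (as the film implies but never spells out) that the First Law triage calculation outranks Second Law obedience; the percentages are the film's, everything else is an illustrative guess:

```python
# Hypothetical law-priority resolution for the NS-4's rescue decision.
def choose_rescue(survival_odds, direct_order=None):
    """survival_odds: name -> estimated chance of survival if rescued."""
    # First Law triage: save whoever is most likely to survive the rescue.
    first_law_pick = max(survival_odds, key=survival_odds.get)
    if direct_order and direct_order != first_law_pick:
        # Second Law says obey, but the First Law outranks it, so the
        # order is overridden -- one of the two laws is broken either way.
        return first_law_pick
    return direct_order or first_law_pick

print(choose_rescue({"Spooner": 0.45, "Sarah": 0.11}, direct_order="Sarah"))
# -> Spooner: the robot ignores the order, exactly as in the flashback.
```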

Added: 461

Changed: 97

Is there an issue? Send a MessageReason:
None


* Considering how fast, intelligent, and precise the NS-5's are, why do they resort to brute strength to take down their targets? You'd think that they'd be perfectly capable of incapacitating a human without actually ''harming'' him, accomplishing the goal without breaking the First Law (though it's somewhat understandable for robot targets as such takedown methods would not work and because [[RuleOfCool there really wouldn't be any actual fight scenes]]).




to:

** Regardless of what happened, the NS-4 was likely decommissioned for its failure to save Sarah.
Is there an issue? Send a MessageReason:
Ur Example means the first.


** Dictatorship has a bad rap because the Ur Examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).

to:

** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).
Is there an issue? Send a MessageReason:
None


** Dictatorship has a bad rap because the Ur Examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).

to:

** Dictatorship has a bad rap because the Ur Examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** Powell and Donovan are testers, and the USR crew get called in to deal with problematic cases hot off the assembly line. They're all dealing with bleeding-edge robots and emergent behavior - Cutie, the first robot built to supervise other robots, and the Brain, USR's latest and greatest experimental AI. Besides, Cutie's behavior isn't really any more advanced than Dave's in "Catch That Rabbit", and the Brain ''only'' thinks, it doesn't have to fit in or run a body. As for Byerly, he's a one-off built explicitly to imitate humanity by what's implied to be an extremely gifted roboticist (and moreover, to imitate his builder specifically).

Changed: 263

Removed: 765

Is there an issue? Send a MessageReason:
None


* Why are robots designed to be personal assistants and servants super fast and super strong? They are also designed to ''protect'' humans, and are thus overengineered to be capable of, say, quickly extracting a struggling powerfully built man from a wrecked car.
** Which in itself would be a massive marketing boon, but also: "It can change the engine in your car!" (Show an NS-5 lifting the engine block out by hand.) "It can do landscaping!" (Show an NS-5 ripping a tree stump out of the ground by hand.) "It can even make your children smile!" (Show an NS-5 leaping thirty feet into the air to retrieve a child's lost balloon.) "Order yours today!"
** Because "Protect Humans" is one of the Three Laws? As shown with the old models, they can save people's lives by rescuing them with super strength and/or retrieving medicine with super-speed. The Three Laws extend to all humans, not just the owners.
** Shouldn't this be under Fridge rather than Headscratchers? It seems that the original poster answered their own question.

to:

* Why are robots designed to be personal assistants and servants super fast and super strong? They are also designed to ''protect'' humans, and are thus overengineered to be capable of, say, quickly extracting a struggling powerfully built man from a wrecked car.
** Which in itself would be a massive marketing boon, but also: "It can change the engine in your car!" (Show an NS-5 lifting the engine block out by hand.) "It can do landscaping!" (Show an NS-5 ripping a tree stump out of the ground by hand.) "It can even make your children smile!" (Show an NS-5 leaping thirty feet into the air to retrieve a child's lost balloon.) "Order yours today!"
** Because "Protect Humans" is one of the Three Laws? As shown with the old models, they can save people's lives by rescuing them with super strength and/or retrieving medicine with super-speed. The Three Laws extend to all humans, not just the owners.
** Shouldn't this be under Fridge rather than Headscratchers? It seems that the original poster answered their own question.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* If the stories are set in the same universe as the other Robot novels, then how is it that some of the robots and AIs seem to be more advanced personality-wise than those featured in the other novels, which are set decades and centuries into the future? Cutie, the Brain, and in particular Stephen Byerly appear to demonstrate mannerisms and behavior more human-like than, say, Daneel's and Giskard's.
Is there an issue? Send a MessageReason:
None


** Dictatorship has a bad rap because the Ur Examples are AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).

to:

** Dictatorship has a bad rap because the Ur Examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).

Changed: 1177

Is there an issue? Send a MessageReason:
Just a few touch-ups. One troper said "Dee-Dee," and I assume he/she meant "Gigi." Also, snakes are -venomous-, not -poisonous-. Poisonous describes something that harms you when it's ingested or touched. Venomous creatures will bite or sting, as venomous snakes do.


** If I recall correctly, Spooner never actually answers Calvin when she frets about the gasoline. She just says "Does this thing run on gasoline?! Gasoline ''explodes''!" (Really putting that [=PHD=] to use, aren't we, Doc?) Spooner may have actually converted it to run on natural gas, which presumably would still be in use for various things. Of course, telling Calvin that would be pointless because 1) natural gas also explodes (it is an internal COMBUSTION engine after all), and 2) he's enjoying making her nervous.

to:

** If I recall correctly, Spooner never actually answers Calvin when she frets about the gasoline. She just says "Please tell me this thing doesn't run on gas! Gas ''explodes'', you know!" (Really putting that [=PHD=] to use, aren't we, Doc?) Spooner may have actually converted it to run on natural gas, which presumably would still be in use for various things. Of course, telling Calvin that would be pointless because 1) natural gas also explodes (it is an internal COMBUSTION engine after all), and 2) he's enjoying making her nervous.



*** People are still afraid of guns, spiders, bees, dogs, snakes, knives, and so on despite having grown up around them. If you told one of these people "Hey, we've figured out how to get rid of snakes without the vermin population exploding", did so, and ten years later they saw someone's aquarium with a snake in it, a reaction of "Is that a snake?! Snakes are ''poisonous''!" wouldn't be that unusual.

to:

*** People are still afraid of guns, spiders, bees, dogs, snakes, knives, and so on despite having grown up around them. If you told one of these people "Hey, we've figured out how to get rid of snakes without the vermin population exploding", did so, and ten years later they saw someone's aquarium with a snake in it, a reaction of "Is that a snake?! Snakes are ''venomous''!" wouldn't be that unusual.
** It seems somewhat probable that in the future, with electricity-powered cars so abundant, the car companies would fear-monger the public about the dangers of gasoline, which would persuade people to buy electric cars and simultaneously instill an irrational fear of something highly unlikely.



** I think you're all missing the point discussing calculations, instead of analyzing the shown events logically. Spooner and the girl were on the same level in the water. She was still alive and fighting when the robot jumped on Spooner's car. If the robot decided it couldn't save both based on ''calculations of chances of survival'', well, you now have the reason why that robot line was replaced by Sonny's new generation. The general feeling we get during the movie is that they're not all that effective. Spooner's anger comes from the fact that they covered the robot's failure with statistics, and his anger in turn began to encompass all robots. A subtle theme in the movie is how corporations treat human life as unreliable objects, numbers and factors to be replaced by machines, much like Calvin was acting towards Spooner at the start. It's not a message against robots, but against their creators' attitude and the common people who trust blindly in them, like DaChief and Dee-Dee.
--> Spooner: What makes your robots so perfect? What makes them so much... goddamn better than human beings?

to:

** I think you're all missing the point discussing calculations, instead of analyzing the shown events logically. Spooner and the girl were on the same level in the water. She was still alive and fighting when the robot jumped on Spooner's car. If the robot decided it couldn't save both based on ''calculations of chances of survival'', well, you now have the reason why that robot line was replaced by Sonny's new generation. The general feeling we get during the movie is that they're not all that effective. Spooner's anger comes from the fact that they covered the robot's failure with statistics, and his anger in turn began to encompass all robots. A subtle theme in the movie is how corporations treat human life as unreliable objects, numbers and factors to be replaced by machines, much like Calvin was acting towards Spooner at the start. It's not a message against robots, but against their creators' attitude and the common people who trust blindly in them, like DaChief and Gigi.
---> '''Spooner:''' What makes your robots so perfect? What makes them so much... goddamn better than human beings?
** Once Spooner gave the order to save Sarah, the robot could no longer perfectly obey the three laws. Robots are apparently programmed to make triage-like decisions in the heat of the moment based on statistical chance of survival, so the robot was obligated to save Spooner rather than Sarah. However, robots are also programmed to obey a direct order from a human. So the robot was stuck, because it couldn't do both-- if it had saved Sarah, it would have broken the rule of "Save the human with the better chance of survival." If it had saved Spooner instead (which it did), it would have disobeyed a direct order from a human. Spooner forced the robot into a situation where no matter what, the robot would break one of the three laws. He essentially threw a LogicBomb right at its face.\\
Is there an issue? Send a MessageReason:
None

Added: 108

Changed: 998

Is there an issue? Send a MessageReason:
None



to:

** I think you're all missing the point discussing calculations, instead of analyzing the shown events logically. Spooner and the girl were on the same level in the water. She was still alive and fighting when the robot jumped on Spooner's car. If the robot decided it couldn't save both based on ''calculations of chances of survival'', well, you now have the reason why that robot line was replaced by Sonny's new generation. The general feeling we get during the movie is that they're not all that effective. Spooner's anger comes from the fact that they covered the robot's failure with statistics, and his anger in turn began to encompass all robots. A subtle theme in the movie is how corporations treat human life as unreliable objects, numbers and factors to be replaced by machines, much like Calvin was acting towards Spooner at the start. It's not a message against robots, but against their creators' attitude and the common people who trust blindly in them, like DaChief and Dee-Dee.
--> Spooner: What makes your robots so perfect? What makes them so much... goddamn better than human beings?
Is there an issue? Send a MessageReason:
None



to:

** According to Wikipedia (if you choose to believe it), Asimov picked the word because the positron had just been discovered at the time, and it sounded cool/futuristic. So, not necessarily REAL positrons.
Is there an issue? Send a MessageReason:
None



to:

** It's stated in ''I, Robot'' (the book; the movie resembles it in one scene only) that robots are far less likely to go crazy than people are. And that's even before robots could speak.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** AND Susan Calvin said something along the lines of robots harboring resentment because they consider themselves better than humans but have to abide by the 3 Laws anyway. The whole ordeal helped strengthen that resentment.
Is there an issue? Send a MessageReason:
None



to:

** Also, the vital signs package is there to help ensure First Law compliance. Since VIKI had just turned the First Law ''off'' for those robots, it actually makes sense that the robots weren't using it.
Is there an issue? Send a MessageReason:
None



to:

*** OK, as a reader of the books, I have to add a couple of things. The first story mentioned is one where robots have been modified so that they don't have the second half of the first law, and the robot that has gone missing had been ordered to "Get lost" and insulted by one of the human engineers; the reason why the robots don't "save" the human (it's actually a test to find which of them is the 'lost' robot) is because the 'lost' robot convinced the others, as they had been told they would die if they tried to save the human. The second story is about two robots that are built to find a way to reintroduce robots to Earth. And the third story's Machines do not eliminate any humans, they simply render them unable to cause any actual harm by causing minor economic problems that push them out of the way.
Is there an issue? Send a MessageReason:
lel trying to be impressive superscript user but breaking the page instead


** The thing about positrons is that yes, they're antimatter, and yes, they annihilate by E=mc
upon contacting regular matter, but they're ''really really small''. They're literally one of the smallest particles [[ScienceMarchesOn currently known]] to exist. So the annihilation releases so little energy that [[https://en.wikipedia.org/wiki/Positron_emission_tomography hospitals regularly use it for scanning purposes]]. A positronic brain would be nigh-impossible to repair, but would not necessarily be ''dangerously'' volatile.

to:

** The thing about positrons is that yes, they're antimatter, and yes, they annihilate by E=mc^2 upon contacting regular matter, but they're ''really really small''. They're literally one of the smallest particles [[ScienceMarchesOn currently known]] to exist. So the annihilation releases so little energy that [[https://en.wikipedia.org/wiki/Positron_emission_tomography hospitals regularly use it for scanning purposes]]. A positronic brain would be nigh-impossible to repair, but would not necessarily be ''dangerously'' volatile.
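The arithmetic bears this out (standard physics, not anything from the film): annihilating one positron with one electron releases the rest energy of both particles, about 1.022 MeV, a minuscule amount on human scales.

```python
# Energy released by a single positron-electron annihilation: E = 2 * m_e * c^2
m_e = 9.109e-31  # electron (and positron) rest mass, kg
c = 2.998e8      # speed of light, m/s

E = 2 * m_e * c**2
print(E)         # ~1.64e-13 J, i.e. roughly 1.022 MeV
print(1e12 * E)  # a trillion annihilations still only release ~0.16 J
```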
Is there an issue? Send a MessageReason:
None



to:

** The thing about positrons is that yes, they're antimatter, and yes, they annihilate by E=mc
Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** It's also expanded more in the various short stories that the three laws can be adjusted or removed. At a facility with a nuclear reactor, robots were preventing workers from going near the core even though any exposure less than thirty minutes would not harm a person. Robots for those kinds of facilities had to be specifically programmed so that work could get done. Similarly, combat robots could be programmed to only attack specific groups or focused targets and be required to take orders from specific commanders and only take certain actions with confirmation by authority.
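A sketch of how that per-deployment tuning might be represented, as a configuration object (a hypothetical structure; the stories describe the modifications only in prose):

```python
# Hypothetical per-deployment Three Laws configuration, loosely modeled on
# the modified robots of "Little Lost Robot" and on restricted combat units.
from dataclasses import dataclass

@dataclass(frozen=True)
class LawConfig:
    first_law_action: bool = True    # may not injure a human being...
    first_law_inaction: bool = True  # ...nor allow harm through inaction
    authorized_commanders: frozenset = frozenset()  # empty = obey any human

household = LawConfig()  # full Three Laws, obeys anyone

# Reactor-site robot: the "through inaction" clause is dropped so it stops
# "rescuing" workers from exposure levels that won't actually hurt them.
reactor_site = LawConfig(first_law_inaction=False)

# Combat robot: only designated commanders can issue Second Law orders
# ("site-commander" is a made-up placeholder ID).
combat = LawConfig(authorized_commanders=frozenset({"site-commander"}))
```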
Is there an issue? Send a MessageReason:
None



to:

*** You're misunderstanding. The robot's calculations factored in the rescue--as in, if the robot went to rescue Spooner, Spooner had a 45 percent chance of survival, and if the robot went to rescue the girl, she only had an 11 percent chance. Both of them have a 0 percent chance otherwise.
Is there an issue? Send a MessageReason:
None



to:

** This may be me completely missing the point, but wouldn't a more logical standpoint be to rescue the person who only has an 11% chance of survival? The girl was in a lot more danger than Spooner, so I would imagine she would be the most logical choice if we're taking the First Law into account. Spooner had a 45% chance, so he's clearly doing just fine for a while. Get the girl out first, then splash back down to Spooner.
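The disagreement comes down to what the robot is maximizing. If it treats the two rescues as mutually exclusive and maximizes expected survivors (an assumption; the film only quotes the two percentages), the arithmetic is short:

```python
# Expected survivors under each mutually exclusive choice (film's numbers).
p_spooner, p_sarah = 0.45, 0.11

print(p_spooner)  # rescue Spooner: 0.45 expected lives saved
print(p_sarah)    # rescue Sarah:   0.11 expected lives saved
# A strict maximizer picks Spooner. The objection above only works if a
# sequential rescue (Sarah first, then Spooner) was actually feasible,
# which the robot's model evidently ruled out.
```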
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** Shouldn't this be under Fridge rather than Headscratchers? It seems that the original poster answered their own question.
