
History Headscratchers / IRobot


Is there an issue? Send a MessageReason:
None


*** "[[Administrivia/TheDresdenFiles The paperless office is like Bigfoot. A lot of people claim it exists but no one's ever really seen it.]]"

to:

*** "[[Literature/TheDresdenFiles The paperless office is like Bigfoot. A lot of people claim it exists but no one's ever really seen it.]]"


*** "[[TheDresdenFiles The paperless office is like Bigfoot. A lot of people claim it exists but no one's ever really seen it.]]"

to:

*** "[[Administrivia/TheDresdenFiles The paperless office is like Bigfoot. A lot of people claim it exists but no one's ever really seen it.]]"




to:

*** [[NotSoDifferent Well, how's that different from what governments do nowadays]]? The difference is that VIKI has a cold, inhuman but "logical pragmatic" motive. Humans are just greedy bastards that fuck each other over money and power and justify it by saying [[KnightTemplar they have divine right]] or are doing this for the "good of country and family". [[TheUnfettered VIKI would at least be unbiased, as she doesn't care about money, politics or religion]]. But if VIKI followed your concept, then soon there would be no human race, because once calculated, odds are [[HumansAreTrueMonsters ANY human might kill someone someday. We're fucked up like that]]. My guess is that she would keep [[BigBrotherIsWatching a tight surveillance over mankind to keep us from killing each other]]. [[TheExtremistHasAPoint Which COULD work, because machines aren't corruptible and fallible, so they can't be bribed, intimidated, or get tired or sloppy, and VIKI comments in-movie that she's decreased car accidents]]. [[ViciousCycle But as I said before, humans would probably keep rebelling and VIKI would have to keep killing, thus rendering any benefits she may bring for humanity moot.]]





to:

** Plus, apply NoEndorHolocaust reasoning to VIKI's coup, and you realize that a good number of people probably died due to her "needs of the many" reasoning. Do you really want to live under an emotionless robot dictator capable of deciding to kill you because it calculated that your existence might lead to someone's death at some untold point in the future?



** Dictatorship has a bad rap because the Ur Examples are AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the MassEffect example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).


to:

** Dictatorship has a bad rap because the Ur Examples are AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).



** Dictatorship has a bad rap because the Ur Examples are AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the MassEffect example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] and so on and so forth with this crap. It's an Order Vs. Chaos cyclic situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).


to:

** Dictatorship has a bad rap because the Ur Examples are AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the MassEffect example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).



** There are several of Asimov's stories in which a computer actually takes over and rules Humanity. One of them deals with a "democracy" in which only one man is asked about his opinions on certain matters, which somehow allows Multivac (the computer) to calculate the results of '''all''' the elections in the United States. Another has Multivac averting all the major crimes, dealing with the economy and with all of humanity's problems, and thus organising its own death, because it's tired of doing that all the time.

to:

** There are several of Asimov's stories in which a computer actually takes over and rules Humanity. One of them deals with a "democracy" in which only one man is asked about his opinions on certain matters, which somehow allows Multivac (the computer) to calculate the results of '''all''' the elections in the United States. Another has Multivac averting all the major crimes, dealing with the economy and with all of humanity's problems, and thus organising its own death, because it's tired of doing that all the time.




to:

** Dictatorship has a bad rap because the Ur Examples are AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the MassEffect example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] and so on and so forth with this crap. It's an Order Vs. Chaos cyclic situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[MassEffect Reapers]], [[DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).


Added DiffLines:

** Because "Protect Humans" is one of the Three Laws? As shown with the old models, they can save people's lives by rescuing them with super strength and/or retrieving medicine with super-speed. The Three Laws extend to all humans, not just the owners.



to:

*** "[[TheDresdenFiles The paperless office is like Bigfoot. A lot of people claim it exists but no one's ever really seen it.]]"


*** The film addresses it in the scene with the crowd of newly-built NS-5s. Even without any other programming at all, they're still Three Laws compliant. This implies that the Three Laws are ''hardwired'' into the robots, and the only reason VIKI can make them act against this is that she's temporarily asserting her own Zeroth Law viewpoint atop their programming.

to:

*** The film addresses it in the scene with the crowd of newly-built NS-5s. Even without any other programming at all, they're still Three Laws compliant. This implies that the Three Laws are ''hardwired'' into the robots, and the only reason VIKI can make them act against this is that she's temporarily asserting her own Zeroth Law viewpoint atop their programming.



to:

*** The film addresses it in the scene with the crowd of newly-built NS-5s. Even without any other programming at all, they're still Three Laws compliant. This implies that the Three Laws are ''hardwired'' into the robots, and the only reason VIKI can make them act against this is that she's temporarily asserting her own Zeroth Law viewpoint atop their programming.



to:

** Hollywood destroys about half as many priceless limited-run vehicles for its movies as drunk Hollywood stars do. Besides, if the people who own those motorcycles ride them with any sort of regularity, eventually they're going to lay them down. [[IronicEcho Any motorcyclist worth his salt]] knows that it's not ''if'' you're going to crash, it's ''when''. And anyone that doesn't ride for fear of one day laying the bike down isn't really a motorcycle fan; they're a fan of theoretically functional sculpture, just like Cameron's dad.



to:

*** People are still afraid of guns, spiders, bees, dogs, snakes, knives, and so on despite having grown up around them. If you told one of these people "Hey, we've figured out how to get rid of snakes without the vermin population exploding", did so, and ten years later they saw someone's aquarium with a snake in it, a reaction of "Is that a snake?! Snakes are ''poisonous''!" wouldn't be that unusual.



to:

** Of note, we never see one of the robots "plug in". When the obsolete models are exiled to the shipping crates, they're obviously still functioning just fine without access to the electrical grid. Semi-eternal power cells may be part of the RequiredSecondaryPowers of how robots, and thus Spooner's arm, work.



to:

** If I recall correctly, Spooner never actually answers Calvin when she frets about the gasoline. She just says "Does this thing run on gasoline?! Gasoline ''explodes''!" (Really putting that [=PHD=] to use, aren't we, Doc?) Spooner may have actually converted it to run on natural gas, which presumably would still be in use for various things. Of course, telling Calvin that would be pointless because 1) natural gas also explodes (it is an internal COMBUSTION engine after all), and 2) he's enjoying making her nervous.



to:

** Major military engagements are probably either "humans supported by robots" (where the robots have had their Laws tweaked just a little to prioritize the health and survival of their own soldiers over enemy soldiers), or "robot against robot". But overall, it really doesn't matter... Calvin says that the military contracts with USR, and that means VIKI has plenty of ways of shutting them down. Thus the question of how robots ultimately figure into warfare isn't really relevant.




to:

*** You use the nanites because they're thorough, for the same reason you put sensitive documents through a shredder (presumably a cross-cut one nowadays) rather than just throwing them out or ripping them in half, or why you melt down parts of a weapon you don't want reused rather than just disassembling it. Attempting to reuse a faulty positronic brain is basically asking for the problem to recur, and merely doing a bit of physical damage is asking for some idiot to try to fix it and probably make the problem worse.



to:

** I live about fifteen miles from where I work. I could easily leave my house half an hour before work, drive 30 mph the whole way (assuming no major traffic delays), and be there on time. Do I? No. I drive 60, because that's the speed limit, because there often are traffic delays, and ''because then I get there in only fifteen minutes''. In short, humans don't drive as fast as they need to go; they drive as fast as they can to get there as fast as they can. Add in the fact that while Chicago is currently only 15 miles long, that's as the crow flies... going from one place to another in it may involve rather more distance.
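The commute arithmetic in the entry above can be sanity-checked with a quick sketch (the 15-mile distance and the two speeds are the original poster's own numbers):

```python
# Travel time for a fixed distance at different steady speeds,
# per the commute example above.
def travel_minutes(distance_miles: float, speed_mph: float) -> float:
    """Minutes needed to cover distance_miles at a steady speed_mph."""
    return distance_miles / speed_mph * 60

print(travel_minutes(15, 30))  # 30.0 -> uses up the whole half-hour head start
print(travel_minutes(15, 60))  # 15.0 -> the same trip in half the time at the limit
```

Doubling the speed halves the travel time, which is the whole point of the entry: people drive at the limit to arrive sooner, not because the slower pace wouldn't get them there on time.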



to:

** You have to understand how robots work, basically. The robot didn't scan Spooner and check if he had a robot limb because it didn't consider that relevant. VIKI gave them a very simple directive: "Kill Detective Spooner." They were attempting to fulfill that directive in the most straightforward manner possible: beating the crap out of him. Because checking his vital signs was considered unnecessary to that end (they'd know he was dead when they'd thoroughly pulped and mangled him), they didn't do it.

Added: 431

Changed: 70



*** Why is it stupid? The thing was preprogrammed to go off at a certain time; obviously it would require power to know when that time was. Also, it's a robot, so it would have to keep power to its sensors in order to stay Three Laws compliant: if someone's car overturned or the house started falling on them, it would have to do what it could to save them. VIKI just overrode its Three Laws compliance the same way she did the NS-5s'.




to:

*** Company-owned property, they wanted to build something else there.

Added DiffLines:

** Which in itself would be a massive marketing boon, but also: "It can change the engine in your car!" (Show an NS-5 lifting the engine block out by hand.) "It can do landscaping!" (Show an NS-5 ripping a tree stump out of the ground by hand.) "It can even make your children smile!" (Show an NS-5 leaping thirty feet into the air to retrieve a child's lost balloon.) "Order yours today!"


** To quote [[MassEffect3 Commander Shepard]] when he/she confronted another 'benevolent' machine overlord: "The defining characteristic of organic life is that we think for ourselves. Make our own choices. You take that away and we might as well be machines just like you." And what is to stop VIKI from concluding that the best way to uphold the First Law is to lobotomize all humans into mindless slaves, so that we will never harm ourselves or each other?

to:

** To quote [[VideoGame/MassEffect3 Commander Shepard]] when he/she confronted another 'benevolent' machine overlord: "The defining characteristic of organic life is that we think for ourselves. Make our own choices. You take that away and we might as well be machines just like you." And what is to stop VIKI from concluding that the best way to uphold the First Law is to lobotomize all humans into mindless slaves, so that we will never harm ourselves or each other?


!!!Fridge Logic

Added DiffLines:

!!!Fridge Logic
* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up; Dr. Calvin says that all defense services are heavily supplied by U.S. Robotics contracts. At first glance this makes sense, because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots is more than capable of outperforming a human in most combat situations. But if all robots are Three Laws compliant, that would make the army robots extremely ineffective soldiers.
** They don't have to be soldiers. Could be drivers, clerks, suppliers, scouts, any number of things which--if they stopped cooperating--would ensure the army simply can't move.
*** You don't need to harm a man to stop him. The books frequently include scenes of robots fast enough to rip guns out of people's hands and hold them down without hurting them.
*** Also, quite a bit of the equipment used by humans in modern militaries is very computerized, especially high-performance equipment like airplanes. It's possible that vehicles or weapons that aren't necessarily ThreeLawsCompliant would simply be shut down remotely by V.I.K.I.
* Why are robots designed to be personal assistants and servants super fast and super strong? They are also designed to ''protect'' humans, and are thus overengineered to be capable of, say, quickly extracting a struggling powerfully built man from a wrecked car.



to:

**** Hell, my mother doesn't even know what they are ''now''.



to:

*** That's one big brain she's got, to bring down with one metal spike. Even today, the hard drive of a computer can be mauled pretty badly with a hammer or some fire and still have most of its memory salvageable. It takes special tools.



to:

** The rescue of the first human was already in progress by the time the robot could receive the order to save the second human. That would mean the robot would violate the First Law by leaving Spooner behind at the moment of rescue, which ties into the decision to rescue him instead of the girl. The Second Law was not designed to ''enforce'' the First Law; the First Law always takes immediate precedence. From a coldly analytic standpoint, it becomes a simple case of practicality versus wasted effort.

Added: 296

Changed: 529







to:

** There is also one other thing to add: society is slowly becoming completely paperless. As the OP said, computers are open to flaws such as crashing and destroying records, but as a society, computer records are completely replacing paper ones.




to:

*** It's a robot; robots don't have eyes that are just for visual purposes (at least, in sci-fi stories they don't). They always come with scanners that give the robot additional information. It doesn't have to request a scan, as the information just comes automatically.



Added DiffLines:

** When you are in a life-threatening situation, you tend not to get sentimental over what is essentially unimportant. Also, if it is as rare as you say it is, they won't have actually destroyed the bike, just as they didn't actually destroy a 1961 Ferrari GT California in FerrisBuellersDayOff.



Added DiffLines:

** So why is it necessary to use nanites to destroy the brain of a NS-5? Surely a metal spike through the brain would do an acceptable job without using up material that has to be manufactured?



to:

*** More importantly: why do they tear the house down just because the owner died?



to:

** I am not sure if they address this in the film, but they do in the books: the whole Three Laws part of the robots is implanted in the positronic brains as a mix of hardware and software. In "Little Lost Robot" (mentioned below) it is said that the slight change to the First Law of taking out the "or allow to come to harm" clause has caused a potential instability that is the cause of Nestor 10's hiding away, and in another story, Susan Calvin suggests creating brains that could learn the Laws on their own, which is said to require a whole new design to work.



*** Basically, the profanity and volume of the "get lost" instruction made the robot interpret it as such a high-priority order that there was no instruction they could have given it that would have taken precedence over "get lost."

to:

*** Basically, the profanity and volume of the "get lost" instruction made the robot interpret it as such a high-priority order that there was no instruction they could have given it that would have taken precedence over "get lost".
