Is there an issue? Send a MessageReason:
None
Changed line(s) 169 from:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), even making more humans (DeathByChildirth risk). The latter in particular would be an issue, as nothing in the Laws obligates VIKI to ensure humans are ''perpetuated''; she could prohibit reproduction as "too dangerous" and allow all extant humans to die of old age in captivity, then pat herself on the back for having fulfilled her duty to ensure no human can ''ever again'' be harmed by a robot's action or inaction.
to:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), even making more humans (DeathByChildbirth risk). The latter in particular would be an issue, as nothing in the Laws obligates VIKI to ensure humans are ''perpetuated''; she could prohibit reproduction as "too dangerous" and allow all extant humans to die of old age in captivity, then pat herself on the back for having successfully ensured no human can ''ever again'' be harmed by a robot's action or inaction.
Changed line(s) 169 from:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), even making more humans (DeathByChildirth risk). The latter in particular would be an issue, as nothing in the Laws obligates VIKI to ensure humans are ''perpetuated''; she could prohibit reproduction as "too dangerous" and allow all extant humans to die of old age in captivity, then pat herself on the back for having fulfilled her duty to the letter.
to:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), even making more humans (DeathByChildirth risk). The latter in particular would be an issue, as nothing in the Laws obligates VIKI to ensure humans are ''perpetuated''; she could prohibit reproduction as "too dangerous" and allow all extant humans to die of old age in captivity, then pat herself on the back for having fulfilled her duty to ensure no human can ''ever again'' be harmed by a robot's action or inaction.
Changed line(s) 169 from:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), reproduction (DeathByChildirth risk)....
to:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), even making more humans (DeathByChildirth risk). The latter in particular would be an issue, as nothing in the Laws obligates VIKI to ensure humans are ''perpetuated''; she could prohibit reproduction as "too dangerous" and allow all extant humans to die of old age in captivity, then pat herself on the back for having fulfilled her duty to the letter.
Added DiffLines:
** Not to mention there are a ''lot'' of things humans have to do merely to remain human beings, that would rate as potentially "harmful" under an unrestrained and universal application of the "permit no harm" stricture: swallow solid food (choking hazard), undergoing surgery (incisions), childhood play (bullying), reproduction (DeathByChildirth risk)....
Changed line(s) 15 from:
to:
** Military robots would presumably be used against the ''other side's'' military robots. Human combat soldiers may not even exist on the battlefield anymore.
Changed line(s) 68 from:
to:
** Worth noting: That robot was ''damaged,'' down an arm and, though single-minded in purpose, walking funny. Maybe it was that much closer to "human" senses and reasoning, and nothing about Detective Spooner said "artificial arm" until it made a "bong" sound twice and started glowing from the inside.
Changed line(s) 147 from:
to:
** What we would have here is likely a First Law vs. First Law conflict. Since the robot cannot save both humans, one would have to die, and it would be the one with a lower survival chance. A scenario like this happened in "Liar!", a short story by Isaac Asimov. Through a fault in manufacturing, a robot, RB-34 (also known as Herbie), is created that possesses telepathic abilities. While the roboticists at U.S. Robots and Mechanical Men investigate how this occurred, the robot tells them what other people are thinking. But the First Law still applies to this robot, and so it deliberately lies when necessary to avoid hurting their feelings and to make people happy, especially in terms of romance. However, by lying, it is hurting them anyway. When it is confronted with this fact by Susan Calvin (to whom it falsely claimed her coworker was infatuated with her – a particularly painful lie), the robot experiences an insoluble logical conflict and becomes catatonic.
Changed line(s) 184 from:
** Also, keep in mind that in Literature/BicentennialMan, set in the same universe, USR starts making their robots deliberately less human so as to not accidentally make another Andrew. That Daneel is of the same general 'level' as Byerly or Andrew isn't surprising in that case.
to:
** Also, keep in mind that in "Literature/TheBicentennialMan", a later story in TheVerse, USR starts making their robots deliberately less human so as to not accidentally make another Andrew. That Daneel is of the same general 'level' as Byerly or Andrew isn't surprising in that case.
Changed line(s) 165 from:
to:
** In short, no, you are not the only one, because there are a number of people who would prefer to not have the burden of actually being people and would prefer to be livestock, and would do so by trading freedom for safety. Like all who would do that, they deserve neither.
Added DiffLines:
*** It's not even really breaking the laws, because the whole point of the laws is that they're arranged in descending order of importance. Robots can already break the third law if it's to enforce the second law, and the second law if it's to enforce the first law, the zeroth law just adds the implied structure to the first as well.
Changed line(s) 163,164 from:
to:
Changed line(s) 144,145 from:
** Regardless of what happened, the NS-4 was likely decommissioned for its failure to save Sarah.\\
to:
** Regardless of what happened, the NS-4 was likely decommissioned for its failure to save Sarah.
** There's a bit from a Franchise/{{Alien}} novel that also applies, where the synthetics use a "modified" version of Asimov's Laws. A synthetic Marine asks a human Marine not to kill any of the guards for a place they're about to break into, the human replies "I'll try," then headshots all four of them out of the synthetic's sight. When the synthetic sees the dead bodies, the human shrugs and says "[[BlatantLies I tried,]]" and the synthetic shrugs in response and they go about their business. The Laws guide behavior, but not morality... the synthetic in this case feels no guilt or remorse for failing to prevent these deaths, he just fulfilled the letter of his programming. Now, Asimov got a lot of mileage out of "robot psychology" and the Three Laws and how they interact in the real world, but still, a robot is a robot. If it fails to uphold the First Law, through no fault of its own, it's not necessarily going to be wracked with guilt or try to make amends. It performed to the best of its ability, and obeyed the letter of its programming. Assuming the film treats the Laws similarly to Asimov's novels (and the film is actually quite good on that point), the robot almost certainly did ''try'' to save Sarah as well, until it became obvious there was no point. And may have suffered the robot equivalent of a psychotic break afterwards. But none of that is important to ''Spooner's'' story.
** He could. But Spooner wouldn't, because he's stuck on, "Robots are bad." It's sort of his central character trait. He doesn't ''want'' there to be a human responsible for it, he wants to prove that a robot went and killed someone.
Changed line(s) 169 from:
to:
** Maybe the nanite injector is made of something that the forcefield isn't designed to disintegrate.
Changed line(s) 168 from:
to:
* Sonny can stick his arm through the forcefield because his creator gave him a denser alloy than the "stock" NS-5s. But how can the nanite injector pass through said forcefield without getting disintegrated? The forcefield takes out half of an NS-5's head onscreen, so it is designed to disintegrate plastic and/or metal that passes through it, as even Sonny's arm is visibly damaged.
Changed line(s) 167 from:
to:
* I get that a robot couldn't be "arrested", but couldn't Spooner suggest treating it as a "murder weapon" and suggest someone programmed it to disregard the first law? It's already shown he disregards the second one.
Added DiffLines:
*** Speaking as someone with secondhand experience of the actual real-life Military: this is a completely unrealistic BS answer. Long story short, under no circumstances whatsoever does the Military allow external control/updates of Military equipment on or inside Military installations. A good example of this is phones and computers, which have accelerants that detonate if external non-military software is added without admin privileges (which are not afforded to non-military personnel). Another example is when the Army was trying to gain LEED certification for some of their buildings. The U.S. Green Building Council, an environmental nonprofit, has a program called the LEED Dynamic Plaque which automatically pulls energy, water, and waste usage information to update the compliance of the LEED rating system in real time. Even though the USGBC doesn't actually receive the information, the fact that it has automatic updates sent by the USGBC was enough that the military essentially said "No way in hell". In a real-life equivalent of this situation, the Military wouldn't be hindered at all unless VIKI managed to remotely hack whatever AI the Military would be using to monitor their own equipment.
Changed line(s) 146 from:
** When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in FerrisBuellersDayOff.
to:
** When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in ''Film/FerrisBuellersDayOff''.
Changed line(s) 154 from:
** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
to:
** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[Creator/EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
Changed line(s) 66 from:
to:
** You're assuming their scanning equipment would tell them something like that. All we know is that they can "measure vital signs". We don't know how they do that or what all they can actually "see".
Changed line(s) 159,160 from:
to:
Added line(s) 7:
*** Besides, an entity that can't lie would effectively be speaking a LanguageOfTruth, and therefore wouldn't be able to express every possible true statement, which could cause even bigger problems.
Changed line(s) 7 from:
* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up, Dr. Calvin says that all defense services are heavily supplied by U.S. robotics contracts. at first glance this makes sense because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots are more than capable of outperforming a human in most combat situations. But if all robots are 3 laws compliant that would make the army robots extremely ineffective soldiers.
to:
** The above is the actual reason, but I feel I should point out that it wouldn't have helped even if there was a law against lying. The whole point of VIKI's ZerothLawRebellion was that the robots in her control can break the laws if doing so furthers her goals.
* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up, Dr. Calvin says that all defense services are heavily supplied by U.S. robotics contracts. At first glance this makes sense because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots are more than capable of outperforming a human in most combat situations. But if all robots are 3 laws compliant that would make the army robots extremely ineffective soldiers.
Changed line(s) 31 from:
* Why don't trucks have drivers anymore, surely no-one believe that computers are completely infalliable.
to:
* Why don't trucks have drivers anymore, surely no-one believes that computers are completely infallible.
Changed line(s) 86 from:
to:
** The NS-5s harm people through brute strength because that is the most efficient way to harm them. Why go through some elaborate setup when a broken neck or skull fracture works just fine?
Added DiffLines:
** Because sometimes to save a human from harm, a robot would be forced to lie.
Added DiffLines:
* In 'Little Lost Robot', the second test that Calvin comes up with involves dropping a weight towards a human, but with (fake) electrified cables between the robots and the person. However, the robots stay seated, and one of the Nestors points out afterwards that they couldn't save the human anyway, and if they stayed seated they might save a human some time in the future. So, why didn't Calvin arrange a test where the robots would be killed 'after' saving a human? That would have caught out the rogue Nestor, since then a normal robot would be able to save a human (at the cost of its life) and so would do so, but the rogue one wouldn't.
Added DiffLines:
*** One more thing: Spooner was lifted out of an accident, hurt and presumably half-drowned since the possibility of survival was not 100%. It stands to reason that the robot would have remained on scene and tended to Spooner with whatever emergency medical procedure it knew; following his order and diving to save the child would have put Spooner himself in danger by denying him medical attention he might have needed to survive.
Changed line(s) 151 from:
** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt that there's no way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree what is right or wrong, which is why we have fiction works like ''Franchise/StarWars'', that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient themselves, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points-of-view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of it skewed, biased or limited([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones(like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried rulling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. 
Governments in general function by using cult of personality, political propaganda, general entertainment to try and appease the masses, as well as everyday jobs and motions keep them distracted and obedient(This is user is a bit of a extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen because most of us AREN'T logical, or even sensible, otherwise there wouldn't exist the very same problems VIKI is trying to solve. [[AssassinsCreed We'd argue we have right of free-will, no matter how flawed and self-destructive we are, she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice(i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).
to:
** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt that there's no way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree what is right or wrong, which is why we have fiction works like ''Franchise/StarWars'', that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient themselves, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points-of-view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of it skewed, biased or limited([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones(like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried rulling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. 
Governments in general function by using cult of personality, political propaganda, general entertainment to try and appease the masses, as well as everyday jobs and motions keep them distracted and obedient(This is user is a bit of a extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen because most of us AREN'T logical, or even sensible, otherwise there wouldn't exist the very same problems VIKI is trying to solve. [[Franchise/AssassinsCreed We'd argue we have right of free-will, no matter how flawed and self-destructive we are, she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice(i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).
Changed line(s) 151 from:
** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt that there's no way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree what is right or wrong, which is why we have fiction works like StarWars, that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient themselves, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points-of-view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of it skewed, biased or limited([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones(like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried rulling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. 
Governments in general function by using cult of personality, political propaganda, general entertainment to try and appease the masses, as well as everyday jobs and motions keep them distracted and obedient(This is user is a bit of a extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen because most of us AREN'T logical, or even sensible, otherwise there wouldn't exist the very same problems VIKI is trying to solve. [[AssassinsCreed We'd argue we have right of free-will, no matter how flawed and self-destructive we are, she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice(i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).
to:
** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt that there's no way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree what is right or wrong, which is why we have fiction works like ''Franchise/StarWars'', that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient themselves, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points-of-view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of it skewed, biased or limited([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones(like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried rulling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues.
Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right to free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be our leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).
Added DiffLines:
*** Because the English language and typical Western social norms run on lying.
Changed line(s) 93 from:
** [=VIKI=] isn't just in the brain, she's running in the whole building's computer network much like[[{{Terminator}} Skynet]]. If they just erased the brain's memory, even assuming they could, the other computers would restore [=VIKI=]. Nanites, on the other hand, spread throughout her entire system, according to Calvin anyway.
to:
** [=VIKI=] isn't just in the brain, she's running in the whole building's computer network much like [[Franchise/{{Terminator}} Skynet]]. If they just erased the brain's memory, even assuming they could, the other computers would restore [=VIKI=]. Nanites, on the other hand, spread throughout her entire system, according to Calvin anyway.
Added DiffLines:
*** No. The First Law explicitly takes precedence over the Second Law, so a human cannot order a robot to injure or kill another human; the robot would simply disregard the order. In Asimov's works, it's shown that these kinds of logic bombs really only come into play when the ''same'' law applies in two different directions at the same time: a hypothetical example is a robot encountering someone about to burn down a building with people inside, where the only way for the robot to stop the arsonist is to injure or kill them. Being ordered to save someone who cannot (from the robot's perspective) be saved would cause no more conflict than failing to save that person without an order. And since a robot can't be ordered to act against the First Law, the robot in question was completely correct to disregard Spooner's order to save the girl: Spooner's life was also in danger, and a robot may not, through inaction, allow a human to come to harm. If everything had been reversed, and the girl had the higher survival chance while Spooner was just a selfish asshole ordering the robot to "Forget her, save me!", the result would have been the same: the robot would have chosen the person with the higher survival chance, ignoring orders that violated the First Law.
Changed line(s) 41 from:
to:
** Or the robots were shut down, but the truck was able to turn them back on; VIKI simply did that, then took control of the robots.
Changed line(s) 44 from:
to:
** Same way Sony and Microsoft get almost everyone to buy a new console when it comes out: lots of advertising, maybe some trade-in deals, and producing tons of them well in advance of the actual launch date. Now, a lot of people don't upgrade consoles for a variety of reasons, and a lot of people likely didn't upgrade robots for similar ones, but the people who don't upgrade are probably offset by the people who've never owned a robot and decide the NS-5 is the right time to get in on the craze. Then there's Gigi, who gets her NS-5 because "she won the lottery," indicating that USR is literally giving at least some NS-5s away (likely at VIKI's suggestion, using manipulated market analysis data or some such).