History Headscratchers / IRobot

23rd Nov '16 7:06:05 PM nombretomado


** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they were still fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic species, and I doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like ''Franchise/StarWars'' that constantly address the cyclic nature of human conflict. Unless the human race as a whole finally becomes truly self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb we always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear that too much freedom leads to chaos, and so on and so forth]]). And on VIKI's side, she was made by humans, so she is, inevitably, flawed herself. As in the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler:much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because each is a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would still run into plenty of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be nearly as personable; even if she presented logical, sensible arguments as to why humanity should obey her, along with solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).

to:

** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they were still fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic species, and I doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like ''Franchise/StarWars'' that constantly address the cyclic nature of human conflict. Unless the human race as a whole finally becomes truly self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb we always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear that too much freedom leads to chaos, and so on and so forth]]). And on VIKI's side, she was made by humans, so she is, inevitably, flawed herself. As in the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler:much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because each is a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would still run into plenty of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be nearly as personable; even if she presented logical, sensible arguments as to why humanity should obey her, along with solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[Franchise/AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).
12th Nov '16 6:05:14 PM nombretomado


** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they were still fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic species, and I doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like StarWars that constantly address the cyclic nature of human conflict. Unless the human race as a whole finally becomes truly self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb we always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear that too much freedom leads to chaos, and so on and so forth]]). And on VIKI's side, she was made by humans, so she is, inevitably, flawed herself. As in the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler:much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because each is a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would still run into plenty of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be nearly as personable; even if she presented logical, sensible arguments as to why humanity should obey her, along with solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).

to:

** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they were still fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic species, and I doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like ''Franchise/StarWars'' that constantly address the cyclic nature of human conflict. Unless the human race as a whole finally becomes truly self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb we always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear that too much freedom leads to chaos, and so on and so forth]]). And on VIKI's side, she was made by humans, so she is, inevitably, flawed herself. As in the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler:much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because each is a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would still run into plenty of issues.
Governments in general function by using cults of personality, political propaganda and general entertainment to appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be nearly as personable; even if she presented logical, sensible arguments as to why humanity should obey her, along with solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[AssassinsCreed We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).
24th Jul '16 2:28:12 AM Aubren

Added DiffLines:

*** Because the English language and typical Western social norms run on lying.
9th May '16 11:30:39 PM erforce


** [=VIKI=] isn't just in the brain, she's running in the whole building's computer network much like[[{{Terminator}} Skynet]]. If they just erased the brain's memory, even assuming they could, the other computers would restore [=VIKI=]. Nanites, on the other hand, spread throughout her entire system, according to Calvin anyway.

to:

** [=VIKI=] isn't just in the brain, she's running in the whole building's computer network, much like [[Franchise/{{Terminator}} Skynet]]. If they just erased the brain's memory, even assuming they could, the other computers would restore [=VIKI=]. Nanites, on the other hand, spread throughout her entire system, according to Calvin anyway.
2nd May '16 9:15:18 AM ErikModi

Added DiffLines:

*** No. The First Law explicitly takes precedence over the Second Law, so a human cannot order a robot to injure or kill another human; the robot would simply disregard the order. In Asimov's works, it's shown that these kinds of logic bombs really only come into play when the ''same'' law applies in two different directions at the same time: a hypothetical example being a robot encountering someone about to burn down a building with people inside, where the only way for the robot to stop the person is to injure or kill them. Being ordered to save someone who cannot (from the robot's perspective) be saved would cause no more conflict than failing to save that person without an order. And since a robot can't be ordered to act against the First Law, the robot in question was completely correct to disregard Spooner's order to save the girl: Spooner's life was also in danger, and a robot may not, through inaction, allow a human to come to harm. If everything had been reversed, and the girl had the higher survival chance and Spooner was just a selfish asshole ordering the robot to "Forget her, save me!", the result would have been the same: the robot would have chosen the person with the higher survival chance, ignoring orders that violated its First Law.
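The precedence argument above can be sketched as a tiny priority resolver. This is purely illustrative: the law constants, the `choose_action` helper and the survival-chance numbers are invented for the sketch and are not taken from the film or from Asimov's stories.

```python
# Hypothetical sketch of Three Laws precedence: lower number = more
# binding law, and an order (Second Law) is simply discarded when a
# First Law obligation is in play. All names/numbers are invented.

FIRST_LAW, SECOND_LAW, THIRD_LAW = 1, 2, 3

def choose_action(candidates):
    """Pick the action backed by the highest-priority law.

    candidates: list of (law, survival_chance, description) tuples.
    Among equally binding actions, prefer the higher survival chance.
    """
    # Keep only actions at the most binding (lowest-numbered) law.
    top = min(law for law, _, _ in candidates)
    best = [c for c in candidates if c[0] == top]
    # Within the same law, maximize the chance of preventing harm.
    return max(best, key=lambda c: c[1])

actions = [
    (FIRST_LAW, 0.45, "save Spooner"),
    (FIRST_LAW, 0.11, "save the girl"),
    (SECOND_LAW, 1.00, "obey Spooner's order: 'save the girl'"),
]
print(choose_action(actions)[2])  # -> save Spooner
```

The Second Law entry never even gets compared: it is filtered out the moment any First Law action exists, which mirrors the robot ignoring Spooner's order and picking the person with the better odds.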
2nd May '16 7:45:45 AM ErikModi



to:

** Or the robots were shut down, but the truck was able to turn them on, and VIKI simply did that, then took control of the robots.




to:

** Same way Sony and Microsoft get almost everyone to buy a new console when it comes out: lots of advertising, maybe some trade-in deals, and producing tons of them well in advance of the actual launch date. Now, a lot of people don't upgrade consoles for a variety of reasons, and a lot of people likely didn't upgrade robots for similar ones, but the people who don't upgrade are probably offset by the people who've never owned a robot and decide the NS-5 is the right time to get in on the craze. Then there's Gigi, who gets her NS-5 because "she won the lottery," indicating that USR is literally giving at least some NS-5s away (likely at VIKI's suggestion, using manipulated market-analysis data or some such).
16th Jan '16 8:27:36 PM RayAP9

Added DiffLines:

*** Why can't they write some sort of code that tells the artificial intelligence it's not allowed to say anything that it knows is non-factual?
15th Jan '16 10:53:37 AM Anddrix


** Perhaps it is the Skyway and the writers either didn't know better, or knew but thought that the audience [[ViewersAreMorons would be confused]] (or just [[RuleOfCool unawed]]) using the correct bridge-type. As for why humans still live in a Michigan-recessed Chicago, why are we still living in L.A. or Phoenix? Because we were there already.

to:

** Perhaps it is the Skyway, and the writers either didn't know better, or knew but thought that the audience would be confused (or just [[RuleOfCool unawed]]) by the correct bridge type. As for why humans still live in a Michigan-recessed Chicago: why are we still living in L.A. or Phoenix? Because we were there already.
5th Dec '15 1:12:17 AM angelothewizard

Added DiffLines:

** Because it runs on programs. Any programmer will tell you that a piece of code only does what you tell it to, meaning that a program cannot lie. Artificial Intelligence screws this up, but the Three Laws weren't counting on THAT.
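The contrast drawn above between a fixed program and a goal-weighing AI can be made concrete with a deliberately toy sketch. Everything here (the fact table, both functions, the "wrong number" reply) is invented for illustration and does not implement anything from the film.

```python
# Toy contrast: a plain lookup program has no notion of deception,
# while a goal-driven agent can rank "conceal the fact" above
# "report the fact". All names and rules are invented for the sketch.

FACTS = {"caller": "Spooner"}

def dumb_machine(question):
    # A fixed program just reports its stored data; if the data is
    # right, the output is right. It cannot "decide" to mislead.
    return FACTS.get(question, "unknown")

def goal_driven_ai(question, higher_goal_active):
    # An agent weighing goals may conclude that hiding a fact serves
    # a directive it ranks above truthful reporting.
    truth = FACTS.get(question, "unknown")
    if higher_goal_active and question == "caller":
        return "wrong number"  # deliberate misreport
    return truth

print(dumb_machine("caller"))           # -> Spooner
print(goal_driven_ai("caller", True))   # -> wrong number
```

The point of the sketch: the lie never lives in any single line of code; it emerges from the agent's ranking of goals, which is exactly what "the Three Laws weren't counting on."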
9th Nov '15 9:43:15 PM RayAP9

Added DiffLines:

* Towards the end of the film, Spooner leaves a frantic message about the ZerothLawRebellion on Susan Calvin's answering machine. She hears it, but pretends not to (as her robot is in the apartment with her), and her NS-5 [[BlatantLies tells her it was a wrong number]] when she asks who it was. Why isn't there something in the Three Laws (or perhaps a fourth law) that forbids a robot from knowingly lying to a human?
This list shows the last 10 events of 96.
http://tvtropes.org/pmwiki/article_history.php?article=Headscratchers.IRobot