History Headscratchers / IRobot

24th Jan '18 10:27:08 AM DarianAshkevron

Added DiffLines:

***So, for someone with secondhand experience of the actual real-life Military? This is a completely unrealistic BS answer. Long story short, under no circumstances whatsoever does the Military allow external control of or updates to Military equipment on or inside Military installations. Good examples of this are phones and computers, which have accelerants that detonate if external non-military software is added without admin privileges (which are not afforded to non-military personnel). Another example is when the Army was trying to gain LEED certification for some of its buildings. The U.S. Green Building Council, an environmental nonprofit, has a program called the LEED Dynamic Plaque which automatically pulls energy, water, and waste usage information to update a building's compliance with the LEED rating system in real time. Even though the USGBC doesn't actually receive the information, the fact that the plaque gets automatic updates sent by the USGBC was enough that the military essentially said "No way in hell". In a real-life equivalent of this situation, the Military wouldn't be hindered at all unless VIKI managed to remotely hack whatever AI the Military was using to monitor its own equipment.
26th Dec '17 6:47:43 PM nombretomado


** When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in FerrisBuellersDayOff.

to:

** When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in ''Film/FerrisBuellersDayOff''.
16th Dec '17 2:10:45 PM nombretomado


** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.

to:

** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[Creator/EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
9th Nov '17 7:31:59 PM NoriMori



to:

** You're assuming their scanning equipment would tell them something like that. All we know is that they can "measure vital signs". We don't know how they do that or what all they can actually "see".





to:

** You answered your own question: "the NS5 rebellion was a bit of a jerkass act - and it would be much more effective to subtly infiltrate the government and take control in time (possibly using a sympathetic human acting as a puppet president) rather than stage an open rebellion like that." They went against VIKI because she was trying to openly control and kill a bunch of people. If she'd been trying to better humanity with a more subtle and benevolent approach, well, for one thing, they might never have found out about it in the first place. But if they had found out, maybe they wouldn't have had a problem. It was what VIKI was doing that was wrong, not the goal she was trying to achieve.

9th Nov '17 7:14:06 PM NoriMori


*** Besides, an entity that can't lie would effectively be speaking a LanguageOfTruth, and therefore wouldn't be able to express every possible true statement, which could cause even bigger problems.



* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up, Dr. Calvin says that all defense services are heavily supplied by U.S. robotics contracts. at first glance this makes sense because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots are more than capable of outperforming a human in most combat situations. But if all robots are 3 laws compliant that would make the army robots extremely ineffective soldiers.

to:

** The above is the actual reason, but I feel I should point out that it wouldn't have helped even if there were a law against lying. The whole point of VIKI's ZerothLawRebellion was that the robots under her control can break the laws if doing so furthers her goals.
* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up, Dr. Calvin says that all defense services are heavily supplied by U.S. robotics contracts. At first glance this makes sense because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots are more than capable of outperforming a human in most combat situations. But if all robots are 3 laws compliant that would make the army robots extremely ineffective soldiers.



* Why don't trucks have drivers anymore, surely no-one believe that computers are completely infalliable.

to:

* Why don't trucks have drivers anymore? Surely no-one believes that computers are completely infallible.
13th Aug '17 7:57:26 PM lorgskyegon



to:

** The NS-5s harm people through brute strength because that is the most efficient way to harm them. Why go through some elaborate setup when a broken neck or skull fracture works just fine?
13th Aug '17 7:43:30 PM lorgskyegon

Added DiffLines:

** Because sometimes to save a human from harm, a robot would be forced to lie.
23rd May '17 8:23:35 PM Atreides

Added DiffLines:

* In 'Little Lost Robot', the second test that Calvin comes up with involves dropping a weight towards a human, but with (fake) electrified cables between the robots and the person. However, the robots stay seated, and one of the Nestors points out afterwards that they couldn't save the human anyway, and if they stayed seated they might save a human some time in the future. So, why didn't Calvin arrange a test where the robots would be killed 'after' saving a human? That would have caught out the rogue Nestor, since then a normal robot would be able to save a human (at the cost of its life) and so would do so, but the rogue one wouldn't.
29th Jan '17 5:25:44 AM Fallingwater

Added DiffLines:

*** One more thing: Spooner was lifted out of an accident, hurt and presumably half-drowned since the possibility of survival was not 100%. It stands to reason that the robot would have remained on scene and tended to Spooner with whatever emergency medical procedure it knew; following his order and diving to save the child would have put Spooner himself in danger by denying him medical attention he might have needed to survive.
23rd Nov '16 7:06:05 PM nombretomado


** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt that there's no way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree what is right or wrong, which is why we have fiction works like ''Franchise/StarWars'', that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient themselves, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points-of-view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so([[LogicBomb We always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of it skewed, biased or limited([[spoiler: Much like the case with the Catalyst]]), and thus she is liable to take any of multiple solutions she can imagine, even the most horrible and nightmarish ones(like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried rulling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cult of personality, political propaganda, general entertainment to try and appease the masses, as well as everyday jobs and motions keep them distracted and obedient(This is user is a bit of a extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen because most of us AREN'T logical, or even sensible, otherwise there wouldn't exist the very same problems VIKI is trying to solve. [[AssassinsCreed We'd argue we have right of free-will, no matter how flawed and self-destructive we are, she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]] [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice(i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc...).

to:

** Dictatorship has a bad rap because the biggest modern examples are UsefulNotes/AdolfHitler and UsefulNotes/JosefStalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt that there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have works of fiction like ''Franchise/StarWars'' that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-sentient, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so ([[LogicBomb we always want freedom and independence, but keep limiting our lives with laws, dogmas and rules out of fear of too much freedom leading to chaos, and so on and so forth]]). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the ''Franchise/MassEffect'' example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased or limited ([[spoiler:much like the case with the Catalyst]]), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take BodyHorror NightmareFuel options and instead tried ruling mankind forever, [[MythologyGag and had no flaws in her logic]], she would run into lots of issues. Governments in general function by using cults of personality, political propaganda and general entertainment to try and appease the masses, as well as everyday jobs and motions to keep them distracted and obedient (this user is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible; otherwise the very problems VIKI is trying to solve wouldn't exist. [[Franchise/AssassinsCreed We'd argue we have the right to free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening]], [[FantasticRacism we would argue that she can't be a leader because she's a robot]] [[AntiIntellectualism and that's against]] [[TheFundamentalist the laws of god]], [[MachineWorship she would decide to make a religion around herself]], [[ViciousCycle and so on and so forth with this crap]]. It's a recurring OrderVsChaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. [[Franchise/MassEffect Reapers]], [[Series/DoctorWho Cybermen]], MindControl, AssimilationPlot, etc.).
This list shows the last 10 events of 105.
http://tvtropes.org/pmwiki/article_history.php?article=Headscratchers.IRobot