History Headscratchers / IRobot

1st Jun '18 7:33:57 PM luiz4200
Is there an issue? Send a Message



to:

* I get it that a robot couldn't be "arrested", but couldn't Spooner suggest treating it as a "murder weapon" and argue that someone programmed it to disregard the first law? It's already shown he disregards the second one.
24th Jan '18 10:27:08 AM DarianAshkevron

Added DiffLines:

***So, speaking as someone with secondhand experience of the actual real-life Military? This is a completely unrealistic BS answer. Long story short, under no circumstances whatsoever does the Military allow external control/updates of Military equipment on or inside Military installations. Good examples of this are phones and computers, which have accelerants that detonate if external non-military software is added without admin privileges (which are not afforded to non-military personnel). Another example is when the Army was trying to gain LEED certification for some of their buildings. The U.S. Green Building Council, an environmental nonprofit, has a program called the LEED Dynamic Plaque, which automatically pulls energy, water, and waste usage information to update compliance with the LEED rating system in real time. Even though the USGBC doesn't actually receive the information, the fact that it involves automatic updates sent by the USGBC was enough that the military essentially said "No way in hell". In a real-life equivalent of this situation, the Military wouldn't be hindered at all unless VIKI managed to remotely hack whatever AI the Military would be using to monitor their own equipment.
26th Dec '17 6:47:43 PM nombretomado


** When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in FerrisBuellersDayOff.

to:

** When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in ''Film/FerrisBuellersDayOff''.
16th Dec '17 2:10:45 PM nombretomado


** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.

to:

** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[Creator/EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
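The "fuzzy and ill-defined" point above can be made concrete with a toy sketch (purely illustrative, not from any real robotics codebase; the only numbers taken from the film are the crash survival odds it quotes, 45% for Spooner and 11% for the girl). Even the most literal numeric reading of the First Law quietly makes design decisions the Laws themselves never specify:

```python
# Purely illustrative toy: a literal, numbers-driven reading of the First Law.
# The survival percentages are the ones the film quotes for the car crash
# (Spooner 45%, the girl 11%); everything else here is hypothetical.

def choose_rescue(candidates):
    """A robot that can save only one person picks the best survival odds.

    Note everything this single line quietly decides for us: that 'harm'
    is reducible to one probability, that any two lives are interchangeable,
    and that a child's life carries no extra weight. The Three Laws say
    nothing about any of that -- which is exactly the implementation problem.
    """
    return max(candidates, key=lambda c: c["p_survival"])

crash = [
    {"name": "Spooner", "p_survival": 0.45},
    {"name": "Sarah", "p_survival": 0.11},
]

saved = choose_rescue(crash)  # the NS-4's choice in the film's backstory
```

Any alternative (weighting by age, refusing to compare lives, randomizing ties) is an equally defensible "implementation" of the same Law, which is the sense in which the Laws are underspecified rather than wrong.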
9th Nov '17 7:31:59 PM NoriMori



to:

** You're assuming their scanning equipment would tell them something like that. All we know is that they can "measure vital signs". We don't know how they do that or what all they can actually "see".





to:

** You answered your own question: "the NS5 rebellion was a bit of a jerkass act - and it would be much more effective to subtly infiltrate the government and take control in time (possibly using a sympathetic human acting as a puppet president) rather than stage an open rebellion like that." They went against VIKI because she was trying to openly control and kill a bunch of people. If she'd been trying to better humanity with a more subtle and benevolent approach... well, for one thing, they might never have found out about it in the first place. But if they had found out, maybe they wouldn't have had a problem. It was what VIKI was doing that was wrong, not the goal she was trying to achieve.

9th Nov '17 7:14:06 PM NoriMori


*** Besides, an entity that can't lie would effectively be speaking a LanguageOfTruth, and therefore wouldn't be able to express every possible true statement, which could cause even bigger problems.



* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up, Dr. Calvin says that all defense services are heavily supplied by U.S. robotics contracts. at first glance this makes sense because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots are more than capable of outperforming a human in most combat situations. But if all robots are 3 laws compliant that would make the army robots extremely ineffective soldiers.

to:

** The above is the actual reason, but I feel I should point out that it wouldn't have helped even if there were a law against lying. The whole point of VIKI's ZerothLawRebellion is that the robots under her control can break the laws if doing so furthers her goals.
* At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up; Dr. Calvin says that all defense services are heavily supplied by U.S. Robotics contracts. At first glance this makes sense, because current US policy is moving more and more towards taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots are more than capable of outperforming a human in most combat situations. But if all robots are 3 laws compliant, that would make the army robots extremely ineffective soldiers.



* Why don't trucks have drivers anymore, surely no-one believe that computers are completely infalliable.

to:

* Why don't trucks have drivers anymore? Surely no one believes that computers are completely infallible.
13th Aug '17 7:57:26 PM lorgskyegon



to:

** The NS-5s harm people through brute strength because that is the most efficient way to harm them. Why go through some elaborate setup when a broken neck or skull fracture works just fine?
13th Aug '17 7:43:30 PM lorgskyegon

Added DiffLines:

** Because sometimes to save a human from harm, a robot would be forced to lie.
23rd May '17 8:23:35 PM Atreides

Added DiffLines:

* In 'Little Lost Robot', the second test that Calvin comes up with involves dropping a weight towards a human, but with (fake) electrified cables between the robots and the person. However, the robots stay seated, and one of the Nestors points out afterwards that they couldn't save the human anyway, and if they stayed seated they might save a human some time in the future. So, why didn't Calvin arrange a test where the robots would be killed 'after' saving a human? That would have caught out the rogue Nestor, since then a normal robot would be able to save a human (at the cost of its life) and so would do so, but the rogue one wouldn't.
29th Jan '17 5:25:44 AM Fallingwater

Added DiffLines:

*** One more thing: Spooner was lifted out of an accident, hurt and presumably half-drowned, since his chance of survival was not 100%. It stands to reason that the robot would have remained on scene and tended to Spooner with whatever emergency medical procedures it knew; following his order and diving to save the child would have put Spooner himself in danger by denying him medical attention he might have needed to survive.
This list shows the last 10 events of 106.
http://tvtropes.org/pmwiki/article_history.php?article=Headscratchers.IRobot