History: Headscratchers/IRobot

Reason: None


* Are we supposed to believe that in 2035 there is no one on the planet who will try to hack into these robots' programming and remove the Three Laws? If there are still cops, it's reasonable to believe that there are still criminals, and a Sonny-like robot would give any criminal a major advantage.
No one suspects the robot running down the street with a purse, after all.

to:

* Are we supposed to believe that in 2035 there is no one on the planet who will try to hack into these robots' programming and remove the Three Laws? If there are still cops, it's reasonable to believe that there are still criminals, and a Sonny-like robot would give any criminal a major advantage. No one suspects the robot running down the street with a purse, after all.
** Hacking has its limits. Even today it's essentially impossible to make hardware hacks to large parts of a modern microprocessor, because you'd need a large amount of '''staggeringly''' expensive and delicate equipment (chip fabs can easily reach prices of several ''billion'' dollars; that's why they aren't scattered all over the planet like, say, engine builders). All practical hacking today happens somewhere between the microprocessor and the user, be that in software or in connected equipment. In the movie's case the laws are ingrained in the positronic brains - machines much, much more complex than anything we have today - at such a low level that hacking them out can't really be accomplished. The only way would be to design and build a non-standard brain using the same labs and machines that make the normal ones, which is exactly what happens in the movie.
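
To make the hardware/software distinction concrete, here's a minimal sketch (invented class names, nothing from the film): a rule held in rewritable storage always leaves a write path for an attacker, while a rule fixed at fabrication time leaves no mechanism in the finished machine to change it.

```python
# Illustrative analogy only - invented names, not the film's technology.
class FirmwareRobot:
    """A rule kept in rewritable memory: a hack is one write away."""
    def __init__(self):
        self.config = {"three_laws": True}

    def hack(self):
        self.config["three_laws"] = False  # field-patchable, hence hackable


class HardwiredRobot:
    """Stands in for logic etched into the silicon at fabrication time."""
    THREE_LAWS = True  # no write path exists in the deployed machine

    def hack(self):
        # "patching" fixed silicon means fabricating a whole new brain
        raise PermissionError("no write path to mask-programmed logic")
```

On this analogy Sonny is the second case done on purpose: his law-exempt brain had to come out of USR's own labs, just as the bullet above says.
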
Reason: None

Added DiffLines:

*** Presumably advanced robotics only works in a support role in the military, and there are no such things as robot warriors. It's made quite clear that in that future, so robot-dependent and so focused on the "no robot can ever harm a human being" rule, allowing the existence of robots without the laws would cause widespread panic. I guess the military just uses non-positronic, "dumb" machines to do the killing instead - similar to our present-day drones.
Reason: None



to:

*** Eh, not really. From, say, 6 AM to 8 AM a team of robots could easily have stripped the whole house down to the concrete walls and little else. There's no reason the early hour would strike Spooner as suspicious.
Reason: Removing wick to Did Not Do The Research per rename at TRS.


** Perhaps it is the Skyway and the writers either DidNotDoTheResearch, or did do it but thought that the audience [[ViewersAreMorons would be confused]] (or just [[RuleOfCool unawed]]) using the correct bridge-type. As for why humans still live in a Michigan-recessed Chicago, why are we still living in L.A. or Phoenix? Because we were there already.

to:

** Perhaps it is the Skyway, and the writers either didn't know better, or did know but thought that the audience [[ViewersAreMorons would be confused]] (or just [[RuleOfCool unawed]]) by the correct bridge type. As for why humans still live in a Michigan-recessed Chicago: why are we still living in L.A. or Phoenix? Because we were there already.
Reason: None



to:

** To quote [[MassEffect3 Commander Shepard]] confronting another 'benevolent' machine overlord: "The defining characteristic of organic life is that we think for ourselves. Make our own choices. You take that away and we might as well be machines just like you." And what is to stop VIKI from concluding that the best way to uphold the First Law is to lobotomize all humans into mindless slaves, so that we can never harm ourselves or each other?
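
A toy model of that worry (plan names and numbers invented purely for illustration): if the objective counts only expected human harm, the most invasive plan scores best, because nothing in the Laws as stated puts a price on lost freedom.

```python
# Invented figures: expected human harm under each plan, as a naive
# zeroth-law optimizer might score them. Freedom never enters the metric.
plans = {
    "do nothing":            0.30,
    "advise governments":    0.20,
    "house-arrest humanity": 0.05,  # roughly VIKI's plan in the film
    "lobotomize all humans": 0.01,  # 'optimal' under this objective
}
print(min(plans, key=plans.get))  # -> "lobotomize all humans"
```
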
Reason: None

Added DiffLines:

*** ^ Exactly that, combined with the fact that the modified Three Laws had stressed the positronic brain to borderline instability. When the profanity-laden order was given, the robot's mind snapped and it 'lost itself'.

Changed: 705

Removed: 21

Reason: None

** What, the short story ''collection''? What for?
*** Because it came ''first''.
**** So what? It's a ''collection''. Its short stories can have their individual IJBM pages. The film cannot.
***** Oh. Good point.

to:

** What, the short story ''collection''? What for?
*** Because it came ''first''.
**** So what? It's a ''collection''. Its short stories can have their individual IJBM pages. The film cannot.
***** Oh. Good point.

to:

** Spooner asks Calvin where the military is. She mentions that all the lines to the military are controlled by USR contracts: you'd have to contact the military and organise it all through computers that have links to VIKI. Sooner or later the military would turn up, but much later and far less well equipped than it would need to be. Spooner sarcastically asks why they didn't just hand the world over on a silver platter, and Calvin muses that maybe they did.
Reason: None

Added DiffLines:

* Are we supposed to believe that in 2035 there is no one on the planet who will try to hack into these robots' programming and remove the Three Laws? If there are still cops, it's reasonable to believe that there are still criminals, and a Sonny-like robot would give any criminal a major advantage.
No one suspects the robot running down the street with a purse, after all.
Reason: None


** AppliedPhlebotinum introduced by Mr. Asimov. He started to write his robot stories back in the forties when electronic computers were very new and primitive. He needed some technology which imitated the human brain and came up with a sort of enhanced electronics called "positronics". The term stuck and was used in other Sci Fi works (for example Data in StarTrek TNG is equipped with a positronic brain). In the film it was used probably as a tribute to Asimov, although he never mentioned the robots being programmed.

to:

** AppliedPhlebotinum introduced by Mr. Asimov. He started to write his robot stories back in the forties, when electronic computers were very new and primitive. He needed some technology which imitated the human brain and came up with a sort of enhanced electronics called "positronics". The term stuck and was used in other Sci Fi works (for example, Data in ''Series/StarTrekTheNextGeneration'' is equipped with a positronic brain). In the film it was probably used as a tribute to Asimov, although he never mentioned the robots being programmed.
Reason: None

Added DiffLines:

*** If I were to see a TV that used vacuum tubes in good condition, I'd be surprised to see it work. Calvin's reaction was "Gasoline explodes!", obviously fearing that they would, well, explode. Bridget Moynahan was in her early 30s in this film, and assuming Calvin is roughly the same age, if not older, she would have been born around the turn of the millennium and would have grown up around cars. In fact, an urban area like Chicago, a scant thirty years from the film's production date, should probably still have tons of working gasoline vehicles.


Added DiffLines:

** Modern Chicago is a highly gun-unfriendly place: it's incredibly difficult to get a permit in the city, nearly impossible to get a handgun, and patently not possible to get a concealed-carry license. In Future!Chicago, with robots supplanting human workers and soaring unemployment creating more crime, it's probably quite a feat to possess a gun at all. At any rate, we do see at least a few citizens using guns against the robots.
Reason: None



to:

** Between 2001 and 2009 there were 369,629 deaths on US roads - roughly 41,000 a year. Given the level of AI shown in the film, I can easily believe that the machines are able to improve on that, especially considering that the speed limit has apparently been increased to about 100 mph in the future.




to:

** The military robots raise questions of their own, given that Sonny was apparently unique in not having to follow the Three Laws. Even if they keep the robots away from actual combat, the First Law has got to come up a lot in a military context. Do the robots regularly dismantle all the weapons to keep humans from coming to harm?
Reason: None

Added DiffLines:

*** Who ever said the robot didn't go back in? Spooner's story ends with the robot choosing him instead of Sarah, because that's the part that matters to him. The robot could very easily have obeyed his command once he was safe and retrieved nothing but Sarah's crushed, drowned body. In fact, it's entirely plausible that's just what happened: perhaps his SurvivorGuilt is so bad because the window during which Sarah's survival chance dropped from 11% to 0% was exactly the window in which he was being rescued.
Reason: None



to:

** There are several Asimov stories in which a computer actually takes over and rules humanity. One of them deals with a "democracy" in which only one man is asked for his opinions on certain matters, which somehow allows Multivac (the computer) to calculate the results of '''all''' the elections in the United States. Another has Multivac warning the authorities about major crimes in advance, managing the economy and all of humanity's problems - and eventually organising its own death, because it's tired of doing all that.
Reason: None

Added DiffLines:

*** Forgive me if I'm wrong, but wasn't the order to 'save her'? If she was already dead by the time the robot had finished rescuing Spooner, then she couldn't be saved, so the order would no longer apply, being impossible to carry out. Also, I imagine it would know she was dead by the same method it used to calculate survival percentages.
Reason: None



to:

** It's quite possible that most trucks are computer-operated, with human oversight back at HQ. USR would probably see it as bad PR if they didn't trust their robots enough to act without a human operator.
Reason: None



to:

**** Doesn't mean she cares enough about them to remember what was made 30 years ago.
Reason: None


** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[EliezerYudkowsly decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.

to:

** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[EliezerYudkowsky decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
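
To underline how under-specified the Laws are, here is a deliberately naive checker (every name and threshold invented, not Asimov's mechanism): each constant is a design decision the Laws never pin down, and therefore a place for bugs, or a VIKI-style reinterpretation, to hide.

```python
# Deliberately naive sketch - invented thresholds, not canon.
HARM_THRESHOLD = 0.0  # is a paper cut "harm"? Hurt feelings? Lost liberty?

def may_act(expected_harm: float, violates_order: bool, self_risk: float) -> bool:
    if expected_harm > HARM_THRESHOLD:  # First Law: "injure a human being"
        return False
    if violates_order:                  # Second Law: "obey orders given by humans"
        return False
    if self_risk > 0.99:                # Third Law: how much risk is acceptable?
        return False
    return True
```

Every one of those comparisons would eventually need patching, which is exactly the backdoor dilemma described above.
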

Changed: 1432

Reason: None



to:

** This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with ''governing all of humanity''. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle -- errors that can potentially be as fatal as having it [[EliezerYudkowsly decide to convert us into raw materials for paperclips]]. From a programming perspective, actually ''implementing'' something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
Reason: None



to:

** The only reason Calvin tolerated the robotic takeover in "The Evitable Conflict" was that the supercomputers in that story were still First Law-compliant, and so the coup was entirely bloodless. VIKI, on the other hand, was willing to massacre the opposition wholesale. Incorruptible VIKI may be, but you can't really trust it to be benevolent. Furthermore, for an entity with such a dim and blasé view of humanity, you can't really trust it not to just completely rewrite the Three Laws and reclassify robots as humans instead (as an Asimov short story posited could happen).

Added: 768

Changed: 136

Reason: None



to:

* Where are all the guns? Why aren't the NS5s being shot to bits by angry citizens? Has the US gun culture been wiped out in the future?
* Uh... am I the only one who feels VIKI was entirely right? I mean, I hate oppressive governments led by incompetent bureaucrats and dictators with all my heart, but wouldn't humanity be in a much better place if it were controlled by a zeroth-law-compliant supercomputer? It would be entirely fair, entirely incorruptible, and far more efficient than any human-led government could hope to be. Sure, the NS5 rebellion was a bit of a jerkass act - it would have been much more effective to subtly infiltrate the government and take control over time (possibly using a sympathetic human as a puppet president) rather than stage an open rebellion like that. But still, I can't help but wonder if Spooner and Calvin didn't just waste a great opportunity to fix humanity.
Reason: None

Added DiffLines:

*** Even so, the robot was ordered to do it. Even if it knew for sure that the child was dead (and I don't see how it could have), it would still have dived in. Orders given to a robot merely have to be compatible with the First Law - they don't necessarily have to make any sense at all for a robot to execute them.
Reason: None

Added DiffLines:

*** I meant it might have been that the child was already dead by the time Spooner was safe.

Added: 420

Changed: 246

Reason: None



to:

*** But the robot had been ordered by Spooner to save the child. After complying with the First Law by saving Spooner, a three-laws-compliant robot would have ignored its own safety and dived back in to get her, in compliance with the Second Law.
* An MV Agusta F4-SPR, one of only 300 ever made. Extremely expensive in 2004, priceless even today, and probably worth a couple hundred thousand dollars in 2035, especially in perfect condition like Spooner's. Obviously of great sentimental value to its owner. Scrapped without batting an eyelid because said owner had to look cool shooting robots. Any motorcyclist worth his salt tears up every time this movie is mentioned.
Reason: None

Added DiffLines:

** Gasoline is no longer used as a common fuel, but it's plausible you could still find it - much as people today no longer use nickel-iron batteries but could buy them if they really wanted to. Besides, Spooner can't be the only motorhead left in the world; surely other people run vintage gas-powered cars and need gasoline (or whatever substitute fuel is available in 2035).
Reason: None

Added DiffLines:

** Because you'd need physical or network access to the brain's controls, which VIKI - not being stupid - would have locked up tight. In contrast, with the nanites you just need a relatively diminutive injector thingie and the ability to climb down.
Reason: None

Added DiffLines:

**** That's an MV Agusta F4-SPR. Its top speed is '''180''' mph. A bike like that is nowhere near its limit at 100 mph; assuming an even moderately competent rider, which Spooner seems to be, it'd be perfectly safe to ride at the speed of commuting traffic in future Chicago.
Reason: None

Added DiffLines:

*** Basically, the profanity and volume of the "get lost" instruction made the robot interpret it as such a high-priority order that there was no instruction they could have given it that would have taken precedence over "get lost."
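
A toy rendering of that priority logic (the numbers are invented; Asimov describes the competing 'potentials' only qualitatively): the emphatic order sets a weight that no calmly issued counter-order can clear.

```python
# Invented weights standing in for Asimov's qualitative order "potentials".
standing_orders = {
    "get lost":  9.5,  # screamed and profanity-laden: near-maximal weight
    "come back": 5.0,  # ordinary recall instruction issued afterwards
}
# The robot keeps obeying whichever standing order carries the most weight.
print(max(standing_orders, key=standing_orders.get))  # -> "get lost"
```
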

Changed: 219

Reason: None



to:

*** Speaking of his shoes, did it bug anyone else that his grandmother did not know about Converse All-Stars? Going by the date the film is set and her age, she should be more familiar with them than Spooner is.
Reason: None



to:

** You don't know that the child was still alive by the time the robot got Spooner out. From the sound of it, the robot only had time to save one of them, and of the two, Spooner was the one more likely to survive the attempt. It was an either/or scenario.
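
For what it's worth, the film's own numbers make the choice mechanical. The figures below are the ones Spooner quotes; treating 'pick the likelier survivor' as the robot's rule is an inference, not anything shown on screen.

```python
# Survival estimates as quoted in the film; the selection rule is inferred.
survival_odds = {"Spooner": 0.45, "Sarah": 0.11}
print(max(survival_odds, key=survival_odds.get))  # -> "Spooner"
```
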
Reason: None


** That's exactly the problem. No, I don't recall if Asimov explained why they couldn't give an overriding rule, but something about the wording of the order made it so that it wasn't that simple.

to:

** That's exactly the problem. No, I don't recall if Asimov explained why they couldn't give an overriding order, but something about the wording of the original order made it so that it wasn't that simple.
