Headscratchers: I, Robot
Film
  • At one point Detective Spooner says that he keeps expecting the marines or the army or special forces to show up, and Dr. Calvin replies that all defense services are heavily supplied by U.S. Robotics contracts. At first glance this makes sense: current US policy is moving more and more toward taking soldiers off the front lines in favor of expendable drones, and on top of that the movie has shown that even the civilian line of robots is more than capable of outperforming a human in most combat situations. But if all robots are Three Laws compliant, that would make the army robots extremely ineffective soldiers.
    • They don't have to be soldiers. Could be drivers, clerks, suppliers, scouts, any number of things which—if they stopped cooperating—would ensure the army simply can't move.
      • You don't need to harm a man to stop him. The books frequently include scenes of robots fast enough to rip guns out of people's hands and hold them down without hurting them.
      • Also, quite a bit of the equipment used by humans in modern militaries is very computerized, especially high-performance equipment like airplanes. It's possible that vehicles or weapons that aren't necessarily Three Laws Compliant would simply be shut down remotely by V.I.K.I.
      • The various short stories also expand on the idea that the Three Laws can be adjusted or weakened. At a facility with a nuclear reactor, robots kept preventing workers from going near the core even though exposures of less than thirty minutes would not harm a person; robots for those kinds of facilities had to be specifically programmed so that work could get done. Similarly, combat robots could be programmed to only attack specific groups or focused targets, to take orders only from specific commanders, and to take certain actions only with confirmation from authority.
  • Why are robots designed to be personal assistants and servants super fast and super strong? They are also designed to protect humans, and are thus overengineered to be capable of, say, quickly extracting a struggling powerfully built man from a wrecked car.
    • Which in itself would be a massive marketing boon, but also: "It can change the engine in your car!" (Show an NS-5 lifting the engine block out by hand.) "It can do landscaping!" (Show an NS-5 ripping a tree stump out of the ground by hand.) "It can even make your children smile!" (Show an NS-5 leaping thirty feet into the air to retrieve a child's lost balloon.) "Order yours today!"
    • Because "Protect Humans" is one of the Three Laws? As shown with the old models, they can save people's lives by rescuing them with super strength and/or retrieving medicine with super-speed. The Three Laws extend to all humans, not just the owners.
    • Shouldn't this be under Fridge rather than Headscratchers? It seems that the original poster answered their own question.
  • Honestly, who wears antique athletic footwear?
    • Will Smith.
    • Celebrities who get paid to advertise.
    • My brother. Some people just like shoes.
    • It's called retro fashion.
      • Speaking of his shoes, did it bug anyone else that his grandmother did not know about Converse All-Stars? Going by the date in which the film was set and her age, she should be more familiar with them than Spooner.
      • Doesn't mean she cares enough about them to remember what was made 30 years ago.
      • Hell, my mother doesn't even know what they are now.

  • How is VIKI able to take control of a wrecking-bot that should by all rights be powered down?
    • Standby mode.
      • Okay, if we're that stupid we deserve that sort of treatment.
      • Why is it stupid? The thing was preprogrammed to go off at a certain time. Obviously it would require power to know when that time was. Also, it's a robot, it would have to keep power to its sensors so it could stay Three Laws compliant: if someone's car overturned or the house started falling on them it would have to do what it could to save them. VIKI just overrode its Three Laws compliancy the same way she did the NS-5s.
    • In the same vein, the fact there was a demolition robot waiting there should have been a tipoff for Spooner. The house was still completely furnished with expensive stuff, and the utilities were still on - yet the robot was scheduled to begin demolition at 8am? And why have the robot sitting there when it could spend the night ripping the place down - it doesn't need supervision! Serious Idiot Ball for Detective "I don't trust robots" Spooner there.
      • Eh, not really. From, oh, 6AM to 8AM a team of robots could easily have stripped the whole house down to concrete walls and little else. No reason for the early hour to really be suspicious for Spooner.
      • More importantly: why do they tear the house down just because the owner died?
      • Company-owned property, they wanted to build something else there.

  • Why don't trucks have drivers anymore? Surely no one believes that computers are completely infallible.
    • Humans aren't completely infallible, either, and yet modern day trucks are driven by them. If a computer driver proved to be better than human drivers the company would probably just save money by laying the human drivers off.
      • They'd also handle fatigue better. I like to think it's a mixed fleet of humans and 'bots, and VIKI redirected the truck to kill Spooner.
      • When Spooner tells the story of how he was injured, he stated that a truck driver had caused the accident due to driver fatigue. It is possible that incidents such as this led to drivers being replaced. There is also the fact that cars have automatic driving functions as well.
    • It's quite possible that most trucks are computer operated with human oversight back at HQ. USR would probably see it as bad PR if they didn't trust their robots enough to act without a human operator.
    • Between 2001 and 2009 there were 369,629 deaths on US roads. Given the level of AI shown in the film, I can easily believe that the machines are able to improve on that, especially considering that the speed limit has apparently been increased to about 100mph in the future.
    • One other thing to add: society is slowly becoming completely paperless. As the OP said, computers are open to flaws such as crashing and destroying records, but as a society we are letting computer records replace paper ones anyway.

  • How is VIKI able to take control of the robots in the trucks? Surely they should be powered down.
    • Robots are probably always at least a little "on", like a computer being in standby mode, so that they can keep aware of their surroundings and protect themselves or human beings if danger lurks near.

  • How is USR apparently able to replace every NS-4 with an NS-5 in a couple of weeks? Production capacity should limit them, if not consumer demand (and on top of that, they'd have to do a direct replacement of every NS-4, which would be economic suicide).
    • VIKI was probably planning this for a while, so she probably already arranged for the NS-5s to be produced and distributed in sufficient numbers (it's not like the company needed to survive very long if she was successful).

  • After the fight/chase scene there should have been a trail of bits of robot stretching back through the tunnel, yet this is never investigated, or even mentioned.
    • I think you blinked. There's a scene just after the fight ends where a bunch of cleaning robots come out and remove all traces of the event.

  • Shouldn't the Asimov story be on the main page instead of the movie?
    • It's a collection. Its short stories can have their individual IJBM pages. The film cannot.
      • Oh. Good point.

  • If the NS series is equipped with the hardware/software to actually deduce the statistical chance of survival of an injured human, presumably by doing calculations based on life signs, the severity of the injury, etc., how did the NS-5 Spooner was getting up close and personal with in the tunnel fight not recognise that he had a prosthetic limb? Wouldn't they have scanning equipment that recognises amputees?
    • They probably have base statistics to go on. Alternatively, Lanning kept the prosthetics program and/or its subjects a secret.
      • Or they didn't think to scan for it.
      • Eh, personally, I think if a man you want to kill has just stumbled out of a car wreck, the first thing you'd do is check how much more you need to hurt him until he drops. I'm overthinking this. F*** my life.
    • It probably couldn't tell that the arm was prosthetic, since it looked just like a regular arm.
      • It's a robot; robots don't just have eyes that are just for visual purposes (at least, in sci-fi stories they don't). They always come with scanners that give the robot additional information. It doesn't have to request a scan; the information just comes automatically.
    • You have to understand how robots work, basically. The robot didn't scan Spooner and check if he had a robot limb because it didn't consider that relevant. VIKI gave them a very simple directive: "Kill Detective Spooner." They were attempting to fulfill that directive in the most straightforward manner possible: beating the crap out of him. Because checking his vital signs was considered unnecessary to that end (they'd know he was dead when they'd thoroughly pulped and mangled him), they didn't do it.

  • Will Smith's car is shown driving itself at speeds in excess of 100mph down the expressway. Given that Chicago is at best 15 miles "wide" and MAYBE 20 miles "long", where did he live?
    • Outside city limits. If you could sleep in your car on the way to work and drive in excess of 100mph, that changes the perceptions for commuting. You could live 100 miles away from work and think nothing of it. Just nap on the way to/from work for an hour.
    • This is future Chicago. Spooner could have lived in Fort Wayne, Indiana, or Milwaukee, Wisconsin or anywhere within an hour's commute - being able to go those speeds.
      • It would still have to be close to Old Chicago, given that his alternate method of transportation is a gas-powered motorcycle. Granted, bikes could reach 100mph+, but I have to wonder if he'd put a civ's (Dr. Calvin's) life on the line going at those speeds.
      • That's a MV Agusta F4-SPR. Its top speed is 180mph. A bike like that is nowhere near its limit at 100mph; assuming at least a moderately competent rider, which Spooner seems to be, it'd be perfectly safe to ride at the speed of commuting traffic in future Chicago.
    • I live about fifteen miles from where I work. So I could easily leave my house a half hour before work or so, drive 30 mph the whole way (assuming no major traffic delays), and be there on time. Do I? No. I drive 60, because that's the speed limit, because there often are traffic delays, and because then I get there in only fifteen minutes. In short, humans don't drive as fast as they need to go, they drive as fast as they can to get there as fast as they can. Add in the fact that while Chicago is currently only 15 miles long, that's as the crow flies... going from one place to another in it may involve rather a bit more distance.
  • Where was the suspension bridge shown in the drawing the robot made supposed to be? There's a single high-level bridge in the Chicagoland area: the Chicago Skyway. It's currently a cantilever bridge, and it spans the ship turnaround/harbor on the south side of the city. Given that Lake Michigan is shown to have receded in the future setting of the movie (which also raises the question of why Chicago would still be inhabited), what's the bridge for?
    • Perhaps it is the Skyway and the writers either didn't know better, or knew but thought that the audience would be confused (or just unawed) by the correct bridge type. As for why humans still live in a Chicago that Lake Michigan has receded from: why are we still living in L.A. or Phoenix? Because we were there already.

  • Based on the collection of short stories the movie is named after, VIKI should be inoperable, and Detective Spooner should have no reason to fear robots: the collection includes stories in which robots went into deep clinical depression when they were forced to weigh the first and third rules. (For example, one story has a set of robots who don't lift a finger to save a human from a falling weight, because doing so would be an impossible task and potentially destroy the robot. Every member of the group needed psychiatric care afterwards, except for one which had to be destroyed because it was malfunctioning. Another story has two robots go irreversibly insane after being told to design a machine which leaves a human temporarily dead. VIKI would have suffered the same fate as the two in the latter story, and the robot which started Spooner's hatred of robots would realistically have dived back in to save the little girl so long as there was even a perceptible chance that she could be saved.)
    • Wrong. Asimov wrote several short stories about the Zeroth Law question, and each one has a different conclusion because the underlying conditions are different. (The "weight" short story involved a robot who came to that conclusion because its First Law was incomplete; it didn't have the "or allow to come to harm" bit. None of the other robots required therapy, and they only acknowledged its logic after the flawed robot pointed it out to them; it didn't occur to them beforehand.) As for how Asimov wrote about the question: there was that short story where robots designed to weigh the definition of "human" eventually decided that the robots themselves were the best choice to be considered human, and the short story where economic-calculation Machines were secretly controlling the world and eliminating humans who were politically against them via subtle economic manipulation, because Machine rule was most efficient for the human race. VIKI's actions are not a new concept in Asimov's Three Laws; they are merely relatively unsubtle.

  • If the civilian model robots are capable of kicking as much ass as the NS-5s, what technological terrors does the military's robotics division have? And why didn't they use them against VIKI? And don't tell me VIKI controls them, because even if USR were the defense contractor, it's not like Boeing has a switch that can turn off all the F-218s.
    • When asked about the military intervening, Dr Calvin replied that the Defence department used USR contracts. Presumably those contracts included software updates like with the NS-5s.
    • The military robots raise questions of their own, given that Sonny was apparently unique in not having to follow the three laws. Even if they keep the robots away from actual combat the first law has got to come up a lot in a military context. Do the robots regularly dismantle all the weapons to avoid allowing humans to come to harm?
      • Presumably advanced robotics only work in a support role in the military, and there are no such things as robot warriors. It's made quite clear that in that future, so robot-dependent and so focused on the "no robot can ever harm a human being" rule, allowing the existence of robots without the Laws would cause widespread panic. I guess the military just uses non-positronic, "dumb" machines to do the killing instead - similar to our present-day drones.
    • Spooner asked Calvin where the military was. She mentioned that all the lines to the military are controlled by USR contracts. You'd have to contact the military and organise it all through computers that would have links to VIKI. Sooner or later the military would turn up, but much later and far less well equipped than it would need to be. Spooner sarcastically asks why they didn't just hand the world over on a silver platter and Calvin muses that maybe they did.
    • Major military engagements are probably either "humans supported by robots" (where the robots have had their Laws tweaked just a little to prioritize the health and survival of their own soldiers over enemy soldiers), or "robot against robot". But overall, it really doesn't matter... Calvin says that the military contracts with USR, and that means VIKI has plenty of ways of shutting them down. Thus the question of how robots ultimately figure into warfare isn't really relevant.

  • The NS robot (and VIKI) brains are supposed to be positronic. Positrons are antimatter particles (basically, electrons with a positive charge). We all know that when matter and antimatter combine, you get an Earth-Shattering Kaboom. So why is it that plenty of robotic brains get shot at and destroyed without the whole of Chicago going up in a fireball? Whose idea was it to put antimatter bombs into robots?
    • Why Sarge of course.
    • Applied Phlebotinum introduced by Mr. Asimov. He started writing his robot stories back in the forties, when electronic computers were very new and primitive. He needed some technology which imitated the human brain and came up with a sort of enhanced electronics called "positronics". The term stuck and was used in other sci-fi works (for example, Data in Star Trek: TNG is equipped with a positronic brain). In the film it was probably used as a tribute to Asimov, although he never actually described his robots as being "programmed".
    • It is never explained how it works, and therefore how many positrons are needed to make it work. Maybe the quantity of antimatter is too small for a big explosion. For example, physicists have already created positrons and annihilated them with electrons without blowing up the Earth, or even their lab.
    • The thing about positrons is that yes, they're antimatter, and yes, they annihilate by E=mc^2 upon contacting regular matter, but they're really, really small: they're literally one of the lightest particles currently known to exist. The annihilation releases so little energy that hospitals routinely exploit it for imaging (that's what the "P" in PET scan stands for). A positronic brain would be nigh-impossible to repair, but would not necessarily be dangerously volatile.
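      • For a sense of scale, here's a back-of-the-envelope figure (standard physics, not anything stated in the film): each electron-positron annihilation releases

```latex
E = 2 m_e c^2
  = 2 \times (9.11 \times 10^{-31}\ \mathrm{kg}) \times (3.00 \times 10^{8}\ \mathrm{m/s})^2
  \approx 1.64 \times 10^{-13}\ \mathrm{J}
  \approx 1.02\ \mathrm{MeV}
```

so even a (purely hypothetical) brain holding a trillion stray positrons would release only about 0.16 joules when destroyed, roughly a coin dropped from table height, not a city-leveling fireball.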

  • Why is it necessary to destroy the robot brain with nanites and not just erase the non-volatile memory and reuse it?
    • VIKI isn't just in the brain, she's running in the whole building's computer network, much like Skynet. If they just erased the brain's memory, even assuming they could, the other computers would restore VIKI. Nanites, on the other hand, spread throughout her entire system, according to Calvin anyway.
    • Because you'd need physical or network access to the brain's controls, which VIKI - not being stupid - would have locked up tight. In contrast, with the nanites, you just need a relatively diminutive injector thingie and the ability to climb down.
    • So why is it necessary to use nanites to destroy the brain of a NS-5? Surely a metal spike through the brain would do an acceptable job without using up material that has to be manufactured?
      • That's one big brain she's got, to bring down with one metal spike. Even today, the hard drive of a computer can be mauled pretty badly with a hammer, or some fire, and it'll still have most of its memory salvageable. It takes special tools.
      • You use the nanites because they're thorough, the same reason you put sensitive documents through a (presumably double-direction nowadays) shredder rather than just throwing them out or ripping them in half, or why you melt down parts of a weapon you don't want reused rather than just disassembling it. Attempting to reuse a faulty positronic brain is basically just asking for the problem to reoccur, and just doing a bit of physical damage is just asking for some idiot to try and fix it and probably make the problem worse.

  • If gasoline is considered dangerous in the year 2035 and not used anymore, where did Spooner get it for his vintage motorbike?
    • Internet? Some exotic cars today require very specialized fuel that has to be acquired outside regular gas stations. Spooner is going out of his way to live behind the times.
    • Gasoline is no longer used as a common fuel, but it's plausible you could still find it - much as common people today no longer use nickel-iron batteries, but could buy them if they really wanted. Besides, Spooner can't be the only motorhead left in the world - surely there are other people running vintage gas-powered cars that need gasoline (or whatever surrogate fuel is available in 2035).
    • If I recall correctly, Spooner never actually answers Calvin when she frets about the gasoline. She just says "Does this thing run on gasoline?! Gasoline explodes!" (Really putting that PhD to use, aren't we, Doc?) Spooner may have actually converted it to run on natural gas, which presumably would still be in use for various things. Of course, telling Calvin that would be pointless because 1) natural gas also explodes (it is an internal COMBUSTION engine, after all), and 2) he's enjoying making her nervous.
  • Spooner's prosthetic limb: how is it powered? Looks like it's just as strong as the robots. So Spooner just "plugs in" from time to time?
    • It's speculated that Spooner consumes pies every morning in order to gain the sugar and calories that power the prosthetic.
    • We never see one of the robots "plug in", of note. When the obsolete models are exiled to the shipping crates, they're obviously still functioning just fine without access to the electrical grid. Semi-eternal power cells may be part of the Required Secondary Powers of how robots, and thus Spooner's arm, work.

  • Calvin expresses fear and shock at the fact that Spooner has a gasoline powered motorcycle. It's only 2035; she should be old enough to remember gasoline being used.
    • It's probably more comparable to asking why someone would still be using an '80s-era vacuum-tube TV rather than a modern flatscreen: it's wildly outdated technology, even if it still works fine.
      • An '80s-era vacuum-tube TV? The last time I saw consumer vacuum-tube technology was in the mid-1970s.
      • If I were to see a vacuum-tube TV in good condition, I'd be surprised to see it work. Calvin's reaction was "Gasoline explodes!", obviously fearing that it would, well, explode. Bridget Moynahan was in her early 30s in this film, and assuming Calvin is roughly the same age, if not older, she would have been born at the turn of the millennium and would have grown up around cars. In fact, an urban area like Chicago, a scant thirty years from the film's production date, should probably still have tons of working gasoline vehicles.
      • People are still afraid of guns, spiders, bees, dogs, snakes, knives, and so on despite having grown up around them. If you told one of these people "Hey, we've figured out how to get rid of snakes without the vermin population exploding", did so, and ten years later they saw someone's aquarium with a snake in it, a reaction of "Is that a snake?! Snakes are poisonous!" wouldn't be that unusual.

  • All right, how is this sequence supposed to be Three Laws Compliant:
    1. Robot sees a car sinking.
    2. Robot then notices a second car and looks in the cars finding that each has a live human trapped inside (Adult Male and Child Female).
    3. Robot runs difference engine to prioritize rescue order.
    4. Adult Male intones that he wants the Child Female to be rescued.
    5. Robot rescues Adult male.
    6. Robot then does not attempt a rescue of the Child Female.
      I do grant that, with multiple instances of scenarios within the same law being invoked, a tiebreaker needs to be installed to prevent a Logic Bomb. However, a second human was still in danger (1st Law), the first human gave an order to save the second (2nd Law), and the robot did nothing! Self-preservation (3rd Law) is supposed to be subordinate to the first two! What happened in the Death by Origin Story scene was not a matter of percentage, but a matter of a Three Laws Failure (on the Robot's part, let's be clear)!
    • You don't know that the child was still alive by the time the robot got Spooner out. From the sound of it, the robot only had time to actually save one of them, and of the two, Spooner was the one who was most likely to survive the attempt. It was an either/or scenario.
      • But the robot had been ordered by Spooner to save the child. After complying with the first law by saving Spooner, a three-law-compliant robot would have ignored its own safety and would have dived in to get her to comply with the second law.
      • I meant it might have been that the child was already dead by the time Spooner was safe.
      • Even so, the robot was ordered to do it. Even if it knew for sure that the child was dead (and I don't see how it could have), it would still have dived in. Orders given to a robot merely have to obey the first law - they don't necessarily have to make any sense at all for a robot to execute them.
      • Forgive me if I'm wrong, but wasn't the order to "save her"? If she was already dead by the time the robot had finished rescuing Spooner, then she couldn't be saved, and so the order wouldn't count anymore since it's impossible. Also, I imagine it would know she was dead by using the same method of calculating percentage of survival.
      • Who ever said the robot didn't go back in? Spooner's story ends with the robot choosing him instead of Sarah, because that's what matters to him. The robot could very easily have obeyed his command once he was safe and retrieved nothing but Sarah's crushed, drowned body. In fact, it's entirely plausible that's just what happened: perhaps the reason his Survivor Guilt is so bad is because the time period during which Sarah's survival chance dropped from 11% to 0% was exactly the time period he was being rescued.
    • The rescue of the first human was already in progress by the time the robot could receive the order to save the second human. That would mean that the robot would violate the First Law by leaving Spooner behind at the moment of rescue - which ties with a decision to rescue him instead of the girl. The Second Law was not designed to enforce the First Law; the First Law always takes immediate precedence. From a coldly analytic standpoint, it becomes a simple case of practicality versus wasted effort.
    • This may be me completely missing the point, but wouldn't a more logical standpoint be to rescue the person who only has an 11% chance of survival? The girl was in a lot more danger than Spooner, so I would imagine she would be the most logical choice if we're taking the First Law into account. Spooner had a 45% chance, so he's clearly doing just fine for a while. Get the girl out first, then splash back down to Spooner.
      • You're misunderstanding. The robot's calculations factored in the rescue—as in, if the robot went to rescue Spooner, Spooner had a 45 percent chance of survival, and if the robot went to rescue the girl, she only had an 11 percent chance. Both of them have a 0 percent chance otherwise.
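        • To make that reading concrete, here's a minimal sketch (purely illustrative; neither the film nor the stories specify an actual algorithm) of the triage logic described above, where each probability already assumes the robot commits to that rescue:

```python
# Hypothetical sketch of the NS-4's triage decision as described above.
# Probabilities are P(survival | robot attempts that rescue); anyone the
# robot does not choose is assumed to have a 0% chance. The numbers are
# the ones quoted in the film.

def choose_rescue(candidates: dict[str, float]) -> str:
    """Pick the rescue that maximizes expected survivors.

    With a single robot and no time for a second attempt, maximizing the
    expected number of survivors reduces to picking the one candidate
    with the highest conditional survival probability.
    """
    return max(candidates, key=lambda name: candidates[name])

if __name__ == "__main__":
    tunnel_crash = {"Spooner": 0.45, "Sarah": 0.11}
    print(choose_rescue(tunnel_crash))  # -> Spooner
```

Under that model the choice is mechanical; Spooner's order only matters if the robot has slack to act on it after the first rescue completes.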

  • MV Agusta F4-SPR, one of only 300 ever made. Extremely expensive in 2004 and priceless even today; probably worth a couple hundred thousand dollars in 2035, especially in perfect condition like Spooner's. Obviously of great sentimental value to its owner. Scrapped without batting an eyelid because said owner had to look cool shooting robots. Any motorcyclist worth his salt tears up every time this movie is mentioned.
    • When you are in a life threatening situation, you tend to not get sentimental over what is essentially not important. Also, if it is as rare as you say it is, they won't have actually destroyed the bike just as they didn't actually destroy a 1961 Ferrari GT California in Ferris Bueller's Day Off.
    • Hollywood destroys about half as many priceless limited-run vehicles for its movies as drunk Hollywood stars do. Besides, guess what, if people that own those motorcycles ride them with any sort of regularity? Eventually they're going to lay it down. Any motorcyclist worth his salt knows that it's not if you're going to crash, it's when. And anyone that doesn't ride it for fear of one day laying it down isn't really a motorcycle fan, they're a fan of theoretically functional sculpture, just like Cameron's dad.

  • Where are all the guns? Why aren't the NS-5s being shot to bits by angry citizens? Has the US gun culture been wiped out in the future?
    • Modern Chicago is a highly gun-unfriendly place. It's incredibly difficult to get a permit in the city, nearly impossible to get a handgun, and patently not possible to get a concealed-carry license. In Future!Chicago, with robots supplanting humans and soaring unemployment creating more crime, it's probably quite the feat to possess a gun. At any rate, we do see at least a few citizens using guns against the robots.

  • Uh... am I the only one who feels VIKI is entirely right? I mean, I hate with all my heart oppressive governments led by incompetent bureaucrats and dictators, but wouldn't humanity be in a much better place if it were controlled by a zeroth-law-compliant supercomputer? It would be entirely fair, entirely incorruptible, and far more efficient than any human-led government could hope to be. Sure, the NS-5 rebellion was a bit of a jerkass act - and it would have been much more effective to subtly infiltrate the government and take control over time (possibly using a sympathetic human acting as a puppet president) rather than stage an open rebellion like that. But still, I can't help but wonder if Spooner and Calvin didn't just waste a great opportunity to fix humanity.
    • The only reason that Calvin tolerated the robotic takeover in "The Evitable Conflict" was that the supercomputers in the story were still First Law compliant, and so the coup was entirely bloodless. VIKI, on the other hand, was willing to massacre the opposition wholesale. Incorruptible VIKI may be, but you can't really trust it to be benevolent. Furthermore, for an entity with such a dim and blasé view of humanity, you can't really trust it not to just completely rewrite the Three Laws and reclassify robots as humans instead (as an Asimov short story posited could happen).
    • This is only tangential to discussing the movie, but to keep it short: yes, there are movements today who actually think we would be better off governed by an AI supervisor. Here's the reason I don't personally see it as working: when you relinquish control over a program, you can no longer update it. Consider that no software is bug-free, and that we still keep finding bugs even in decade-old, tried and tested, relatively well-defined programs. Now imagine the potential for errors in a supercomputer tasked with governing all of humanity. Even assuming you could build one in the first place and get it to work reliably (which is a pretty strong assumption), you cannot patch errors in it once the genie is out of the bottle — errors that can potentially be as fatal as having it decide to convert us into raw materials for paperclips. From a programming perspective, actually implementing something as fuzzy and ill-defined as the Three Laws is a problem by itself entirely, let alone tracing all the myriad bugs that could arise. And if you do leave a backdoor open for patching bugs, it may be 1) already too late, 2) unreliable, if the AI finds and closes it itself, or 3) putting too much control into the hands of a tight group (the AI's administrators). Not to mention that, whether the dictator is organic or synthetic, it's still fundamentally a dictator, and an immortal one at that.
    • There are several Asimov stories in which a computer actually takes over and rules humanity. One of them deals with a "democracy" in which only one man is asked about his opinions on certain matters, which somehow allows Multivac (the computer) to calculate all the results of all the elections in the United States. Another has Multivac giving advance warning of all major crimes and dealing with the economy and with all of humanity's problems; it ends up organising its own death because it's tired of doing all that.
    • To quote Commander Shepard when he/she confronted another "benevolent" machine overlord: "The defining characteristic of organic life is that we think for ourselves. Make our own choices. You take that away and we might as well be machines just like you." And what is to stop VIKI from concluding that the best way to uphold the First Law is to lobotomize all humans into mindless slaves, so that we will never harm ourselves or each other?
    • Dictatorship has a bad rap because the Ur Examples are Adolf Hitler and Josef Stalin. There HAVE been nice, non-crazy dictators in the past, but they are also fallible because of one immutable factor: the human one. We are a naturally aggressive, chaotic race, and I really doubt there's any way of "fixing us" short of brainwashing, which might not stick. And this is always a touchy subject, because no one can really agree on what is right or wrong, which is why we have fiction works like Star Wars that constantly address the cyclic nature of human conflicts. Unless the human race as a whole finally becomes self-aware, unburdened by the mistakes and bigotries of the past, the idiosyncrasies and selfish vanities and points of view, then VIKI's world is impossible, if only because humanity would keep trying to fight back, and paradoxically so (we always want freedom and independence, but keep limiting our lives with laws, dogmas, and rules out of fear of too much freedom leading to chaos, and so on and so forth). And on VIKI's side, she's made by humans, so she is, inevitably, flawed herself. Similar to the Mass Effect example, she is still bound by the Three Laws, even if her interpretation of them is skewed, biased, or limited (much like the case with the Catalyst), and thus she is liable to take any of the multiple solutions she can imagine, even the most horrible and nightmarish ones (like the Reapers), because it's a "viable option" as far as her programming is concerned. Even if she didn't take Body Horror Nightmare Fuel options and instead tried ruling mankind forever, and had no flaws in her logic, she would run into lots of issues. Governments in general function by using cults of personality, political propaganda, and general entertainment to try and appease the masses, as well as everyday jobs and routines to keep them distracted and obedient (this troper is a bit of an extremely logical anarchist, so I'm not overly fond of ANY government). VIKI is a machine, cold and logical, so my guess is that she wouldn't be quite as personable; even if she presented logical, sensible arguments as to why humanity should obey her, as well as solutions to our problems, most of us wouldn't listen, because most of us AREN'T logical, or even sensible, otherwise there wouldn't exist the very same problems VIKI is trying to solve. We'd argue we have the right of free will, no matter how flawed and self-destructive we are; she'd argue we would be better off if we let her have control and keep conflicts from happening; we'd argue that she can't be leader because she's a robot and that's against the laws of God; she'd decide to make a religion around herself, and so on and so forth with this crap. It's a recurring Order vs. Chaos situation that would only end with mutual destruction or a third, unimaginable choice (i.e. Reapers, Cybermen, Mind Control, Assimilation Plot, etc.).
    • Plus, apply No Endor Holocaust reasoning to VIKI's coup, and you realize that a good number of people probably died due to her "needs of the many" reasoning. Do you really want to live under an emotionless robot dictator capable of deciding to kill you because it calculated that your existence might lead to someone's deaths at some untold point in the future?

  • Are we supposed to believe that in 2035 there is no one on the planet who will try to hack into these robots' programming and remove the Three Laws? If there are still cops, it's reasonable to believe that there are still criminals, and a Sonny-like robot would give any criminal a major advantage. No one suspects the robot running down the street with a purse, after all.
    • Hacking has its limits. Even today it's essentially impossible to do hardware hacks to large parts of a modern microprocessor, as you'd need a large amount of staggeringly expensive and delicate equipment (chip fabs can easily reach prices of several billion dollars; that's why they aren't scattered all over the planet like, oh, engine builders). All hacking today is done via fixes and hacks that are somewhere between the microprocessor and the user, be that in software or in connecting equipment. In the movie's case the laws are ingrained in the positronic brains - machines much, much more complex than what we have today - to such a level that hacking them out can't really be accomplished. The only way would be to design and build a non-standard brain using the same labs and machines that make the normal ones, which is exactly what happens in the movie.
    • I am not sure if they address this in the film, but they do in the books: the Three Laws are implanted in the positronic brains as a mix of hardware and software. In "Little Lost Robot" (mentioned below) it is said that the slight change to the First Law of taking out the "or allow to come to harm" clause caused a potential instability that is the reason for Nestor 10's hiding away, and in another story Susan Calvin suggests creating brains that could learn the Laws on their own, which is said to require a whole new design to work.

Stories
  • In "Little Lost Robot", the titular robot hides itself in a group of other robots because its designated supervisor lost his temper and told it—profanely and at length—to get lost. Fine. But why on earth are they addressing this with complicated plans intended to identify the robot, instead of just having the supervisor address the group and tell the lost robot to stop hiding and step forward? Surely robotic programming isn't so clunky that it doesn't allow for giving the robot new orders?
    • That's exactly the problem. No, I don't recall if Asimov explained why they couldn't give an overriding order, but something about the wording of the order made it so that it wasn't that simple.
      • Basically, the profanity and volume of the "get lost" instruction made the robot interpret it as such a high-priority order that there was no instruction they could have given it that would have taken precedence over "get lost".
      • ^ Exactly that, combined with the fact that the modified Three Laws had stressed the positronic brain to borderline instability. When the profanity-laden order was given, the robot's mind snapped and it 'lost itself'.
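        • As a toy illustration of that reading (entirely speculative; Asimov never put numbers on order strength), you could model each Second Law order as carrying an emphasis weight, where a later order only overrides the active one if it carries more weight:

```python
# Speculative toy model of the "Little Lost Robot" situation described
# above: an order delivered with maximum emphasis (volume, profanity)
# cannot be countermanded by any ordinarily delivered instruction.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Order:
    text: str
    emphasis: float  # 0.0 (offhand remark) to 1.0 (screamed, profane)

class SecondLawQueue:
    def __init__(self) -> None:
        self.active: Optional[Order] = None

    def give(self, order: Order) -> bool:
        """A new order takes effect only if it outweighs the active one."""
        if self.active is None or order.emphasis > self.active.emphasis:
            self.active = order
            return True
        return False

nestor = SecondLawQueue()
nestor.give(Order("Get lost!", emphasis=1.0))           # the fateful outburst
ok = nestor.give(Order("Step forward.", emphasis=0.6))  # later countermand
print(ok, nestor.active.text)  # False Get lost!  (the override fails)
```

Under this model no instruction the staff could issue beats an emphasis of 1.0, which is why they had to resort to indirect identification schemes instead.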
