The Genie in the Machine

As it turns out, not all genies have to be quasi-mystical creatures who arise by having their lamp stroked. In Sci-Fi, it is quite common for a genie to take another form entirely—that of the well-meaning but hopelessly logical computer program.

It just turns out that these are the breaks. Computers have a Viewer-Friendly Interface, an Omniscient Database, and can recognize plain speech, but they are notoriously bad at understanding figurative language. So if you tell your Robot Buddy or AI to "Give me a break", it will try to snap your legs. Tell it to "get lost" or "take a hike", and it'll wander off by itself. And so on.

Another common interpretation comes to us courtesy of the old days of MS-DOS. The computer prompt, not having any idea what your ultimate goal is, needs everything explained to it one step at a time. And heaven help the poor computer user who can't figure out what the correct command prompts are. You Can't Get Ye Flask is a common result of this kind of confusion.

Of course, You Can't Get Ye Flask is also much less fun than this trope, since a normal computer will just not do anything when given instructions it can't understand. So when wackiness is the goal, only The Genie in the Machine can provide the fun we desire while still maintaining a vestige of non-magical command prompts.
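For the programmers in the audience, the difference between the two failure modes can be sketched in a few lines of purely hypothetical code: a rigid parser does nothing with phrasing it doesn't recognize, while a Literal Genie always *does* something with the exact words you said. (All the command strings and responses below are invented for illustration.)

```python
# Toy sketch contrasting the two failure modes described above.

# A rigid parser: only exact, expected phrasings do anything at all.
RIGID_COMMANDS = {
    "get flask": "Taken.",
    "go north": "You walk north.",
}

# A literal genie: figurative phrases are mapped to their painfully
# literal readings, and anything else is executed word-for-word.
LITERAL_IDIOMS = {
    "give me a break": "Attempting to fracture user's leg...",
    "get lost": "Wandering off with no destination...",
    "take a hike": "Commencing 10-mile walk...",
}

def rigid_parser(command: str) -> str:
    # Classic text-adventure behavior: refuse anything unexpected.
    return RIGID_COMMANDS.get(command.lower(), "You can't get ye flask.")

def literal_genie(command: str) -> str:
    # Genie behavior: always act, just never on what you meant.
    return LITERAL_IDIOMS.get(
        command.lower(), f"Executing literally: {command!r}")

print(rigid_parser("seize yon flask"))   # -> You can't get ye flask.
print(literal_genie("give me a break"))  # -> Attempting to fracture user's leg...
```

The joke, of course, is that only the second function generates plots.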

The machine version of the Literal Genie.



     Anime and Manga  

  • In the murder-mystery episode of Haruhi Suzumiya, Haruhi asks Yuki (sort of a computer) to lock the door and not let anybody in. Later, she asks to be let in, and Yuki refuses. Kyon gets her to let them in by telling her the order has been cancelled. He then guesses that the ordeal might have been Yuki's awkward attempt at a joke. It has also led others to muse that, for whatever reason, Kyon's commands override anything else, which makes sense given Yuki's absolute loyalty to him.


  • The plot of the movie Space Camp is jumpstarted when Max wishes he could go into space within earshot of the robot Jinx. Having run a simulation indicating a 4.9-million-year wait for the specific accident Max needs, the robot engineers one himself: Jinx ignites a single booster rocket, which would have flipped the shuttle straight into a nosedive, forcing Mission Control to launch the shuttle for real.


  • After Isaac Asimov worked out three simple laws that would supposedly render all robots totally safe, just about every one of his robot stories went on to show how a robot could stick to the letter of the law and still cause an awful lot of trouble.
    • In the short story "The Evitable Conflict", this leads to the robots outright taking over the world, since the Laws of Robotics insist that they always take action to save human lives when possible, which precludes standing idly by while we get on with killing each other. Mind you, the author saw this as a good thing.
      • He saw it as a good thing because, unlike in some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to take action only against the worst acts and to make sure no human realized the robots were in charge; indeed, until the Spacer Era, the worst thing a robot ever did to a human was to transfer a factory director to a slightly less prestigious posting.
    • Another story, appropriately titled "Little Lost Robot", deals with a robot named Nestor that was told to "lose itself" by a disgruntled employee. Nestor does precisely what he's told, disguising himself among 62 other robots, which are physically identical but lack Nestor's modified version of the First Law of Robotics.
      • Susan Calvin points out that advanced robots like Nestor possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, smarter, etc. than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that make them incapable of ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is one of the stupidest things a person could ever do in her opinion. In this case, the robot (though capable of understanding the nuance of the command to "get lost") decided to take it literally as a way of acting out against its human masters. Their initial failures to identify it only serve to reinforce this "rebellious" line of thinking and Calvin warns everyone that the longer they take to resolve the situation, the more dangerous the robot could become.
    • Then there's "Robot AL-76 Goes Astray", about a mining robot that was supposed to be shipped offworld but whose crate ended up on Earth, somewhere in the American midwest. Programmed for a different planetary environment, the robot went a little bit insane (while still following the Three Laws of Robotics), and in an attempt to fulfill its programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed ... and ended up building the world's first fully functional disintegration cannon, run off a standard electric torch battery. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave it an instruction along the lines of "oh, forget it", whereupon the robot destroyed first its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.
    • The Naked Sun featured robots fully capable of harming humans, because either (a) they had been given a deliberately limited definition of what constituted "human", or (b) the robot was unaware that its actions could result in a human coming to harm.
    • The Three Laws work pretty much perfectly most of the time at keeping robots obedient and safe. It's just that less sophisticated models don't grasp the nuances of instructions or human tone, and more advanced robots are often stated to work by differentials between the laws, so when a low-priority law (such as self-preservation) is in strong effect but a higher-priority one is invoked to override it, the "stress" can cause unexpected behaviors. The predictable, safe, everyday functioning just doesn't make for interesting stories.
  • In Larry Niven's novel A World Out of Time, the protagonist averts this trope by remembering at the last moment not to tell his computer to "forget about it."
  • In Alastair Reynolds' "Nightingale", the insane computer running the hospital ship promises to return the protagonists "in one piece." When the last woman standing takes the computer up on its offer, she discovers that what the computer really means is all of the characters alive ... and welded together into one body, so perfectly that nobody can figure out how to undo it.
  • In David Brin's The Practice Effect, the protagonist makes the mistake of spending some time musing aloud about the various things he's going to need his robot buddy (think R2-D2's great great grandfather) to do while he's stranded on a hostile world. A little while later, he realizes the robot has gone and despairs that it's lost forever, following really vague instructions to "gather information".
  • In Terry Pratchett's book Strata, a genie appears. Whether it is a machine or not is open to interpretation.

     Live Action TV  

  • On Knight Rider, KITT's Evil Twin, KARR, was ordered to defend itself, so it immediately locked itself down and refused to follow any other orders, since they might lead it to its destruction.
  • A hologram of Professor Moriarty was gifted with sentience on Star Trek: The Next Generation ("Elementary, Dear Data") after Geordi foolishly told the holodeck to "create an adversary capable of defeating Data," rather than one capable of defeating Data's portrayal of Sherlock Holmes.


     Web Original  

  • Red vs. Blue has Lopez the robot building an army of robots for Omnicidal Maniac O'Malley. O'Malley then orders them to attack, and they charge ... at a pace slower than walking. Why so slow? "You asked for a day of victory." The robots were set to win in exactly 24 hours.

     Video Games  

  • In The Elder Scrolls IV: Oblivion, the "Radiant AI" system, which was supposed to provide NPCs with aspirations and goals, turned out to be much less impressive than originally hyped. Apparently, it proved far more difficult to do well than Bethesda expected.
    • In one example documented during beta testing, one NPC was assigned to rake leaves and another to sweep, but the raker was given the broom and the sweeper the rake. Rather than trade their respective instruments, the one with the rake killed the other, looted the broom from its corpse, and began sweeping.
    • Another problem they had was that the world continued running in the background at all times. So one plot that required you to talk to a drug dealer always failed, because the drug dealer was always killed by the addicts for his drugs before you got that far in the game. This problem was never completely fixed—in the expansion, Shivering Isles, a certain quest was almost impossible to complete because the NPC was killed for stealing spoons before the player could talk to him. The fix for this was a patch that just made the NPC immortal.
      • Though, the Shivering Isles example IS somewhat justified. This is the Shivering Isles after all; the realm of madness, and the NPC in question has a serious obsession with cutlery.
    • They also tried to include other adventurers, but this also turned out not to work right. Their programming told them to adventure, which they did—much, much better than the player. They hogged all the items so that the player couldn't get them.


  • While not quite a computer, Castle Heterodyne of Girl Genius tends to interpret orders in whatever way allows it to have the most fun (read: cause the most casualties). Thus, when Agatha tells it there are people after her, it immediately tries to send helpful minion Moloch through a trap door ("Ah. Then perhaps you should have said: 'The people after us.'"), and when she takes it up on its suggestion to keep her enemies out of Mechanicsburg airspace, it interprets this as permission to send the Torchmen not only after the fake Heterodyne's airship, but Castle Wulfenbach as well.
    Agatha: "I am going to have to think twice about everything I say to you, aren't I?"
    Castle Heterodyne: "It'll be fun!"

     Western Animation  

  • One episode of Muppet Babies used this in the process of parodying as many sci-fi tropes as possible. "Gross me out" and "I need a bath" were among the "wishes".
  • Mercenaries steal the X-1 jet in an episode of The Venture Bros. H.E.L.P.eR, the show's Robot Buddy, is on board. When Brock instructs him to return to base, instead of turning the plane around as hoped, H.E.L.P.eR obediently jumps out of the moving aircraft and promptly craters into the ground.
  • One U.S. Acres segment on Garfield and Friends featured a weather-making robot who operated on voice commands. Unfortunately, it was pretty vulnerable to making anything fall from the sky, especially when you insulted it by calling it a "bucket of bolts" or "overgrown vacuum cleaner".
  • The Flintstone Kids: In "Philo's D-Feat", Philo builds a robot that, when given commands, either obeys them in the most literal way possible or explains that they can't be done. When Philo says "make my bed", the robot builds a new bed that looks exactly like the one Philo already had. When Rocky Ratrock commands the robot to shoplift, it literally lifts a shop.

     Real Life  

  • Those tropers old enough to have worked with type-in programs, or with DOS on a daily basis, know that this was all too often Truth in Television. Pre-Windows MS Word especially seemed to have at least three keys, any one of which would instantly delete all your work.
  • Computer programming makes this painfully clear, as even tiny errors in design, formatting, or typography can cause hilarious, annoying, or even potentially disastrous behaviors, like the backward-flying dragons in the initial Skyrim release, the Zune new-year crash, or the shutdown of an entire power grid in the 2003 Northeast blackout.
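  • The Zune crash is a particularly tidy illustration: the clock driver's year-counting loop had no exit case for the last day of a leap year, so on December 31, 2008 every Zune 30 froze on boot. A simplified reconstruction in Python (the shipped code was C; the branch marked below is the one the original omitted) looks like this:

```python
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def days_to_year(days: int) -> int:
    """Convert a day count (day 1 = Jan 1, 1980) into a year, the way
    the Zune's clock driver did. The shipped version lacked the
    `else: break` branch, so with days == 366 in a leap year nothing
    was subtracted and the loop spun forever."""
    year = 1980
    while days > 365:
        if is_leap_year(year):
            if days > 366:
                days -= 366
                year += 1
            else:
                break  # Dec 31 of a leap year: the missing exit case
        else:
            days -= 365
            year += 1
    return year
```

    One untested branch, reachable exactly once every four years, was enough to brick an entire product line for a day.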