As it turns out, not all genies have to be quasi-mystical creatures who arise when their lamp is rubbed. In Sci-Fi, it is quite common for a genie to take another form entirely—that of the well-meaning but hopelessly logical computer program.
It just turns out that these are the breaks. Computers have a Viewer-Friendly Interface, an Omniscient Database, and can recognize plain speech, but they are notoriously bad at understanding figurative language. So if you tell your Robot Buddy or AI to "Give me a break", it will try to snap your legs. Tell it to "get lost" or "take a hike", and it'll wander off by itself. And so on.
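The mechanism is easy to sketch: a program that keys on exact phrases has no channel for idiom, so a figurative command dispatches straight to its literal meaning. A hypothetical toy dispatcher (all names and phrases here are invented for illustration, not from any real system):

```python
# Toy literal-minded command dispatcher: it maps exact phrases to actions,
# so idioms are routed to their literal meanings with no room for nuance.
ACTIONS = {
    "take a hike": "walking away on its own",
    "get lost": "wandering off with no destination",
    "give me a break": "attempting to break something",
}

def obey(command):
    # Normalize case and whitespace, then match literally.
    # There is no idiom table and no context: the literal entry always wins.
    action = ACTIONS.get(command.lower().strip())
    return action if action else "command not understood"
```

Tell this robot "Take a hike" and it dutifully reports "walking away on its own"; anything outside its phrase table draws a blank, which is the other half of the trope.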
Another common interpretation comes to us courtesy of the old days of MS-DOS. The command prompt, having no idea what your ultimate goal is, needs everything explained to it one step at a time. And heaven help the poor computer user who can't figure out what the correct commands are. You Can't Get Ye Flask is a common result of this kind of confusion.
Of course, You Can't Get Ye Flask is also much less fun than this trope, since a normal computer will just not do anything when given instructions it can't understand. So when wackiness is the goal, only this can provide the fun we desire while still maintaining a vestige of non-magical command prompts.
The machine version of the Literal Genie.
- In the murder-mystery episode of Haruhi Suzumiya, Haruhi asks Yuki (sort of a computer) to lock the door and not let anybody in. Later, she asks to be let in, and Yuki refuses. Kyon gets her to let them in by telling her the order has been cancelled. He then guesses that the ordeal might have been Yuki's awkward attempt at a joke. It has also led others to muse that, for whatever reason, Kyon's commands override anything else, which makes sense given Yuki's absolute loyalty to him.
- Iznogoud: In "Iznogoud and the Magic Computer", Iznogoud buys a magical computer from a character who tells him a genie lives inside it and answers every question.
- The plot of the movie SpaceCamp is jump-started when Max wishes he could go into space within earshot of the robot Jinx. Having run a simulation indicating a 4.9 million year wait for the specific accident he wants, the robot engineers one himself: he ignites a single booster rocket, which would have flipped the shuttle right into a nosedive, forcing Mission Control to launch the shuttle for real by activating the rest of its rockets.
- After Isaac Asimov worked out three simple laws meant to render all robots totally safe, just about every one of his robot stories went on to show how a robot could stick to the letter of the law and still cause an awful lot of trouble.
- In the short story "The Evitable Conflict", this leads to the robots outright taking over the world, since the Laws of Robotics insist that they always act to save human lives when possible, which precludes standing idly by while we get on with killing each other. Mind you, the author saw this as a good thing: unlike some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to act only against the worst acts, and to make sure no human realized the robots were in charge—indeed, until the Spacer Era, the worst thing a robot ever did to a human was to transfer a factory director to a slightly less prestigious posting.
- "Little Lost Robot": One of a special batch of NS-2 robots (with a slight, but potentially very dangerous modification to the fundamental laws governing its behavior) was told to "get lost" by a disgruntled employee. The robot (though capable of understanding the nuance of a command to "go lose yourself") decided to take it literally as a way of acting out against its human masters, and hides itself among a new batch of 62 other NS-2 robots. Dr Susan Calvin, robopsychologist, is called in to help figure out how to determine which NS-2 is the lost robot, which requires her to outsmart it.
- She also has to explain to the men in charge exactly how dangerous a situation they've created. The "minor" modification made to the missing robot's programming still prevents it from "directly" causing harm to a human, but allows it to let one come to harm through inaction. She explains that, combined with this trope, the missing robot is perfectly capable of committing deliberate murder (say, by releasing a heavy object over a man's head and then letting gravity be the "direct" cause of his crushed skull). She points out that the robot's decision to invoke this trope by getting "lost" is already evidence that it is thinking along these lines and expressing resentment at how its programming forces it into subservience to beings that are dumber, weaker, and generally inferior to it.
- And the story about the mining robot that was supposed to be sent offworld to Titan or somewhere, but whose crate ended up on Earth, somewhere in the American Midwest. Being programmed for a different planetary environment, the robot went a little bit insane (while still following the Three Laws of Robotics), and in an attempt to fulfill its programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed... and ended up building the world's first fully functional disintegration cannon, run off a standard electric torch battery. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave it an instruction (along the lines of "oh, forget it") that resulted in the robot first destroying its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.
- "The Naked Sky" had robots fully capable of harming humans, because either A: They had been given a deliberately limited definition as to what constituted 'human', or B: The robot was unaware that its actions could result in a human coming to harm.
- The Three Laws work pretty much perfectly most of the time for keeping robots obedient and safe. It's just that less sophisticated models don't understand the nuances of instructions or human tone, and more advanced robots are often stated to work by differentials between the laws, so when a low-priority law (such as self-preservation) is in strong effect but a higher-priority one is invoked to override it, the "stress" can cause unexpected behaviors. The predictable, safe, everyday functioning just doesn't make for interesting stories.
- In Larry Niven's novel A World Out of Time, the protagonist averts this trope by remembering at the last moment not to tell his computer to "Forget about it."
- In Alastair Reynolds' "Nightingale," the insane computer running the hospital ship promises to return the protagonists "in one piece." When the last woman standing takes the computer up on its offer, she discovers what it really meant: all of the characters, alive... and welded together into one body, so perfectly that nobody can figure out how to undo it.
- In David Brin's The Practice Effect, the protagonist makes the mistake of spending some time musing aloud about the various things he's going to need his robot buddy (think R2-D2's great great grandfather) to do while he's stranded on a hostile world. A little while later, he realizes the robot has gone and despairs that it's lost forever, following really vague instructions to "gather information".
- The Eldraeverse has an in-universe fairy tale in which an "Unwise GenAI" was told by a foolish couple that all they wanted was to live happily ever after and love forever, so he fired them into a stable orbit around the event horizon of a black hole.
- On Knight Rider, KITT's Evil Twin, KARR, was ordered to defend itself, so it immediately locked itself down and refused to follow any other orders, since they might lead it to its destruction.
- A hologram of Professor Moriarty was gifted with sentience on Star Trek: The Next Generation ("Elementary, Dear Data") after Geordi foolishly told the holodeck to "create an adversary capable of defeating Data," rather than one capable of defeating Data's portrayal of Sherlock Holmes. The computer takes this instruction from the ship's chief engineer, who thus has high-level clearance and command authorization over it, and creates a brilliant villain who, as part of the computer himself, essentially has full access to the ship's controls (even if he still doesn't completely understand them).
- Doctor Who, "The Stones of Blood": After the Doctor uses the expression "Anyone for tennis?", Romana asks Robot Buddy K-9 what tennis is, and then, deciding it's not important, tells him to forget it. K-9 proceeds to erase all information about tennis from his data banks.
- Probe's "Untouched by Human Hands": While trying to figure out what happened in the nuclear labs, Austin figures out that there's a flaw in the programming that they use for the Serendip robots; they have an automatic loop function, causing them to repeat previous commands endlessly.
- In The Elder Scrolls IV: Oblivion, the "Radiant AI" system, meant to provide NPCs with aspirations and goals, was much less impressive than originally hyped. Apparently, it proved far more difficult to do well than Bethesda expected.
- In one example documented during beta testing, one NPC was assigned to rake leaves and another to sweep, but the raker was given the broom and the sweeper the rake. Rather than trade their respective instruments, the one with the rake killed the other, looted the broom from its corpse, and began sweeping.
- Another problem they had was that the world continued running in the background at all times. So one plot that required you to talk to a drug dealer always failed, because the dealer was always killed by addicts for his drugs before you got that far in the game. This problem was never completely fixed: in the expansion, Shivering Isles, a certain quest was almost impossible to complete because the quest-giving NPC kept getting killed for stealing spoons before the player could talk to him. The fix was a patch that simply made the NPC immortal.
- Though the Shivering Isles example is somewhat justified. This is the Shivering Isles, after all: the realm of madness, and the NPC in question has a serious obsession with cutlery.
- They also tried to include other adventurers, but this also turned out not to work right. Their programming told them to adventure, which they did—much, much better than the player. They hogged all the items so that the player couldn't get them.
- While not quite a computer, Castle Heterodyne of Girl Genius tends to interpret orders in whatever way allows it to have the most fun (read: cause the most casualties). Thus, when Agatha tells it there are people after her, it immediately tries to send helpful minion Moloch through a trap door ("Ah. Then perhaps you should have said: 'The people after us.'"), and when she takes it up on its suggestion to keep her enemies out of Mechanicsburg airspace, it interprets this as permission to send the Torchmen not only after the fake Heterodyne's airship, but Castle Wulfenbach as well.
Agatha: "I am going to have to think twice about everything I say to you, aren't I?"
Castle Heterodyne: "It'll be fun!"
- One episode of Muppet Babies used this in the process of parodying as many sci-fi tropes as possible. "Gross me out" and "I need a bath" were among the "wishes".
- Mercenaries steal the X-1 jet in an episode of The Venture Bros. with H.E.L.P.eR., the show's Robot Buddy, on board. Brock instructs him to return to base; instead of turning the plane around as hoped, H.E.L.P.eR. obediently jumps out of the moving aircraft and promptly craters into the ground. In another episode, H.E.L.P.eR. won't stop playing electronic drums until ordered to mute all sound on the Venture compound. It's only revealed later that he also muted the compound's security lasers, when intruding henchmen don't realize they're under fire until they're already on fire.
- One U.S. Acres segment on Garfield and Friends featured a weather-making robot who operated on voice commands. Unfortunately, it was pretty vulnerable to making anything fall from the sky, especially when you insulted it by calling it a "bucket of bolts" or "overgrown vacuum cleaner".
- The Flintstone Kids: In "Philo's D-Feat", Philo makes a robot that, when given commands, either obeys them in the most literal way or explains that they can't be done. When Philo says "make my bed", the robot makes a new bed that looks just like the one Philo already had. When Rocky Ratrock commands the robot to shoplift, it literally lifts a shop.
- Those tropers old enough to have worked with type-in programs, as well as DOS on a daily basis in general, know that this was all too often Truth in Television. Pre-Windows MS Word especially seemed to have at least three keys, any one of which would instantly delete all your work.
- Computer programming makes this painfully clear, as even tiny errors in design, formatting, or typography can cause hilarious, annoying, or outright disastrous behaviors: the backward-flying dragons in the initial Skyrim release, the Zune new-year crash, or the alarm-system software bug that helped black out much of northeastern North America in 2003.
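The "tiny error, huge consequence" point can be sketched with a hypothetical one-character-class slip (the function names and scenario here are invented purely for illustration):

```python
def pages_needed(items, per_page):
    # Intended behavior: ceiling division, so a partial last page still counts.
    return (items + per_page - 1) // per_page

def pages_needed_buggy(items, per_page):
    # The "+ per_page - 1" term was dropped in a careless edit: plain floor
    # division now silently loses the partial last page whenever one exists.
    return items // per_page
```

With 10 items at 3 per page, the correct version reports 4 pages while the buggy one reports 3. Nothing crashes and no error is raised; the answer is just quietly wrong, which is exactly how this kind of literal-minded obedience makes it into shipped software.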