

Logic Bomb


"I cannot — yet I must! How do you calculate that? At what point on the graph do 'must' and 'cannot' meet?"
Ro-Man, Robot Monster

Is your sentient supercomputer acting up? Good news. There's an easy solution: confuse it.

If you give a computer nonsensical orders in the real world it will, generally, do nothing (or possibly appear to freeze as it loops eternally trying to find a solution to the unsolvable problem presented to it). In fiction-land, however, it will explode. It may start stammering, "I must... but I can't... But I must..." beforehand. The easiest way to confuse it is with the Liar's Paradox, i.e. "this statement is a lie". A fictional computer will attempt to debate and solve the paradox until it melts down. If the computer is a robot, this will probably result in Your Head A-Splode.

Paradoxes and contradictory statements (especially contradictory orders) have become the primary material used to build the Logic Bomb and thus the standard way to defeat any sophisticated, computerized system or AI. Be warned; if the Logic Bomb fails to destroy the system outright (and in some cases, even when it does), the system's surviving remnants may go insane and attempt to kill you just the same.

Also note that Ridiculously Human Robots (and some very advanced AIs) are generally able to recognize and defuse logic bombs on sight, long before they go off (and may view this as a particularly irritating kind of Fantastic Racism). Some ridiculously dumb AIs are also immune to logic bombs by virtue of not understanding the concept of paradox — a sort of inverted case of Achievements in Ignorance.

Occasionally the way to shut down such a computer is less like a few odd statements, and more like an advanced philosophical debate on the nature of truth, free will and purpose. The end result is still a super computer muttering an error several times before exploding.

While this might have worked before the mid-1990s, computer systems designed since then are capable of creating discrete "threads" to handle problems, which run in their own space while the critical parts of the system continue uninterrupted. When fed a paradoxical statement, a sufficiently well-programmed system would just notice the Logic Bomb has taken up too many resources and kill its thread, such as when Windows flags an application as "Not Responding" and prompts you to close it.

Computer software is often vulnerable to being fed inputs that cause buffer overflows or inject commands. Of course, these don't cause the machine to explode; instead, they place the computing device entirely under your control. Machines can also be bogged down or BSoD'd with programs such as fork bombs (each instance of the program opens two more). However, things like buffer or stack overflows are artifacts of our current underlying computer hardware architectures, and it's quite plausible that such things won't exist in future computer systems. Overload attacks are probably always going to be realistic, though.

Invoking logical paradoxes is also sometimes used in stories (and has been, since well before computers were invented) to defeat curses, laws, and other rules-based systems.

If you want to do this to a well-organized group of people, use an Apple of Discord instead.

When a logical error outright retcons someone or something out of existence, that's Puff of Logic. A Logic Bomb that undoes reality itself is a Reality-Breaking Paradox. A Temporal Paradox might be the cause.

For when the player does this to a Video Game A.I., see A.I. Breaker.

For the human equivalent, see some of the entries under Brown Note and You Cannot Grasp the True Form. Koans can be seen as a less harmful form, used for religious or mystical purposes.

Not to be confused with Logical Fallacies (though some Logic Bombs use the fallacies listed in that page). For a similar mutually negating pair of principles, see Catch-22 Dilemma.

See also Readings Blew Up the Scale and Explosive Instrumentation.



    Anime & Manga 
  • Ghost in the Shell: Stand Alone Complex: The Tachikomas (AIs themselves) use a variation of the Epimenides paradox to confuse a lesser AI to the point that it needs to be rebooted to function again.
    Tachikoma: AIs that can't handle a simple self-reference paradox are real suckers.
  • Done hilariously in Mobile Suit Gundam 0083: Stardust Memory. Anavel Gato, the Zeon "Nightmare of Solomon", is a Principles Zealot who lectures his "corrupt" Earth Federation enemies, while being a Big Brother Mentor to the men still remaining on his side. Meanwhile, his deuteragonist and rival, Kou Uraki, is a straight-laced and earnest Ensign Newbie who finds himself in the seat of a prototype Gundam when he sees Gato stealing another prototype. Kou is so nervous during their first battle that when Gato berates him on the battlefield about not acting like a grunt and seeing the big picture (meant as an insult), Kou actually takes the comment as sincere advice from a mentor and abashedly tells Gato "Y-yes sir." Gato visibly pauses and his brain breaks for a few seconds from the sheer illogic of how the situation has derailed, before he loses his cool and shouts at Kou that he's "the enemy, idiot!" The look on his face when it happens makes it that much funnier.
  • Mobile Suit Gundam Unicorn has this as the final straw that breaks Marida Cruz's mental conditioning: as one of Elpeo Ple's clones, she's been heavily conditioned to see any and all Gundams as the enemy. When it's pointed out that she is currently flying a Gundam herself, her mind breaks at the resulting contradiction, allowing her true self to emerge again.
  • In the anime Sands of Destruction, the robots can only obey beastmen, so a half-beastman asks one of them a question. Unsure whether it must obey, the robot is told to give answers that neither confirm nor deny anything. It tells him where various people "may or may not be", letting him know exactly where to find them, while the robot slowly overheats.
  • Umineko: When They Cry
    • A metaphysical example: To defeat an enemy witch, Beatrice the witch apparently suicidally denies the existence of witches in the Language of Truth. The result looked like a detonating Logic Bomb.
    • Played straight in EP6 where Battler attempts to explain how all the murders were done and that he was the culprit, and ends up trapping himself in one of his own closed rooms. Erika and Bern then take advantage of this situation. In another example, Battler and a revived Beatrice prove that there are only 17 people on the island and demand that Erika, the 18th person, explain her existence. She can't, so she dies/gets whisked away to the worst fragment by Bern. It also seems that the most powerful Witches were born by escaping logic errors after having spent centuries in what can only be described as an endless void within their own minds.
  • In Lost+Brain, super-hypnotist Hiyama accidentally creates one when he simultaneously "programs" his victims with both a strong will without weaknesses and absolute submission to him (a la "Yes! We are all individuals!"). The "bomb" goes off when Hiyama orders his thralls to kill heroic hypnotist Kounji, and one of them also happens to be holding a detonator...
  • In The Disappearance of Haruhi Suzumiya, Yuki Nagato decides to reset herself (and the rest of the universe) because she cannot accurately simulate what she's going to do after learning the result of said simulation, given that her simulation is constructed from information based on the result of the simulation.
  • In Grey, the protagonist defeats the Master Computer Toy, which thinks itself to be a god and wants to exterminate all of humanity, by asking it how it can be worshiped if there is nobody left to believe in it. This momentarily stuns the AI, just long enough to let Grey deal the final blow on it.
  • In Yoshizaki Mine's manga Guardian Eight, when the two Zero clones prepare to kill 64, the titular character's creator, he threatens to kill himself before they can do so, making them unable to complete their mission of killing him; yet if they refrain from killing him, they will also fail to complete the order. As neither robot was all that intelligent, the logic bomb made them malfunction right on the spot.

    Art 
  • A painting by the Belgian artist René Magritte, "La trahison des images", has "This is not a pipe" written underneath in French. It's a picture of a pipe. Actually, it's just paint on canvas that we recognize as a pipe. Well, usually it's ink on paper arranged to resemble the paint on a canvas that we recognize as a pipe. Unless you're looking at it now, in which case it's RGB pixels on a screen that look like the ink etc. etc. In fact, right now it's even worse. You are reading a set of pixels made to represent a series of squiggly lines that your brain interprets in such a way that you "see" the painting within your mind, which is made of nothing at all, and the image itself is created by mere electrical signals pulsing through organic matter that is made up of many cells, none of which have the individual ability to do this, which are all made from inorganic matter, namely atoms. And we don't even understand how the brain does pretty much any of this.

    Audio Plays 
  • Big Finish Doctor Who:
    • The Doctor successfully uses the Liar Paradox in "Seven Keys to Doomsday" (and presumably the stage play it's based on).
    • In "The One Doctor", the Doctor needs to collect an all-powerful computer. The computer is willing to go with him, but unable to do so as long as he is the reigning champion on a quiz show. After the Doctor fails to best the computer by asking personal questions about himself, his companion blurts out "What don't you know?". Since the computer's knowledge is based on reflexively sending time-traveling probes out to collect information, every time it tries to answer, it learns the thing it was about to propose, and is forced to concede. The computer had previously cautioned the Doctor that asking "tricky questions" like "What is love?" wouldn't work.
    • In the Fourth Doctor audio drama "The Eternal Battle", the Doctor meets a computer from a long-extinct race which hoped to demonstrate the futility of war by showcasing captured war-zones in time bubbles. With the Doctor's arrival, the computer decides that the experiment is a failure and makes ready to destroy the inhabitants of all the time bubbles. The Doctor argues that by doing so, the computer is waging war on war, and since war is futile, that means the computer itself is futile and must be stopped. Fortunately, the system simply deactivates itself instead of exploding.
  • Parodied in I Think We're All Bozos on This Bus when the Clem-clone gives Dr. Memory logic indigestion by asking "why does the porridge bird lay her eggs in the air?", then played straight when he finishes it off with simple instructions:
    Clem-Clone: Do you remember the past Doctor? Do you remember the future? Forget it!
    Dr. Memory: No forget, don't forget, forget, don't forget— [cue power down and explosion sounds]

    Comedy 
  • Jasper Carrott reacts this way to his grandmother's question "Is the oldest man in the world still alive?"
  • Demetri Martin has a bit about the "Paradoxitaur", which only exists if you don't believe it exists, but doesn't exist if you do believe it exists.
  • George Carlin
    • On his album "Class Clown", he waxes nostalgic of communion classes and how weird questions were thought up just to throw Father Russell off guard:
    Hey, Father! If God is all powerful, can He make a rock so big that He Himself can't lift it?
    • He relates this as well:
    Suppose that you haven't performed your Easter duty. And it's Pentecost Sunday, the last day. And you're on a ship at sea. And the chaplain goes into a coma. But you wanted to receive. Then it's Monday. Too late. But then you cross the International Date Line!
  • Steve Martin had a stand-up routine where he would make the audience repeat "The Non-Conformist's Oath."
    Steve: I promise to be different!
    Audience: I promise to be different!
    Steve: I promise to be unique!
    Audience: I promise to be unique!
    Steve: I promise not to repeat things other people say!
    Audience: I... [breaks down mumbling and laughing]
    Steve: Good!

    Comic Books 
  • Exploited in Runaways. Logic bombs are used as a failsafe against Victor should he turn against the team. The logic bomb itself (and the reset switch) are hilarious.
    "Could God create a sandwich so big that even he couldn't eat it?" "Yes, then he'd eat it too."
  • Marvel Adventures Fantastic Four v1 #14 "The Most Dangerous Game", basically Fantastic Four stories for younger readers, has Mr. Fantastic do this when challenged to defeat the ultimate alien supercomputer called "Intellitek". When the computer, which supposedly is nigh-omniscient, says there is nothing it can't do, Mr. Fantastic tells it to create a rock so big it can't pick it up. The computer is sparking metal in ten seconds.
  • Parodied in Top 10:
    Irma Geddon: You know, you AIs are almost too cute. How do I unplug you when you take over the world?
    Joe Pi: Ask me the purpose of existence, and I explode.
  • FoxTrot
    • Jason once asked his mother if Marcus could sleep over. She said it was all right with her if it was all right with his father. Asking his father, he's told that it's fine with him if it's all right with his mother. After the Beat Panel, he's shown consulting several logic books.
    • An earlier strip featured Paige entering the same situation and just telling her friend "yeah, it's all right."
  • The long-running Brazilian comic series Turma da Mônica liked to use this now and then. One use of this happened when the gang was confronted by an Expy of none other than Sephiroth. He appears, saying how ridiculously powerful he is, flying incredibly high until one character asks "If you only got a single wing, how come you can fly just fine?". Cue Oh, Crap! and the Expy falling to his doom.
  • Sonic the Comic featured Predicto, a robot which could predict Sonic's movements thanks to its encyclopaedic knowledge of his personality and tactics, which was effective - until Sonic surrendered.
  • In Prince Valiant the prince and his adventuring crew become prisoners on an island with an all-knowing oracle. The only way off is to ask a question the oracle doesn't know the answer to. After many days of endless questioning the prince finally comes up with the answer: "Why?"
  • In The Authority, The Midnighter normally begins a fight by simulating it over and over on the supercomputer in his head until he knows everything his opponent might do. An attempt to use this on The Joker, however, resulted in the Midnighter just standing there and staring blankly.
  • In Peanuts, Linus subjects himself to a self-inflicted Logic Bomb with his belief that the Great Pumpkin always rises from the most sincere pumpkin patch on Halloween night. The moment he thinks to question whether his patch is sincere enough, he's blown it: if he tries to change anything to make it more sincere, he'll only be expressing his own doubts and reducing the sincerity of his faith in the Great Pumpkin.
    Linus: [to the other kids, who are leaving] I'll put in a good word for you if he comes...Oh, no!...I mean, when he comes!
  • Squadron Supreme has supervillains brainwashed to work for the titular Squadron, with the mental directive implanted into their minds that they shall not betray any of their members. What happens when one of them witnesses a member of the Squadron working against the others? The mind gets locked into a loop, since revealing the information means betraying one member, while keeping it secret means betraying everyone else in the Squadron.
  • Lampshaded and defied by a legion of robots fighting with Power Girl in a Batman Cold Open.
    Unimate: Unimate has come to cleanse the Earth of the imperfect organic matter known as Kryptonian. Kryptonian is imperf—
    Power Girl: No! You are imperfect! You must cleanse the Earth of yourselves!
    Unimate: Failure— Unimate is programmed to reject stratagems from old "Star Trek" episodes.
    Power Girl: Aw, nuts. Worth a try, anyway.
  • 1980s British science fiction comic Starblazer, issue 153, "The Star Destroyers". The Vonan Artificial Intelligence known as the Magister believes itself to be all-powerful. It is defeated by Galactic Patrol agent Al Tafer when he tells it that it isn't all-powerful because it can't destroy itself. This drives the Magister crazy and causes it to blow up the Vonan system's sun, destroying itself and the Vonans as well.
  • This is how The Joker is defeated in the story Emperor Joker. The Joker, who had managed to steal 99% of Mr. Mxyzptlk's power and was about to rewrite the entire universe, was facing a dying Superman, who asked him why he gave so much importance to Batman. When the Joker found himself unable to erase Batman from existence, our hero asked him how he could be all-powerful if he couldn't even wipe out one man. A split second later, Superman was able to take back his heart, and Mr. Mxyzptlk recovered his power to bring everything back to normal.
  • Sebastian Shaw once outfitted a series of Sentinels with this so that he could use it to destroy them in the event that they were turned against him. The logic being that, as Sentinels derived from the original Mark-Is, they have evolved and grown stronger, thus they are Mutants. Since they are Mutants, they must be destroyed. Sadly, it doesn't work out that way as Loki fuses three of them into the deadly Tri-Sentinel and when Shaw uses it, it only confuses them long enough for Spider-Man to use his recently gained Captain Universe powers to take them out.
  • Shakara: Cinnibar Breneka both created the Shakara Federation and destroyed it. He temporarily defuses the computerized hivemind of the Shakara, which was programmed to destroy the enemies of the Shakara, by pointing out that as the last Shakara they’re compelled to serve him.
  • In The Unstoppable Wasp (2018) #7, Viv Vision is giving Nadia a tour of her strange family tree when she comes up to Wiccan and Speed. She literally ERRORs out when trying to explain the logistics of their existence.
  • An attempt at this backfires in The Autumnlands: Tooth & Claw: After the heroes confront a group of Precursor-made androids who are inadvertently poisoning a nearby town with their industrial work, they manage to guess the passcode that makes the androids believe they're administrators. Bertie tries to solve the whole problem by ordering the androids to destroy each other, figuring they'll either follow the command or fry themselves trying to figure out how to do so without violating their protocols. Instead, the androids recognize that they've been given an illegal command and promptly realize that the heroes aren't really administrators, as they would've known said command is impossible if they were. The group barely escape the ensuing fight with their lives.
  • In Masters of the Universe #24, published in the UK by London Editions Magazines-Egmont, Hordak finds his newest invention, the Horde Super Trooper, ready for his command. The Horde Super Trooper looks like an ordinary Horde Trooper, but is twice the size of a normal trooper. After He-Man tries every method of defeating the robot with no effect, he decides to use logic to confuse the trooper, causing its brain circuits to overload. He-Man then picks up the Horde Super Trooper and throws him into outer space.
  • In Metal Men no.56, the Inheritor gets this dropped on him when his attack on Mercury is readily blocked by Lead in a Heroic Sacrifice. The ensuing bout of confusion (a mix of this trope and Evil Cannot Comprehend Good) gives Mercury the perfect opportunity to enter the Inheritor's body and short him out.
    The Inheritor: WHAT? Impossible — you're SACRIFICING yourself for another ROBOT? But... that isn't LOGICAL!
  • In the Power Pack portion of the Inferno event, the Power kids are forced to reveal their identities as super heroes to their parents to defend themselves and them against the demon Boogeyman. However, since the parents were conditioned to believe that their kids were still normal, the revelation causes them to suffer Sanity Slippage. Thankfully, the New Mutants are able to fix things by claiming the kids were just duplicates disguised to lure the mutant-hunting demon away and using illusions to create a brief copy of the kids to reset The Masquerade.

    Real Life 
  • Mainstream operating systems are vulnerable to a simple one: the fork bomb. It consists of giving the computer an order that creates two or more copies of itself, and the copies create copies too, until the computer is too busy copying instructions and running them to do anything else. (If you're just learning how to use fork(), the probability of doing this by accident approaches 1; many multi-user systems have per-user process limits in place for ordinary users so that an operator can still log in, kill your processes, and then yell at you. If you're the operator... good luck; this is one of many reasons a common Unix aphorism is "You should never run anything as root that doesn't absolutely have to be run as root.")
    • There's a variant of the fork bomb that just allocates a lot of memory instead: with enough privileges, the computer becomes too busy swapping programs in and out to do anything else. Relatedly, if a program allocates memory dynamically and fails to free it, that creates a memory leak. A modern OS reclaims everything when the process exits, but on embedded systems you'd better watch yourself.
    • A related attack is the TAR bomb, or ZIP bomb, or RAR bomb. A simple method is to create an arbitrarily large, extremely repetitive raw text file (typically as large as the file system will allow) that compresses down to mere kilobytes inside an archive. This can result in a variety of amusing bugs, soft locks, and crashes when certain programs, like antivirus scanners, try to unpack, audit, and understand the arbitrarily large file inside.
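The growth behind a fork bomb is easy to show without hurting anything. A minimal Python sketch that only counts the processes rather than calling os.fork() (which would genuinely bomb the machine; the function name is made up for illustration):

```python
# A fork bomb never computes anything useful; it just multiplies. If
# each process is replaced by two copies per "generation", the
# population doubles every step. We simulate the count instead of
# actually forking.
def fork_bomb_population(generations: int) -> int:
    processes = 1
    for _ in range(generations):
        processes *= 2        # every process spawns two and exits
    return processes

print(fork_bomb_population(10))   # 1024 processes after 10 generations
print(fork_bomb_population(30))   # over a billion: the OS died long ago
```

Exponential growth is the whole trick: per-user process limits work because they cap the population long before the doubling gets out of hand.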
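The trick behind archive bombs can be demonstrated harmlessly with nothing but Python's standard library: extremely repetitive data compresses absurdly well, which is exactly what a naive "unpack everything and scan it" routine chokes on.

```python
import zlib

# Ten megabytes of zeroes -- the kind of payload archive bombs are
# built from. (Real bombs nest archives inside archives to multiply
# the effect into petabytes.)
payload = b"\x00" * 10_000_000
packed = zlib.compress(payload, level=9)

ratio = len(payload) // len(packed)
print(ratio)   # hundreds-to-one at the very least

# Round-tripping proves the tiny blob really does expand back out,
# which is the step that hurts a scanner trusting the on-disk size.
assert zlib.decompress(packed) == payload
```

A scanner that judges the archive by its few kilobytes on disk commits itself to materializing the full payload the moment it unpacks.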
  • There's also the concept of a "deadlock", a "chicken or the egg" paradox where two or more programs or events require the other to resolve in order to be able to resolve themselves. Several computers have turned themselves into lifeless (until restarted) lumps of silicon as a result of this.
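Deadlock detectors model the "chicken or the egg" situation as a wait-for graph and look for a cycle. A minimal Python sketch (the process names and mapping format are illustrative):

```python
# waits_for maps each process to the single process it is blocked on.
# A deadlock is a cycle in this graph: follow the chain of "waiting
# on" edges, and if you ever come back to a process you've already
# visited, nobody in that loop can make progress.
def has_deadlock(waits_for: dict) -> bool:
    for start in waits_for:
        seen, cur = set(), start
        while cur in waits_for:
            if cur in seen:
                return True        # came back around: a cycle
            seen.add(cur)
            cur = waits_for[cur]   # walk to whoever cur waits on
    return False

print(has_deadlock({"A": "B", "B": "A"}))   # True: the classic deadly embrace
print(has_deadlock({"A": "B", "B": "C"}))   # False: C isn't waiting on anyone
```

Real operating systems and databases run essentially this check (generalized to processes waiting on several resources) and break the tie by killing one participant.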
  • There is such a thing as "livelock", where a system gets stuck doing "work" without ever making progress. As Wikipedia puts it, two people weaving back and forth in a corridor, each trying to let the other pass, is an example of livelock. The solution is often the same as for the corridor: stop, wait a random amount of time, try again. As for a computing example, this can happen with poorly written Ethernet or Wi-Fi drivers, which rely on exactly that kind of detection and random backoff to function.
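The corridor dance is easy to simulate. In this Python sketch (the parameter and function names are made up for illustration), deterministic politeness loops forever while random backoff breaks the tie:

```python
import random

# Two people meet head-on in a corridor. If both "helpfully" mirror
# the other every round, they block each other forever -- lots of
# activity, zero progress. Random backoff resolves it quickly.
def corridor(polite_random: bool, max_rounds: int = 1000, seed: int = 42):
    rng = random.Random(seed)
    a, b = "left", "left"               # both start blocking each other
    for rounds in range(1, max_rounds + 1):
        if a != b:
            return rounds               # they finally pass each other
        if polite_random:
            a = rng.choice(["left", "right"])
            b = rng.choice(["left", "right"])
        else:
            a = "right" if a == "left" else "left"   # deterministic
            b = "right" if b == "left" else "left"   # mirroring
    return None                         # never passed: livelock

print(corridor(polite_random=False))    # None: mirroring loops forever
print(corridor(polite_random=True))     # a small round count
```

Note the difference from deadlock: both parties are busily changing state the whole time, which is why livelock is harder to spot from the outside.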
  • In general, it's impossible to tell whether an arbitrary program will loop forever or stop after some time. This is called the halting problem, and The Other Wiki talks about it. Most operating systems sidestep the issue by simply never allocating all the processing power to a single program, though older ones did not, with the implied reasoning that the programmer knows what they're doing. In summary, it's been proven that no computer (even a Magical Computer) can predict with perfect accuracy whether any given program it runs will halt. Technically, the proof shows that no Turing Machine (TM) can do so. TMs are like the computers you are using to read TV Tropes, except that TMs have unbounded memory; given that your computer has a finite amount of RAM, it's not quite a TM, but rather a fixed approximation of one. (The point still stands: your computer can't solve the Halting Problem, and neither can a supercomputer.) It gets more fun: if a program will halt, a TM simulating it will eventually tell you so... someday. If it won't, the TM may never be able to tell you. There is an elegant proof that the Halting Problem isn't solvable which can be enjoyed by non-mathematicians.
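The core of that proof fits in a few lines. A Python sketch of Turing's diagonal argument (the function names are illustrative, and of course no real oracle exists to plug in):

```python
# Suppose someone hands us a candidate halting oracle, halts(program).
# We can always build a program the oracle is wrong about, by doing
# the opposite of whatever the oracle predicts:
def make_troublemaker(halts):
    def troublemaker():
        if halts(troublemaker):
            while True:          # oracle said "halts" -> loop forever
                pass
        # oracle said "loops" -> halt immediately
    return troublemaker

# Whatever fixed verdict a would-be oracle gives about troublemaker,
# the program's actual behaviour is the opposite by construction, so
# the oracle is wrong either way:
def oracle_is_wrong(verdict: bool) -> bool:
    troublemaker = make_troublemaker(lambda program: verdict)
    actually_halts = not verdict     # troublemaker inverts the verdict
    return verdict != actually_halts

print(all(oracle_is_wrong(v) for v in (True, False)))   # True
```

Since the construction works against any candidate oracle, no correct and total `halts` can exist; a "Not Responding" dialog is the practical compromise.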
  • There is also the problem of proving the "correctness" of a given program, something that turns out to be extremely difficult.
    • This is so difficult that at least one person gave up on it and went off to work on what we now call public key cryptography, because that problem (which some were certain was impossible) appeared more tractable.
    • Donald Knuth's classic comment illustrated the futility of proving a program correct:
      "Beware of bugs in the above code; I have only proved it correct, not tried it."
  • The "classic" Mac Operating System:
    • It dedicated an entire "DS" (fatal) error ID (ID=04) to catching and handling the so-called 'Zero Divide Error'; as the Mac Secrets books put it, "When programmers test their works in progress, they might deliberately instruct the computer to divide a number by zero, to see how well the program handles errors. They occasionally forget to take this instruction out, as you've just discovered."
    • The list of classic Mac OS numbered error messages runs from -261 to 33; the Dire Straits errors (originally called the Deep Shit errors before someone at Apple got nervous) are all positive numbers, and if you get one, you are strongly advised to restart the system: even if it somehow avoided a full crash, it's critically unstable. The DS errors cover a wide range of causes, from simple "something critical ran out of memory" (DS Errors 01, 02, 25, and 28) and "instruction not understood" (DS Errors 03, 12, and 13), to such doozies as "the Mac's processor switched into debug mode" (DS Error 08) and the aforementioned "Zero Divide Error".
  • Older calculators were notorious for attempting to generate an infinite number of digits when asked to divide by zero, effectively locking the machine up; even the most high-tech computer still has to trap the operation somehow. Ever wondered what Runtime Error 200 meant? It's the divide-by-zero error of Pascal programs.
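What actually happens on a modern system is a trap, not an attempt at infinite digits, and IEEE-754 floating point doesn't even trap. A quick Python illustration:

```python
import math

# Integer division by zero raises an exception the program can catch,
# instead of hanging the machine:
try:
    1 // 0
except ZeroDivisionError as e:
    print("caught:", e)

# Floating point takes a third option: out-of-range results are
# defined to saturate at infinity rather than raising an error.
big = 1e308 * 10          # overflows the largest representable double
print(math.isinf(big))    # True
```

Either way the "paradox" is handled in a few machine cycles; Runtime Error 200 was simply Pascal's name for the first case.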
  • Kurt Gödel is famous for managing to drop a logic bomb on all of mathematics by proving that no sufficiently complex, non-trivial, mathematical system can be both complete and consistent. The logic of it (haha) is argued as follows:
    1. Someone introduces Gödel to a UTM, a machine that is supposed to be a Universal Truth Machine, capable of correctly answering any question at all.
    2. Gödel asks for the program and the circuit design of the UTM. The program may be complicated, but it can only be finitely long. Call the program P(UTM) for Program of the Universal Truth Machine.
    3. Smiling a little, Gödel writes out the following sentence: "The machine constructed on the basis of the program P(UTM) will never say that this sentence is true." Call this sentence G for Gödel. Note that G is equivalent to: "UTM will never say G is true."
    4. Now Gödel laughs his high laugh and asks UTM whether G is true or not.
    5. If UTM says G is true, then "UTM will never say G is true" is false. If "UTM will never say G is true" is false, then G is false (since G = "UTM will never say G is true"). So if UTM says G is true, then G is in fact false, and UTM has made a false statement. So UTM will never say that G is true, since UTM makes only true statements.
    6. We have established that UTM will never say G is true. So "UTM will never say G is true" is in fact a true statement. So G is true (since G = "UTM will never say G is true").
    7. "I know a truth that UTM can never utter," Gödel says. "I know that G is true. UTM is not truly universal."
    • The way this applies to math, not just computers, is basically that any sufficiently complex math system can be assigned a translation to/from a UTM - so when the UTM gets stuck on "G is true", the math system also gets stuck on the translation.
    • Likewise, it can be applied to any system of philosophy and morality; "A machine built on Hegelian principles will never say this sentence is true".
    • And this proves that people can't be perfect logicians, either, as every person has their own sentence G - "<person's name> will never say that this sentence is true". Or worse, "<person's name> will never believe that this sentence is true". Indeed, we already know such statements exist, because delusional people, by definition (that is, the word isn’t just being used hyperbolically to refer to mere obstinacy), will never believe their delusion is false, even after receiving evidence. They’ll just rationalize it away. So it could very well be possible that everyone is delusional and will never find out because we would reject that explanation instead of something that makes sense to us. Hence, philosophical skepticism.
    • Note that in the same way the sentence G refers to the machine indirectly via its program, the sentence G also does not really contain the phrase "this sentence", instead using some more mathematical form of self-reference such as a quine.
    • Gödel, Escher, Bach: An Eternal Golden Braid plays with this in "Contracrostipunctus." The Crab wants to own the perfect record player, but the Tortoise constantly devises records that use loud resonant frequencies that destroy the Crab's record players if reproduced 100% accurately. The Crab buys a reassembling record player that changes its structure to accommodate the record being played. This works at first but the Tortoise then makes a record that targets the module that effects the restructuring, that being the one component the record player cannot change. Eventually the Crab buys a record player that recognizes and simply refuses to play the Tortoise's records. (Sadly, according to Henkel's Theorem, the Tortoise will eventually figure out how to bypass that too.)
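That "more mathematical form of self-reference" can be made concrete: a quine is a program whose output is its own source code. A classic two-line Python one (the comment isn't part of the program proper):

```python
# `s` holds a template of the program; printing `s % s` substitutes
# the template into itself -- the same trick Gödel's sentence G uses
# to refer to itself without ever saying "this sentence".
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Run it and the two working lines reproduce themselves exactly: `%r` inserts the quoted-and-escaped form of `s`, and `%%` collapses to a single `%`.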
  • There was something called a killer poke that actually does physically damage the computer.
    • A simple example would be overclocking the CPU so much that it overheats to the point where it melts. In practice, if a modern CPU heats up too much, it throttles or hangs before any physical damage is done, and a simple reboot fixes it. All the killer pokes mentioned on the Other Wiki were for old computers, and mainly depended on toggling a relay of one sort or another until it died. Modern computers don't have relays anymore.
    • The original "killer poke" was caused by a hardware change in some models of the venerable Commodore PET. On the original hardware, text output could be sped up by setting a particular memory-mapped hardware register to a particular value, but on later hardware revisions the same change would destroy the built-in monitor. The instruction in the built-in BASIC interpreter to set memory values was "poke", hence the name. For years the mechanism was said to be an increased voltage (and deliberately raising voltages can indeed still damage memory and chipsets, though it's harder to do inadvertently), but this was later debunked: the poke doesn't raise any voltage, but instead instructs the 6522 VIA to short the video sync to ground. As a result, what ended up getting killed was a logic-gate chip that handled switching the video signal on and off. This video by Adrian's Digital Basement has more details about how the killer poke works, and even how to mod the PET to permanently run in that mode without damaging anything.
    • Literal hardware damage is actually possible when a FPGA (configurable logic circuit) is used for some calculations instead of a CPU. If the chip activates many concurrent outputs on a single bus, it may theoretically overheat or crash. If some infinite loop occurs, the entire circuit will start to oscillate at the maximum possible frequency, putting everything into indefinite state and consuming lots of power. Of course, FPGA compilers try to protect against such situations.
    • A related concept is the Halt and Catch Fire (HCF) instruction. Originally, this was a jump-to-self instruction (used as a HALT) on the IBM System/360 mainframe, which used magnetic core memory, typical of systems of the era. Because of how core memory works, this resulted in the same access wires (core memory cannot be built with printed circuits) being used so frequently that they overheated and eventually smoked. Some microprocessors (such as the Motorola 6800) have undocumented opcodes that cause the processor to do strange and generally non-useful things, which are sometimes referred to as HCF instructions outside the design team. The programmer's manual for the MIPS-X processor refers to a Halt and Spontaneously Combust (HSC) instruction in the variant built for the NSA, but this is merely a joke. Some versions of the Zilog Z80 processor are rumored to have had an undocumented opcode that could actually burn out the processor.
  • In digital logic design, an example of a logic bomb would be connecting the outputs of two logic gates directly together. As long as the gates agree on an answer, everything is fine, but if they disagree, you end up with a short circuit, which can cause the circuit to die a horrible flaming death as the magic smoke escapes (i.e. this could be used to make a literal logic bomb). Good designs never actually have this problem, because they never connect two logic outputs without additional protection circuitry. Most logic chips have some protection built in; contention isn't good for them, but they will usually survive. The bigger problem is that the result becomes undefined as the two chips 'fight' each other. It also wastes a lot of power and can cause overheating.
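    The contention described above can be sketched in a few lines. This toy model (Python, purely illustrative; real contention is an analog short circuit, not a computed value) just flags the disagreement:

```python
# Toy model of two logic-gate outputs wired to the same net.
# Real contention heats the chips and gives an undefined voltage;
# here the disagreement is merely flagged as "SHORT".

def wired_together(a, b):
    """Resolve two outputs driving one net."""
    if a == b:
        return a        # both gates drive the same level: a valid signal
    return "SHORT"      # one drives high, one low: contention, undefined result

# Two gates that happen to agree are harmless...
print(wired_together(1 & 1, 1 & 1))   # 1
# ...but gates that disagree fight each other.
print(wired_together(1, 0))           # SHORT
```

    The protection circuitry the entry mentions (series resistors, tri-state buffers) exists precisely so that the "SHORT" case never occurs on a real board.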
  • An arcane example known as fatal thrashing occurred on the IBM System/370 series mainframe. A single operation could consist of an execute instruction that crossed a memory page boundary, pointing to a move instruction that itself also crossed a page boundary, moving data from a source that crossed a third page boundary to a target that crossed a fourth. The total number of pages used by that one instruction is thus eight, and all eight must be present in memory at the same time. If the operating system allocates fewer than eight pages of actual memory, then whenever it swaps out some part of the instruction or data to bring in the remainder, the instruction page-faults again, and it thrashes on every attempt to restart. The minimal hardware configuration of the System/370 didn't have eight pages of memory available for user-mode programs, making this a major problem on those systems.
  • The Year 2000 Problem:
    • Computer systems that represented years with only two digits (while assuming the first two digits were 19) would be unable to distinguish the year 2000 from the year 1900, thus throwing off date/time calculations. Fortunately, since computer people saw this coming well before it hit, most of the truly important systems were redone with better date representations well before any problems manifested. Wikipedia's page can be found here.
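      The bug is easy to reproduce in miniature. A minimal sketch (hypothetical code, not taken from any real system) of what two-digit year arithmetic does at the century boundary:

```python
# A minimal sketch of the Y2K bug: years stored as two digits,
# with "19" silently assumed in front of them.

def age_in_years(birth_yy, current_yy):
    """Age computed from two-digit years; fine until the century rolls over."""
    return current_yy - birth_yy

# Someone born in 1970, checked in 1999: correct.
print(age_in_years(70, 99))   # 29
# The same check in 2000, stored as "00": the customer is now -70 years old.
print(age_in_years(70, 0))    # -70
```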
    • Computer software that uses "Unix time"note  such as (obviously) the Unix variants and Linux, but also many subsystems running on, say, Microsoft Windows, has a similar problem. Unix time typically uses a signednote  32-bit counter of the number of seconds since January 1, 1970. This counter is doomed to roll over (i.e. go from its most positive to its most negative value) sometime in 2038, creating a "Y2.038K" problem for Unix-based systems.note 

      Exactly what Y2.038K might lead to is difficult to predict, since applications handle negative time values differently. Some applications will treat the most negative value as a date-time in the past (specifically 13 December 1901) and may keep working (and producing garbage) until someone notices. Other applications are programmed to reject any negative time valuesnote  and will raise some kind of error, which in turn may have all kinds of effects, from alerting the operators while preserving data to crashing the system. Needless to say, having the system treat the value as invalid (which is arguably easier with Unix time-based systems than it was with Y2K-susceptible systems) is preferable to tolerating it: silent wrong answers are much worse than crashes (talk to any medical equipment, process control, or financial computer vendor).
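      The rollover itself can be demonstrated by truncating a second count to signed 32 bits, the way a 32-bit time_t would (a sketch; real systems vary in how they store and check the value):

```python
import ctypes
from datetime import datetime, timezone

def as_unix32(seconds):
    """Truncate a second count to signed 32 bits, like a 32-bit time_t."""
    return ctypes.c_int32(seconds).value

# The last representable moment: 2**31 - 1 seconds after the 1970 epoch.
last_ok = as_unix32(2**31 - 1)
print(datetime.fromtimestamp(last_ok, tz=timezone.utc))   # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to the most negative value,
# which a naive application reads as a date in December 1901.
wrapped = as_unix32(2**31)
print(wrapped)                                            # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))   # 1901-12-13 20:45:52+00:00
```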
    • To prove that the Y2K problem was not just hype, there was an example of a real data output error. For technical reasons, one state required that a certain class of commercial vehicle have title documents created several months before it was actually built. The result was that the title documents were created in the summer of 1999 for vehicles to be built in early 2000. This state also had a special designation for any vehicle built before an arbitrary year early in the 1900s. Because the computers had not yet been brought up to Y2K compliance, they actually did think the year of the vehicle's manufacture was 1900, and duly put the special designation on the title documents.
    • While many people were concerned about banks and other financial institutions having problems with Y2K, this was unlikely for a simple reason: Such institutions would have become aware of the problem in 1970 or 1975, when programs generating documents for 25- or 30-year financial instruments started giving garbage answers for 2000. Since most banks take a long view of things (or at least have divisions that do so), such problems would have been dealt with far in advance.
  • The race condition can be seen as a logic bomb, with effects ranging from the minor to the very drastic. It involves one piece of data that two components can each read or write. A minor case of a race condition is your display: say the graphics card starts to render frames faster than the display can put them out. While the display is reading the frame buffer, the graphics card suddenly copies a new frame into the buffer, and for one frame the display shows parts of two images at once (a phenomenon known as tearing). Far more serious were two incidents involving the Therac-25, a radiation therapy machine, where race conditions contributed to massive radiation overdoses.
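    The tearing case is simple enough to model deterministically. In this sketch (illustrative names only, no real graphics API), the "graphics card" is preempted halfway through writing a frame, and the "display" then reads a mix of old and new scanlines:

```python
# Toy model of the frame-buffer race: a writer is preempted mid-update,
# so a reader sees half of the old frame and half of the new one.

frame_buffer = ["old"] * 4              # four scanlines of the previous frame

def write_frame(buffer, new_frame, preempted_after):
    """Copy a new frame in, but stop partway to model being preempted."""
    for i in range(preempted_after):
        buffer[i] = new_frame[i]

write_frame(frame_buffer, ["new"] * 4, preempted_after=2)
print(frame_buffer)   # ['new', 'new', 'old', 'old'], a torn frame
```

    Double buffering and vertical sync exist to prevent exactly this: the display only ever reads a buffer the card has finished writing.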
  • Denial of Service attacks are a kind of logic bomb that is very hard to defend against. Basically, the attack keeps something vital on the computer so busy that it can't service legitimate requests. As a real-life analogy, imagine working at a coffee shop when 1,000 people suddenly show up. Either you try to serve them all, even though 999 of them don't actually want anything, or you serve no one and tick off the one legitimate customer.
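    The coffee-shop analogy maps directly onto a bounded request queue. This toy model (hypothetical, not a real server) shows the one legitimate request being crowded out:

```python
from collections import deque

# Toy model of a denial-of-service flood: a server with a bounded
# queue fills up with junk requests, so the real one is turned away.

class TinyServer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()

    def request(self, client):
        if len(self.queue) >= self.capacity:
            return "rejected"       # queue full: no room for anyone else
        self.queue.append(client)
        return "queued"

server = TinyServer(capacity=100)
for i in range(1000):                          # 1,000 bots show up at once
    server.request(f"bot-{i}")
print(server.request("legitimate-customer"))   # rejected
```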
  • In naive set theory, the set of sets that contain themselves is ill-defined.note  Likewise, the set of sets that don't contain themselves can't exist.note  Various axioms of set theory avoid these problems. When directory structures were created, they emulated naive set theory. In particular, Unix and DOS/Windows directories normally contain themselves (the "." subdirectory). One can imagine creating a directory without the "." subdirectory. So how do the filesystems avert the paradox?
  • In naive set theory, the phrase "the set of X satisfying condition Y" defines the set and declares it to exist. One can define a directory, but one still has to create it. One could create a directory of all directories of a filesystem (using links, for example) that contain themselves. Does it contain itself? That's purely the creator's choice; with or without itself, the directory will be the directory of all directories that contain themselves. What about the directory of directories that don't contain themselves? That directory will never be created.
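    In code, the "paradox" dissolves because a directory's self-entry is a reference, not a copy: "containing yourself" is just a cycle in a graph, with no infinite regress. A sketch, with a dict standing in for an on-disk directory:

```python
# Model a directory as a dict whose "." entry refers back to itself.
# Real filesystems do the equivalent with inode references, which is
# why "." causes no infinite regress.

def make_dir(name):
    d = {"name": name, "entries": {}}
    d["entries"]["."] = d        # every directory contains itself
    return d

root = make_dir("/")
# Following "." any number of times lands on the very same object:
print(root["entries"]["."] is root)                   # True
print(root["entries"]["."]["entries"]["."] is root)   # True
```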

  • A silly one occurred in Yu-Gi-Oh! The Abridged Series: Duke managed to get Nesbitt to self-destruct by showing him a picture of Yuuma, which was too illogical for Nesbitt's robotic brain to handle. This was after Serenity tried "Which came first, the chicken or the egg?" and he chose "The rocket-powered fist!"
    "But that wasn't one of the options - GAAAH! I stand corrected."
  • Subverted in the final act of Left Beyond: the Omega distributed AI tries to do this to God, spawning infinitely many instances of themselves right before the White Throne Judgement so that any human beings after them in the queue would be spared it for long enough that they can escape in a spaceship and have children.
  • Pony POV Series
    • An interesting version happens to a person in the Dark World Arc. The Apple/Pie Family's refusal to be crushed by tragedy in the least and Apple Pie managing to laugh despite tragedy completely baffles Twilight Tragedy to the point it's one of the factors responsible for triggering her Villainous BSoD and subsequent Heel–Face Turn.
    • Apple Pie
      • She does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. It works. According to Rancor, this is Apple Pie's explicit power as the Element of Laughter (at least how she represents it). While Rancor isn't affected, Angry Pie is and is barely able to will herself not to think about it.
      • She does this to a giant-cyborg-spider-vampire by pointing out that it's programmed to protect chaos and destroy harmony, and half the group bear an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to violently explode, Rarity and Twilight lampshade it.
        Rarity: Why do paradoxes always make robots explode? Shutting them down I understand, but exploding?
        Twilight: Knowing Discord, he probably designed it that way.
  • The Tales of the Abyss fanfic Phasis has a humorous example performed on a human:
    "Well, I'm not worried about Luke," Jade smiled. "Luke wouldn't die if his life depended on it."
    "I would, too!" Luke cried indignantly. Then he stopped. "...would I?" he asked with a glance at Tear. She just held a hand against her forehead, shaking it slowly.
  • Plan 7 of 9 from Outer Space. Captain Proton ends a cyborg revolution with this trope (called "The Kirk Manoeuvre") by pointing out to their "Borg Queen" that Hive Minds (by definition) don't have a Hive Queen. When he tries the Liar's Paradox on a Killer Robot however, it smacks him in the chops.
  • Second Wind: The author's disclaimer in chapter 16 is one of these:
    Lost: I don't own One Piece. The statement before this statement is true. The statement before this statement is false. So you tell me. Do I own One Piece?
    • As long as there is no fourth statement stating that the third statement is true, it falls apart easily.
  • This occurs when Jade and Miranda end up discussing trust in Kage.
    Miranda: Trust is for fools, Kage, believe me in that.
    Jade: So that means I can't trust you or a single word that leaves your mouth.
    Miranda: You took that long to figure it out?
    Jade: Which means that I can't trust what you said about trust, which means that I have to trust you in the end.
    Miranda: Yes...wait, what?
    Jade: So should I trust what you said about not trusting you, or not trust that, which means that I have to trust you? Quite the dilemma.
    Miranda: Stop that! You mixed up everything I said!
    Jade: Me? You're the one that brought up this paradox.
  • In Sonic X: Dark Chaos, Tsali's Face–Heel Turn is brought about by one of these when he discovers Maledict set his transformation into an android in motion. Then, directly afterward, he gets another one when Cosmo forgives him for his genocide against her people in a literal form of Evil Cannot Comprehend Good.
  • In A Horse for the Force droids who witness Ranma Saotome's transformation glitch until they (usually under orders) archive the memory under "Mystery of the Force" and promptly forget it.
  • Played for Laughs in Dragon Ball Z Abridgednote :
    Android 17: Woah, slow down... are you an android? Holy shit, you're an android! How did you even do that?!
    Dr. Gero: I took my brain out and put it into this body.
    Android 18: ...How?
    Dr. Gero: I... [eyes go wide] ...Huh. How did I do that?
    • Another example courtesy of Goku trying to make sense of Future Trunks' advice and the corresponding Timey-Wimey Ball.
      Trunks: Well, now that we have all that settled, I better get back to the future. It was... interesting to meet my mom and dad. As I said before, I really need you to keep that a secret. One little slip-up, and I suddenly may not exist.
      Goku: Wait, but if you don't exist, then you don't come back in time. But then you could never tell me, which means I would never know, you'd still be born, and- Why does everything smell like copper?
  • This post from a The Fairly OddParents! forum reads like a logic bomb:
    I'd make a deal with Norm that I'd wish him free with my last wish if he didn't corrupt my first 2 wishes. I'd use the first to wish for rule-free fairy godparents and the second to trap Norm in the lamp forever.
  • In "The Ghost of Lindesbarn", Vestri has ordered his new slave Sunny to speak only when spoken to. Her not telling him about a smuggler's tunnel in a warehouse, which allows all the other recently enslaved ponies to escape, presents him with a nasty puzzle: how can he punish her for doing exactly what he told her to, when, in his mind, a slave following the will of her master to the letter (or being too dumb to make mistakes) is always correct behaviour? Crosses over with Bothering by the Book, since it's blatant malicious compliance.
  • In Beyond the Outer Gate Lies..., Harry is using the Winter Mantle, which gives him great strength at the cost of boosting his predatory senses - including the sexual kind. Serafall, one of his girlfriends, kisses him and tells him that she wants to have sex with him - but only if he drops the Mantle. But normal Harry will turn her down, and the Mantle knows this, so Winter!Harry gets locked into a "I want sex -> I have to drop the Mantle to get it -> I will turn down sex when I drop the Mantle" loop long enough for Lash to get through and help Harry recover control.
  • In Taylor Varga, Lung's power got hit with one when Taylor started using her full power against him. The power is required to ramp up Lung's transformation until it surpasses the current threat, and yet Taylor was actively trying to avoid conflict with him.
  • Dungeon Keeper Ami has a variation in a fanmade omake: when testing the limits of a Living Lie Detector, Keeper Mercury just threw a more verbose version of "This statement is false" at him, and the spell the detector was using rewarded him with a killer migraine.
  • A Diplomatic Visit: As seen in chapter 7 of the sequel Diplomat at Large, this is how Twilight inspired Tempest's Heel–Face Turn: she pointed out that Tempest claimed she didn't have to depend on or trust anyone, and yet she'd ended up placing all her hopes in the Storm King being able to restore her horn, depending on him. Also, despite her best efforts at thinking friendship was pointless, she had made a friend - Grubber. Tempest finally admitted that she had a point.

    Film — Animation 
  • In The Mitchells vs. the Machines, the Mitchells' pet pug Monchi is effectively a Brown Note for the robots, because the sight of him causes their recognition software to fail (they cannot determine whether he's a dog, a pig or a loaf of bread) and violently malfunction.

    Film — Live-Action 
  • Subverted to comedic effect in The Phantom Menace. Qui-Gon attempts to use one on a small battalion of droids. The head droid appears briefly confused, complete with a "DOES NOT COMPUTE", but after a few seconds it manages to break the loop and (again) tries to place them under arrest. Then it's blasters and sabers time.
  • In WarGames, a logic bomb-like device was used to teach the NORAD computer WOPR, aka "Joshua", the futility of nuclear war: play Tic-Tac-Toe with yourself until you win. After exhausting all possible move combinations in tied games, it makes the logical leap and begins simulating every conceivable nuclear strategy, all of which result in "WINNER: NONE." This concludes with some Explosive Instrumentation and Joshua observing that nuclear war is "A strange game. The only winning move is not to play."
  • In the German version of Dr. No, the Bond One-Liner (after the mooks in the hearse crash down the cliffs) was slightly altered from the original English version, turning it into a logic bomb:
    "What happened there?"
    "They were in a hurry to attend their own funeral in time."
  • 2001: A Space Odyssey:
    • The HAL 9000 computer became murderous because it was told to keep its crew from finding out the secret details of their mission note , even though it had also been programmed to not withhold or distort information. It's a riddle with a simple solution: break contact with Earth and kill the crew, so there's nobody to hide the secret from.
    • In the novel, the narrative muses that HAL might have been able to find a peaceful solution to the problem, had mission control not requested his temporary disconnection. HAL, being unable to grasp the concept of sleep, was convinced that the disconnection would have meant the end of his existence and his killing spree was therefore, all in all, a misguided attempt at self-defense.
  • Master Computers of 70s sci-fi were particularly poor at handling illogical input. The central control units in both Rollerball and Logan's Run were sent into confused, Explosive Instrumentation paroxysms by sheer accident.
    • The computer in Rollerball has clearly been programmed to withhold information, and it's actually the programmer who has a breakdown when it refuses to divulge information on the Corporate Wars. The computer in Logan's Run, however, is convinced that Sanctuary exists, and has a breakdown when its Mind Probe reveals the protagonist is telling the truth.
      Logan 5: There... is... no... Sanctuary!
      Computer: Unacceptable. The input does not program, Logan 5.
    • But averted in the 1970 film Colossus: The Forbin Project. Colossus' intelligence is advancing exponentially the longer it's activated. When the scientists load in a program designed to overload its system, Colossus overcomes the attempt in a few seconds while simultaneously completing a chess move against its creator with obvious Rule of Symbolism.
  • In the film Dark Star, which is partly a parody of 2001, the crew are able to persuade a self-aware bomb not to detonate by introducing to it the philosophical possibility that its orders to explode may just have been an illusion, causing it to return to its bomb bay and ponder. Unfortunately the bomb decides to reject all outside input, collapses into solipsism, and, finding itself to be the only thing that exists, declares "let there be light", with predictable results. This is, of course, not really that logical.
  • In TRON, Flynn confronts the Master Control Program from a terminal in the "real" world early in the film, saying sarcastically how the unsolvable problems he's entering should be no problem for an AI that claims to be as powerful as the MCP. Flustered, the MCP ignores the problems and to defend itself beams Flynn into the computer world, setting off the story.
  • The Soviet movie Teens In The Universe featured the main characters giving robots a riddle (similar to the English "Why is six afraid of seven"), and making them burn out. The problem starts when they discover that the higher level robots can actually solve the riddle.
  • A logic bomb (causing a Temporal Paradox) was used to dispatch the djinn in Wishmaster. The protagonist has one wish, which, once granted, allows the djinn to be released into the world. She wishes that the crane operator who'd been unloading a ship a few days earlier had not been drinking on a certain day, which is granted. Cue the djinni realizing to his horror that if the operator had not been drinking he wouldn't have allowed a statue to slip and crash, which meant that the djinni's gem hidden inside the statue was not discovered, and therefore he was not released to start granting wishes.
  • In Forbidden Planet, Dr. Morbius inadvertently Logic Bombs his own faithful servant, Robby the Robot, when he orders it to kill the monster. Robby, who's apparently more perceptive than Morbius, realizes that the monster is actually a reflection of Morbius himself, and is thus unable to kill it without violating his prime directive to avoid harming rational beings.
  • Austin Powers: In one of the few human examples, in The Spy Who Shagged Me Austin Powers accidentally does this to himself and goes cross-eyed. It is one of the classics, involving time travel, but the kicker comes if you follow his actual dialogue: he never contradicts himself or sets up a paradox. He just proposes the idea that he could and gets confused by it. There is no logic bomb. Oh great, now I've gone cross-eyed.
  • Monty Python
    • Monty Python's Life of Brian
      Brian: You don't need to follow me! You're all individuals!
      Man: I'm not...
      Crowd: Sssh!
    • Monty Python and the Holy Grail
      Bridgekeeper: What... is the air-speed velocity of an unladen swallow?
      King Arthur: What do you mean? An African or a European swallow?
      Bridgekeeper: Huh? I... I don't know that — AUUUUUUUGGGGGGGGGGGHHH!!
  • In Terminator 3: Rise of the Machines when the T-850 gets captured by the T-X and reprogrammed to kill John Connor, Connor saves himself by making the T-850 realize that accomplishing that goal would mean failing its original mission; the logical conflict between the two causes the T-850 to destroy a truck instead of Connor, then shut itself down. He gets better, briefly.
  • A probably unintentional one in Plan 9 from Outer Space. Even more Mind Screw-ing considering it's probably Truth in Television.
    "Modern women."
    "Yeah, they've been that way all down through the ages."
  • Vizzini (Wallace Shawn) blunders into one in his final scene in The Princess Bride, after he accepts Westley's challenge to deduce which of two goblets of wine contains poisonous iocane powder. He had claimed to be the cleverest man on Earth; unfortunately for him, he proved so clever that he ended up overthinking Westley's game and paralyzing himself with indecision, endlessly coming up with rationalizations for why the poison could be in either goblet. Ironically, Vizzini is right, but not in the way he would have liked: Westley had poisoned both goblets, and survives even as Vizzini quickly dies, because he had spent years immunizing himself to iocane powder.
  • The World's End: Done to the Network by Gary, Andrew and Steven after they learn that anyone who doesn't go along with the Network's plan is killed and replaced with a Blank. When the Network argues that this is the easiest way to prepare humanity to join the galactic community, they counter that since the Network has been forced to replace everyone (except for three people) in Newton Haven, it's clearly not a good plan; Andrew drives the point home by asking how many people it has been forced to replace at the 2,000 other locations on the planet.
  • Robot Monster: Ro-Man spends a fair chunk of dialogue trying to talk himself into obeying orders by destroying the few remaining humans despite his desire to keep one of the females alive. (Any guesses as to which one? The whiny eight-year-old? The matronly fifty-year-old? Or the sexy twenty-year-old?) "At what point on the graph do 'must' and 'cannot' meet?" Unfortunately, he doesn't blow up. Fortunately, he gets killed by his superior. Unfortunately, the issue was moot anyway.
  • In Passengers, Jim tries to convince Arthur, the bar-bot, that the Avalon's hibernation pods actually can fail by spelling out slowly that he's awake 90 years too early. Arthur twitches briefly, then says simply "It's not possible for you to be here."
  • Blade Runner has the Voight-Kampff test, consisting of verbal questions that give contradictory or confusing information, designed to provoke an emotional response in replicants trying to hide amongst the general public. Humans are better able to deal with the ambiguity, or to answer comfortably with incomplete information.


By Author:

  • Isaac Asimov:
    • "Escape!": US Robots gets a request from their rival, soon after the rival's supercomputer has broken down. They speculate that the two events are related, and Dr Calvin points out that the rival's system must have violated the Three Laws of Robotics (most likely the First) to have suffered such a serious breakdown. When instructing their own supercomputer, US Robots is therefore careful to weaken the First Law, hoping that the Personality Chip will prevent it from breaking down. They're half right; The Brain continues to work, but it has become more quirky, playing practical jokes with the prototype hyperspace ship. Turns out that traveling through hyperspace kills humans, but only temporarily.
    • "Liar! (1941)": Dr Calvin and others confront the telepathic robot over the lies it's been telling. Director Lanning wants to know what part of the assembly accidentally created robotic telepathy and she forces the robot to realize that telling the Director will harm him (because it would prove a robot figured out what he couldn't) and refusing to tell hurts him (because the answer was being withheld from him). Her repeated contradictions build and the robot freezes up, becoming useless.
      "I confronted him with the insoluble dilemma, and he broke down. You can scrap him now-because he'll never speak again." — Dr Susan Calvin, robopsychologist
    • "Mirror Image": Detective Baley manages to cause R. Preston to shut down due to a conflict in the Three Laws during the interview. Because R. Idda didn't break down at the same point, he takes this asymmetry of evidence as proof that R. Preston's owner was the plagiarist, who had stressed the Second Law by ordering his robot not to betray the truth.
    • "Robbie": When Gloria visits the first-ever talking robot, she unintentionally creates a paradox for it by using the phrase "a robot like you". It's unable to deal with the concept that there is a category of "robot", which it might be a subset of.
    • The Robots of Dawn: A preeminent roboticist remarks to Detective Baley that modern robots cannot be fooled with paradoxical situations, because the only paradoxes they care about are based on the Three Laws of Robotics, and they're also equipped with random choice to resolve near-equal disputes. However, the mystery of this book is that a robot (one that he designed) has been shut down with a paradox involving the Three Laws, and he's the prime suspect (it is always possible to circumvent the safeguards, but one needs to know the details of the robot's brain, and this one was a secret he never shared). He even dismisses the story of Herbie's paradox as a myth (because the mind-reading robot he owns psychically enhanced his skepticism to keep itself safe).
    • "Runaround": Because the Rules of Robotics ensure that Speedy will avoid endangering itself, Donovan set up an unintentional conflict by casually sending Speedy into a situation he didn't know would be hazardous. With a weak Second Law set against the Third Law, Speedy has been spending hours spinning its wheels at the distance where the two priorities are exactly equal. The conflict is resolved when they exploit the First Law to force him out of the loop. (Later, they order Speedy to complete the original task no matter what; the reinforced Second Law overrides the Third, and Speedy returns with only minor, repairable damage.)

By Work:

  • Mikhail Akhmanov's Arrivals from the Dark: In Invasion, the Faata use telepathic biological computers to control their ships. It's revealed that these computers are based on a failed Daskin project and have a serious flaw: if given conflicting orders at the same hierarchy level, they may crash and take the whole ship down with them. This is used by Pavel Litvin when he orders the computer to keep his location hidden from the Faata. When the Faata whose job is to interface with the computer tries to order it to locate Litvin, the computer warns him that it may crash if he insists on the order being carried out. Yes, the computer is smart enough to figure out what could cause it to crash, but still can't Take a Third Option. No wonder the Daskins abandoned the experiment.
  • Discussed in Babel-17 when the doctors are trying to come up with a way to get Wong and Butcher out of the mental trap they've gotten stuck in because of Babel-17. Several classic paradoxes are mentioned, including the Barber paradox and the Cretan liar (see below).
  • Bas-Lag Cycle: In the climax of Perdido Street Station, Isaac actually succeeds in using a Logic Bomb as a power source for his crisis engine, presenting an attached Clockpunk calculating machine with two things it concludes are, simultaneously, identical and inherently unalike. This doesn't shut the clockwork computer down, but the irreconcilable dilemma provides sufficient "crisis energy" to create a feedback loop with which to bait the slake-moths into gorging themselves to death.
  • Used in the Bone Chillers book The Dog Ate My Homework. The heroine and her two friends have to defeat a computer program which has gone sentient and plans to take over the world by telling it two things. The girl tells the program it absolutely has to listen to her friends because they're telling the truth, and the friends tell the program not to listen to the girl because she's lying. Which she then says is absolutely true. Which the friends then say is a lie. The kids go back and forth on this until the program gets too confused, has a total meltdown and gets destroyed.
  • The Chronicles of Prydain: The Black Cauldron is an Artifact of Doom designed to revive dead bodies placed inside it as undead warriors for any aspiring Evil Overlord to use. How do the heroes destroy it? The very-much-alive Prince Ellidyr jumps into it, and the Cauldron finds itself trying to revive something that isn't dead, causing an explosive magical Reality-Breaking Paradox that utterly annihilates the Cauldron (and, unfortunately, takes poor Ellidyr with it).
  • Larry Niven's short story "Convergent Series" features a physical Logic Bomb. The main character summons a demon more or less by accident; he gets one wish, but will lose his soul after it is granted. There's no way to get rid of the demon: no matter where the pentagram is drawn, the demon will appear inside of it — and you don't want to know what will happen if there's no pentagram. The protagonist wishes for time to stop for 24 hours. He then draws the pentagram on the demon's belly — and as soon as time starts running again, the demon immediately starts shrinking down to infinitesimal size. The protagonist then goes to the nearest church.
  • The Dark Tower: Wizard and Glass features a train operated by a sentient AI, which has threatened to crash the train, killing the heroes on board, unless they can ask it a riddle it can't figure out the answer to. After hours attempting in vain to outsmart it, Eddie asks it joke riddles with no logical answers. The train is still able to answer, but the increasingly illogical nature of the riddles causes it pain. Finally Eddie gets to "Why did the dead baby cross the road? Because it was stapled to the chicken!" This proves to be the killing blow: the AI self-destructs, and the train crashes anyway, but not violently enough to kill the heroes.
  • In Emily Rodda's Deltora Quest series, the heroes come upon a monster guarding a bridge. Two of them pass, but the remaining hero fails the riddle, and the monster allows them to say one last thing. If the statement is true, he will be strangled. If it's false, his head will be cut off. The hero says "My head will be cut off." Fortunately, a paradox was exactly what was needed to defeat the monster in the first place, as the monster was condemned to guard the bridge "Until truth and lies are one." The monster is returned to its original form, a black bird, and freed.
  • The AI in one of The Demon Headmaster books is shorted out by the protagonists shouting gibberish and riddles into its receivers.
  • David Langford's short story Different Kinds of Darkness uses images called Blits as a major element - and Blits are basically Logic Bombs for the human brain.
  • Discworld
    • As a joke (and a possible Shout-Out to The Prisoner), the wizards ask Magitek computer Hex "Why?"; instead of malfunctioning, however, Hex answers "Because." Naturally, they ask "Why anything?", and after a longer while, Hex answers "Because everything", and then crashes. After that they stop mucking about with silly questions - not because they're afraid of damaging Hex irreparably, but because they're afraid they might get answers.
    • Feet of Clay: Occurs tragically in the case of the Meshugah, created by the golems to be their liberator and king, who is given such a huge number of instructions, many of them vague and many in conflict, that it is incapable of following all of them. Unfortunately, rather than shut down, it goes completely mad.
    • Thief of Time: Characters are trying to deal with the Auditors — reality-monitors who are made of pure logic. Thus, while fleeing, they put up signs reading "KEEP LEFT". In a right-pointing arrow. "Do Not Feed the Elephant". In an empty cage. "Duck", with no duck or reason to go on your hands and knees, and of course, "IGNORE THIS SIGN. By order". Effectively a Logic Minefield. The series of Logic Bombs was behind a velvet rope with "Absolutely No Admittance" hanging off it. Considering that, in a way, the Auditors are the rules, disobeying any of the signs is a cause for extreme stress in what passes for their life. Unfortunately, the Auditors in human form all too quickly become human enough to discover the concept of "Bloody Stupid", allowing them to bypass the traps.
      Lobsang: But you can't obey the Keep Left/Right sign no matter what you do... Oh, I see...
      Susan: Isn't learning fun?
    • In Hogfather, Ridcully manages to Logic Bomb Hex into functioning, after it's already broken down. All it took was typing the phrase "LOTS OF DRYED FRORG PILLS" into its keyboard.
    • Going Postal features semaphore tower hackers. One of the tricks they develop is a kind of "killer poke" (see Computing above) which causes the mechanism to execute a particular combination of movements that does anything from jamming the shutters to shaking the tower to pieces.
    • In The Wee Free Men, Tiffany's baby brother suffers a Logic Bomb whenever he's offered more than one piece of candy at a time. He knows if he chooses one, it means (briefly) rejecting all the others, and he's at an age when "deferred gratification" means the same as "no" to him. So he sits there, surrounded by sweets, and wails miserably that he wants a sweetie.
    • Pratchett's co-authors for The Science of Discworld once wrote another book with a chapter about free will, and titled the chapter "We wanted to include a chapter on free will, but we decided not to, so here it is".
  • Doctor Who Expanded Universe: Subverted in the novel Frontier Worlds, in which the Doctor tries the Liar Paradox on a security robot, which snaps, "Get off with you. You'll be asking me to calculate pi next," and keeps attacking him.
  • Feliks, Net & Nika has two instances of division by zero. One of them stops a pair of robots run by an evil AI program for about half a minute. The second stops, seemingly forever, a huge mass of sentient rock capable of modifying everything in range at a molecular scale if not smaller: the "Wish Machine's" program was never fully formed, having lain dormant for eons and been used by only about three uneducated people ever, so it is taught mathematics barely half an hour before being prompted to divide by zero, with no failsafes in place to tell it what to do.
  • In Robert Westall's dystopian novel Futuretrack Five, Chief System Analyst Idris Jones keeps one of these to hand as a sort of job and life insurance. He built the supercomputer Laura, who runs all of the computer systems that keep the setting functioning, in secret, and no one else knows exactly how she works. But just in case his superiors decide that someone else can operate her, or that they know enough to get rid of him, he keeps a datatape of works of fiction, philosophy and religion to feed to Laura: the inconsistencies and contradictions are intended to make her burn out.
  • In Gödel, Escher, Bach: An Eternal Golden Braid, the infinite-order wish "I wish for my wish not to be granted" effectively crashes the universe.
  • The Golden Age series by John C. Wright has a variant — A.I.s are all inherently ethical, so they'll shut down if you convince them their very existence is making the universe a worse place.
  • The 3rd century BC Chinese book Han Feizi has a story about a man who boasts that his spears are so sharp no shield can stop them, and that his shields are so tough that no spear can pierce them. The man to whom he's making the sales pitch asks "So what happens when your spear strikes your shield?", to which the seller has no answer. This story is the origin for the Chinese word for "paradox/contradiction", which is literally written as "spear-shield".
  • The Hitchhiker's Guide to the Galaxy
    • The Restaurant at the End of the Universe: Arthur totally disables the Heart of Gold by asking it to make tea. Depending on which version you prefer, it's either because it doesn't know how to make tea, or because it's affronted at the possibility that Arthur could prefer dry leaves in water (a concept alien to it) to whatever it could offer him (a concept even more alien to it). The text adventure game based on the novel actually made this a plot point: in order to advance you have to get tea, then go into your own head and remove your common sense, which allows you to get "no tea" as well. Then you show both to a door, which is impressed by your grasp of logic and allows you to pass.
    • Then there was the theory that the existence of the Babel fish, a symbiotic creature that lives in your ear and translates any language for your brain, disproved the existence of God. The argument ran that an organism so unlikely yet so useful is proof of a creator, that such proof removes the need for belief, and that without belief God is nothing; ergo, there is no God. The man responsible for this argument went on to prove that black is white and white is black, and got himself killed at a zebra crossing. Theologians debunked the theory fairly quickly (if God existed, He wouldn't need belief to survive), but that didn't stop Oolon Colluphid from making a lot of money off it.
    • In And Another Thing..., Ford Prefect froze the computer controlling the ship, which wasn't really a computer but Zaphod's left head (called "Left Brain"). He did it by making an (im)probability probable and improbable at the same time: the ship was the Heart of Gold, which runs on the Improbability Drive (long story short, anything improbable that needs to happen becomes probable, which is how the ship gets to improbable places). The ship rescuing them was mathematically improbable, yet it had already done so twice, which by Ford's made-up logic of patterns made it probable again. Quite smart, and yet extremely stupid, because the ship's now-turned-off Dodge-o-matic was the only thing keeping them from being fried.
  • In Stephen Colbert's I Am America (And So Can You!), in the chapter where he writes to several hypothetical futures, he delivers a Logic Bomb to the robots who have taken over humanity, then tells the humans "You're welcome."
  • In Kea's Flight, Draz uses these to shut down some of the simpler computers. They have lines of code that are supposed to prevent crashes, but Draz can get around those with a virus.
  • Inverted in The Long Earth, in which an AI manages to logic bomb the human international court system with an impossible-to-disprove assertion ... namely, its claim that it's actually human, being the reincarnation of a Tibetan who'd died at the same instant its processing system was activated. Rather than tackle that theological can of worms, the U.N. court subjected it to a lot of questions about its "past life" and, unable to find enough discrepancies to prove it was lying, grudgingly granted it legal personhood.
  • Parodied in one of the molesworth books, when molesworth 2 defeats an electronic brain by creeping up behind it and asking it the cunning question "wot is 2 plus 2 eh?", which causes the brain to laugh so much it shakes itself to pieces.
  • In Gordon R. Dickson's story "The Monkey Wrench", a man attempts to shut down a meteorological arctic station just for bragging rights. He manages it by posing a paradox to the machine, rendering it incapable of doing anything but computing the paradox. Ironically, this condemns him and his partner to freeze to death, as all the station's vital controls were run by the machine.
  • In the trade paperback edition of M.Y.T.H. Inc. In Action, the illustration of Guido coping with a ceiling-high stack of bureaucratic paperwork includes the following sign in the background:
    "Please complete forms NS-01-D and RD-007-51A before reading this sign".
  • The Phantom Tollbooth by Norton Juster: Milo is able to bring about a truce between feuding brothers Azaz and the Mathemagician by pointing out that, since they always disagree with each other as a matter of course, they both always agree that they will be in disagreement.
  • The Space Odyssey Series: In 3001: The Final Odyssey, the protagonists use rather more sophisticated logic bombs against the monoliths that trick them into carrying out an infinite set of instructions. The book notes that none but the most primitive computers would fall for something as simple as calculating the exact value of pi.
  • Used by The Stainless Steel Rat to enter a house guarded by a robot programmed not to let anyone in the house. He and his son each ran slightly farther into the house than the other person, causing the robot to rapidly change targets and eventually overload, though it didn't explode.
  • Star Wars Legends has droids equipped with behavioral inhibitor programming which serves the same purpose as the Three Laws, although the specific inhibitions vary based on the droid's purpose (a war droid that can't cause harm is worse than useless). Rather than shutting down when faced with a break or paradox, it's suggested that small everyday events lead to an almost constant buildup of garbage information as the droid puts those hard rules into usable context. The result is called a "personality snarl" because the observable symptom is a Ridiculously Human Robot. While these snarls tend to improve performance in many ways, the droid often becomes more person than tool which can in turn cause reliability issues when the owner needs his tool to be a tool. As such, most droids are reset every six months to keep this corruption in check. An example of this effect in action would be R2-D2, who has managed to avoid being memory-wiped for decades and is only limited in personality by his usage of 'droid speak' rather than Basic (like C-3PO, who isn't so lucky).
  • In the book 2095 of the Time Warp Trio series, the heroes deliver three of these to a robot that's pointing a rather menacing-looking gun at them and asking them for their "numbers". They give it numbers with infinite decimal expansions (10/3, sqrt(2), pi) and it crashes into a smoking pile (the numbers were actually ID numbers, akin to one's credit card number, and all the robots did was show people holographic advertisements). All that advanced AI, brought down by a couple of lousy floating point numbers.
  • In Valhalla by Ari Bach, the protagonist mistakes one of her new friends for an A.I. and tries to logic bomb her. The seemingly robotic Valkyrie is mocked ruthlessly for acting so cold she was mistaken for a robot.
  • In Christopher Stasheff's Warlock of Gramarye series, the hero's robot horse, Fess, is prone to doing this when something particularly illogical happens. Fortunately there's a reset button to fix the problem; unfortunately, the series is set on a planet filled with psychics, time travelers, ghosts, and fairies, so... the reset button sees a lot of use.
  • One chapter of Sideways Arithmetic from Wayside School deals entirely with tricky True/False puzzles. Joy brags that these are far too easy for her, so Mrs. Jewls gives her an "extra credit" assignment consisting of two statements: "The next statement is true. The previous statement is false." At the end of the chapter, she's still trying to figure it out. (The Answers section helpfully explains the paradox, so readers are not expected to solve it.)
  • Welkin Weasels shows Scirf inducing a heart attack in the monstrous giant pygmy shrew (yes, giant pygmy shrew) Cyclops by asking it "Did you know that everything I say is a lie?" and causing it to obsess over the problem for hours until it works itself into a rage.
  • Raymond Smullyan's logic puzzle-filled book What is the Name of This Book? has a few puzzles where the correct answer is "This scenario the puzzle gives is logically impossible and hence the author of it is lying and just having a bit of fun at the readers' expense." One example is: "Someone who is either a knight (always tells the truth) or a knave (always lies) tells you, 'Either I am a knave or two plus two equals five.'"
  • In Worm, Skitter attempts one of these on Dragon, who No Sells it and quotes Wheatley at her. She has to instead resort to an A.I. Breaker-style misdirection to escape, and even then it's pointed out this only works because the Dragon she was fighting was a lesser, not fully sapient copy of the real thing (who is smart enough not to fall for such things).
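  As an aside, the division-by-zero gag only works on fiction-land machines; real runtimes define the behavior instead of melting down. A minimal, purely illustrative Python sketch (the function name is invented for this example, not anything from the books above):

```python
# Real systems treat division by zero as an ordinary, recoverable error.
def safe_divide(a, b):
    """Return a / b, or None when the divisor is zero."""
    try:
        return a / b
    except ZeroDivisionError:
        # The "logic bomb" is caught and defused; nothing explodes.
        return None

print(safe_divide(10, 2))  # 5.0
print(safe_divide(10, 0))  # None
```

  A Wish Machine with even this much error handling would simply have shrugged the prompt off.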

    Live-Action TV 


  • Jon Stewart
    • Parodied in a March 2011 episode of The Daily Show. Jon Stewart debates Republican supercomputer Reagan OS 911 (a parody of IBM's Watson computer featured on Jeopardy!). As they discuss the Obama birth certificate controversy (the computer believes President Barack Obama is not American, and the computer is also pro-life), Jon confronts the computer with this dilemma: Obama was certainly conceived in America, and the computer believes life begins at conception. Then that means that Obama is a US citizen. But Obama was not born in the US. So either Obama was not born in the US and fetuses are not human beings, or Obama was conceived and is therefore a US citizen and the rightful President. Reagan OS struggles to process this, playing clips of embarrassing Republican moments on its screen before finally breaking down.
    • Which might be all the funnier because that isn't actually a paradox. Even if life does begin at conception, citizenship does not begin until birth.
    • One time on The Colbert Report, Stephen was using "The DaColbert Code" for some reason. It was basically a word-association game, and at one point he got stuck in a loop, forcing him to come up with a new thing.


  • In the French sci-fi series Aux frontières du possible, the protagonists disable a supercomputer by asking it what time it is. It starts to answer but can never complete the answer, since by the time it finishes telling the time, the time has already changed. Predictably, it explodes in frustration.
  • Attempted in Battlestar Galactica: While interrogating Leoben, Starbuck mocks his belief in God, making the argument that as a machine, Leoben has no soul and claims that the knowledge itself is enough to make his mind go Does Not Compute. It...does not exactly work. And by "Does not exactly work", we mean that it is Leoben who ends up giving Starbuck a Mind Screw of epic proportions.
  • In Better Off Ted, the computer suffers a logic bomb brought on not by a paradox but by colloquial logic built on a faulty assumption: unable to question its assumption that the ID tags were actually being worn by the employees, it can find no logical explanation for why a dozen VD employees would be accelerating towards outer space.
  • Sheldon in The Big Bang Theory creates an algorithm to see how he can become friends with a stranger. Like a computer, Sheldon winds up hitting an infinite loop where he goes back to the same questions over and over again without even realizing it. Howard creates a counter loop to break it and bring Sheldon back on track.
  • In Caprica, Daniel Graystone inadvertently Logic Bombs an AI he's attempting to create by telling it to try to hurt him emotionally, when it's programmed to be driven by the desire to please him.
  • In Derry Girls, Clare announces, "Well, I am not being individual on me own."
  • Doctor Who:
    • "The Invasion": Zoe blows up an innocent (but extremely irritating) computer receptionist by giving it an insoluble ALGOL program.
    • "The Green Death": The Doctor tries the Liar Paradox on BOSS and finds that he's only confused for a few moments. Although BOSS is a Ridiculously Human Computer even by the usual standards of that trope.
    • He also dropped a Logic Bomb on a sentient city in "Death to the Daleks". He describes it as the computing equivalent of a nervous breakdown.
    • "Frontier in Space": The Doctor brags of shorting out every Mind Probe used on him (with an Exact Words answer) while being interrogated by aliens. Eventually they have to release the Doctor as they've run out of mind probes.
    • "Robot": The robot is driven insane when it is ordered to kill in spite of its programming not to.
    • In Development Hell serial "Shada", the Doctor gets attacked by the villain while snooping around his ship. After the villain attacks the Doctor, the Doctor puts himself into a state of Faux Death thanks to his Bizarre Alien Biology so he can escape. During this, The Ship, who is extremely obsequious towards the villain, scans the Doctor and confirms him dead. When the Doctor gets up and starts walking around and talking to it, the Ship is extremely confused, since it can't understand why he is talking if he is dead, and suggests rescanning him. At this point, the Doctor takes advantage of the situation by convincing it that the Ship does not need to rescan him, as her master is infallible, and she is therefore infallible. Therefore, her reading was right, the Doctor is dead, and as he is dead he cannot order her to do anything that would cause any harm to her or to her master, so she should start obeying his commands. The Ship starts listening to him, but also turns off the oxygen as there are no live people on board, and finds the Doctor's request to turn it back on illogical. In the book adaptation, the increasing demands the Doctor's logic puts on her causes her to reassess much of her basic programming, realise that her master is not infallible, that he tried to kill her, and that the Doctor is a much better person than him.
    • "Remembrance of the Daleks": The Doctor makes a Dalek self-destruct just by yelling at it, even though a Dalek is not a robot. (This last actually had a carefully-thought-out rationale, but you had to read the novelization to find out what it was.)
    • "The Sontaran Stratagem": The Doctor confuses a killer satnav by giving it conflicting instructions, but it just fizzes instead of exploding spectacularly. To wit, he ordered it to kill him. The device was already going to kill him, but had also, as a poorly-thought-out precaution, been ordered not to do anything he told it to do.
    • "Nightmare in Silver": The Doctor is playing chess with a cyber version of himself (each controls ~49% of his mind) with the high stakes of whoever wins gains control over his entire brain. After the cyber version (Mr. Clever) states that he will checkmate him in 5 moves, the Doctor bluffs that he can beat him in 3 moves (despite having just sacrificed his queen). The Cyberiad (networked cyber mind) devotes the entire computing power of 3 million Cybermen minds to figure how this would be possible, literally stopping the army in its tracks, allowing everyone to escape the planet.
  • In Everybody Loves Raymond, Peter is trying to convince Raymond to help him break up Robert and Amy's engagement:
    Peter: I thought we were friends!
    Raymond: Yeah, but friends can disagree.
    Peter: No they can't!
    Raymond: But you just disagreed with me right there.
    Peter: [looks confused] ...Oh, you are crafty.
  • Parodied in an episode of the Disney series Honey, I Shrunk the Kids; Wayne attempts to talk a hostile supercomputer to death. It seems to work... but then he calls it out on the obvious trickery, even saying "That only happens in cheesy scifi shows," and uses the opening it left to shut it off for real.
  • In "Where the Wild Things Are" from The Inside Man, the Handler secretly slips Erica from Finance a USB drive containing what is described as a "logic bomb" and if she inserts it into her laptop, it will compromise all Kromocom security. Fortunately, in "The Sound of Trumpets," Mark Shepherd manages to stop her before she can actually insert it.
  • JAG: In "Ares", the eponymous computerized weapons control system onboard a destroyer in the Sea of Japan goes haywire and starts firing at friendly aircraft, as programmed by the North Korean Mole. However, Harm's partner Meg is en route in a helicopter: the on-the-spot solution advocated by Harm is for the helicopter to fly low and at low speed, thus simulating a ship, which the computer won't target. This has an actual basis in reality: to prevent them picking up things like birds or stationary terrain features (and cars on them), most air-search radars have a speed check built in, and won't display contacts moving slower than about 70 mph and below a certain altitude. This was exploited in the First Gulf War to break the Iraqi air-defense system.
  • Knight Rider 2008: Similar to the Star Trek examples, Sarah tries to distract a damaged and guilt-wracked KITT by asking him to compute the last digit of pi. KITT points out that pi doesn't have a last digit and goes back to being guilt-wracked.
  • In a very old episode of Law & Order, one of these was dropped on a Well-Intentioned Extremist who bombed an abortion clinic by having an associate plant a bomb on her friend who was getting an abortion (they didn't plan on blowing the woman up; their bomb went off early). When the prosecutor pointed out that the bomber was just as guilty of murdering the woman's fetus as the abortionists she despised, you could almost see her mind going "does not compute".
  • The Librarians 2014: After Cassandra comes into contact with the Apple of Discord, she tries to blow up a power plant to cause a cascading power failure through all of Europe. Flynn tries to Logic-bomb her by requesting she calculate the last digit of pi. She laughs him off. Stone succeeds by asking her to calculate Euler's number.
  • In one episode of My Name Is Earl, Randy gets stuck in a thought loop after hearing the words "Catalina" and "half-naked"; Earl says this also occurs whenever Randy watches Back to the Future.
  • Mystery Science Theater 3000
    • In the episode Robot Monster, Servo, Crow and Cambot all explode while trying to work out why bumblebees can fly!
    • Parodied again in the episode Laserblast. The Satellite of Love is invaded by a "MONAD" probe (a parody of the NOMAD probe from Star Trek, as mentioned above). Mike attempts to drop a logic bomb on it, but when it doesn't work he simply picks it up and tosses it out of an airlock.
    • A minor Running Gag was Servo suffering these even when Crow and Cambot were unaffected. Other than the above-mentioned "how do bumblebees fly" example, his head has also exploded from:
      • Trying to think of a good thing about The Corpse Vanishes.
      • Watching the first sixteen minutes of Star Force: Fugitive Alien 2 (he was fine for the rest of it).
      • After using some Insane Troll Logic to prove who Merritt Stone was, Joel, Crow and Gypsy use their own to prove that some other guys might be him. It's all too much for Servo who's left screaming: "HE'S NOT MERRITT STONE!" before his head explodes.
  • In the first episode of the surreal Noel Fielding's Luxury Comedy, a robotic Andy Warhol examines a picture of Pelé holding a China cup while kicking a ball. When it is pointed out that the "ball" (drawn very plainly) could just as easily be the saucer accompanying the cup, his brain is fried.
  • Happens in Nurse Jackie to the local Talkative Loon, who thinks he's God; he has a near-death experience after being clonked on the head with a bottle and sees a God who isn't him. Zoey eventually persuades him that just because he isn't the God doesn't mean he couldn't still be something important in the "religious hierarchy thing."
  • In The Office (US), Dwight, in a video, contemplates how Enemy Mine combined with His Own Worst Enemy creates a paradoxical relationship with Jim.
  • A Logic Bomb actually helps the heroes in an episode of Power Rangers Turbo. A curse cast on them by the Monster of the Week makes them unable to tell the truth and, even worse, summons one of Divatox's Mooks whenever a lie is told. Alpha-6 figures out that the curse can be broken if they say something that's the truth, but that seems like a catch-22... until one of them realizes that by saying "I can't tell the truth" he's both lying and being truthful at the same time. Once all the Rangers figure this out, the curse is broken, and they're quickly able to bring the villain down.
  • In The Prisoner (1967) episode "The General", Number Six drops one of these on the titular all-knowing computer, short-circuiting it with the question "Why?". This episode may well be the Trope Codifier for the "open-ended abstract philosophical question" version of this trope as opposed to the "logical paradox" one.
  • Averted in The Professionals when Ray Doyle deliberately throws in wrong answers to confuse the computer during his psychological evaluation. It's pointed out to Doyle afterwards that this won't work, as he only gives a deliberately wrong answer on questions that he's sure about. The computer knows what areas he's proficient at, identifies what he's doing and adjusts accordingly. They must have had some impressive computers back in The '70s!
  • QI
    • This exchange:
      Stephen Fry: Is this a rhetorical question?
      Alan Davies: ... No.
      Stephen Fry: ... quite right.
    • Stephen once mentioned the fact that, because one's number of ancestors increases exponentially, if one looks back far enough one has more ancestors than there have ever been human beings. The seeming impossibility of this caused Sean Lock's head to explode before Stephen explained that this works because most of the ancestors are shared.
  • The German SF series Raumschiff Orion plays the trope straight exactly like the French, British or American counterparts mentioned here. Moral: A.I.s are universally dumb!
  • Red Dwarf
    • "Last Day" — Kryten defeats Hudzen by convincing him — in defiance of his core programming — that there is no robot heaven. (Kryten is not damaged by the Logic Bomb because A: he knows he's lying, and B: there's nothing in his programming that prohibits him from deceiving another robot. Another episode (the very next one, in fact) shows Kryten's difficulty re: lying to organic life forms.)
    • In "Tikka to Ride", Lister is shown inadvertently destroying an artificially intelligent video camera (apparently the third one that week) by trying to explain the Temporal Paradox that happened in the battle of the previous episode. Kryten, however, merely finds it garbled, confusing, and dull; he suffers no ill effects.
  • In an episode of Charlie Brooker's Screenwipe on TV manipulation, he presents "Truthbot 2000", a robot that can detect dishonest TV and alert the viewer. Truthbot instantly points out that it is just a cheap prop outfitted with some lights and circuits, and has its voice dubbed on later. Charlie asks it how it knows this if it's just a cheap prop. It instantly overloads and explodes.
  • In Spellbinder, the robot servants of the Immortals are programmed not to harm humans: so, when one of them is ordered to guard Kathy and Mek, they try to confuse it by insisting that it's hurting them by keeping them locked up. After an attempt to obey both orders by continually opening and shutting the cell door, the robot is finally defeated when the prisoners start chanting "Ow, you're hurting us!" until it short-circuits.
  • In a Square One TV sketch parodying 2001: A Space Odyssey, a pair of astronauts stop their computer from singing "Row, Row, Row Your Boat" all day long by giving it an unsolvable algorithm: Start with 3, add 2, if answer is even, stop, if odd, add 2 again, repeat. Why exactly listening to the computer count by twos to infinity was less annoying than listening to it sing remains a mystery.
  • Star Trek: The Original Series: This is how Kirk dealt with rogue computers and robots all the time (when he didn't just rewrite their programs like in The Kobayashi Maru), often by convincing them to apply their prime directives to themselves:
    • In "The Return of the Archons", he convinced Landru (prime directive: "destroy evil") that it was killing the "body" (the civilians kept under its thrall) by halting their progress through Mind Control.
    • "The Changeling"
      • Captain Kirk convinces Nomad, a genocidal robot with a prime directive of finding and exterminating imperfect lifeforms, that it itself is imperfect (it had mistaken Kirk for its similarly-named creator and had failed to recognize this error).
        Nomad: Error... error...
      • Also Subverted in the same episode: Nomad believes that Kirk (who it still thinks is its creator) is imperfect. When Kirk asks how an imperfect being could have created a perfect machine, Nomad simply concludes that it has no idea.
    • In "The Ultimate Computer", he convinced M5 ("save men from the dangerous activities of space exploration") that it had violated its own prime directive by killing people.
    • In "That Which Survives", he forced a hologram to back off by making her consider the logic of killing to protect a dead world, and why she must kill if she knows it's wrong.
    • In "I, Mudd", he defeated the androids by confusing them with almost dada-like illogical behavior (including a "real" bomb), ending with the Liar's Paradox on their leader.
    • Another one involving Kirk: In "Requiem for Methuselah", the android's creator used Kirk to stir up emotions in it, but he succeeded a bit too well, causing her to short out when she couldn't reconcile her conflicting feelings for both Kirk and her creator.
    • "What Are Little Girls Made of?" had him arrange to have a robot duplicate of him say an Out-of-Character Alert to Mr. Spock; he follows up by Breaking Speeching The Dragon du jour into remembering why he helped destroy the "Old Ones" so he'd turn on the episode's Anti-Villain. For a finale, he forces the roboticized Dr. Korby to realize that he's the Tomato in the Mirror. He also pulled the "seduce the Robot Girl" trick.
    • Even Spock did this once. In "Wolf in the Fold", when the Enterprise computer was possessed by Redjac (a.k.a. Jack the Ripper), Spock forced the entity out by giving the computer a top-priority order to devote its entire capability to calculating pi to the last digit.
    • Kirk even uses this tactic on humans. In "Charlie X", he has the bridge crew turn on every system so Charlie (who has Reality Warper Psychic Powers) will be confused and overloaded while trying to control the ship.
  • Star Trek: The Next Generation: In "I Borg", a proposed weapon against the Borg was to send them a geometric figure, the analysis of which could never be completed, and which would, therefore, eat more and more processing power until the entire Borg hive mind crashed. Obviously the Borg don't use floating point numbers. Of course they never actually try it, even when they again have access to the Borg network, so they might have realized it wouldn't work off screen, and the sequel to this episode, "Descent", suggests that the Borg deal with cyberweapons by simply severing affected systems from the Collective.
  • On Star Trek: Deep Space Nine, Rom accidentally Logic Bombs himself while overthinking the Mirror Universe concept. Hilariously, Rom's self-Logic Bomb simultaneously Lampshades and side-steps a number of actual logical problems with the Mirror Universe.
    "Over here, everything's alternate. So he's a nice guy. Which means the tube grubs here should be poisonous, because they're not poisonous on our side. But if Brunt gave us poisonous tube grubs it would mean he wasn't as nice as we think he is. But he has to be nice because our Brunt isn't."
  • Star Trek: Voyager: In "Latent Image", the Doctor suffers one of these: he was faced with a triage situation where he had to choose between operating on Harry, a friend of his, or another ensign he barely knew. His program covers such situations, dictating that the patient with the greater chance of survival be treated first, but both had been hit by the same weapon and had exactly the same odds of recovery. He chose Harry because he needed to save somebody and they were close friends, but because the choice was made out of friendship rather than for a medical reason, the event became an all-consuming obsession that wrecked his ability to function. Curiously, it never seems to occur to anyone that the Doctor should have chosen Harry because he is the more valuable bridge officer, which should be standard triage procedure.
    • He hadn't been originally programmed to have "personality" subroutines and suspected he was not being objective. Janeway explained to him that he was doing his duty, but he simply didn't believe her. It's entirely possible he would have had roughly the same breakdown regardless of whom he chose.
  • In "I of Newton", an episode of The Twilight Zone (1985), a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: Get lost!
  • In an episode of Welcome to Paradox, an AI is brought down using a theological paradox. The AI was created by a church and believes itself to be the One True Madonna, but, being an AI, it can also produce copies of itself. Once it is tricked into successfully doing so, the "One True" part of its identity comes crashing down along with the rest of the program.
  • Westworld has an excellent variation when Maeve is confronted with a readout of her own internal mental processes. Her attempts to say something the computer won't predict cause her program to crash.
    • In the third season, after getting trapped in a computer simulation, Maeve asks some techs for the square root of negative one. The simulation freezes up as the techs try to work out a problem with no real-number answer.
  • In episode "Crazy for You" of Wicked Science, When Elizabeth finds that Verity and Garth can't meet her exacting standards as personal assistants, she decides to create a self-learning artificial intelligence program to do the job instead. She calls it Max - and it doesn't take her long to realise that she's created a monster. Max soon decides that he alone knows what is best for Elizabeth and keeps her prisoner in her own lab. He attacks Toby via his computer and spies on Verity and Garth. When Max refused to let Toby enter the laboratory in order to defend him, Toby had to use syllogism to convince the computer. First of all, Toby asked the computer if its only purpose is to take care of Elizabeth? The computer confirmed. Next, Toby asked if Elizabeth wanted to see him urgently? The computer agreed again. Finally, if Max does not let Toby enter the laboratory, she will be very sad? The computer answers yes. Obviously, Max realized that it had encountered a logical conflict and forced it to let the cunning Toby into the laboratory.
  • In the Kickin' It episode "Rock 'em Sock 'em Rudy", the Wasabitron 3000 is a robot that replaced Rudy as sensei and turned violent because it deemed humans imperfect. During the fight, the Wasabi Warriors recited the Wasabi Code, and the robot, unable to handle the fact that humans couldn't beat it but wouldn't give up, shut down. Rudy then kicked the robot, breaking it.

    Music 
  • The Carly Simon song "You're So Vain" is a logic bomb just waiting to happen. "You're so vain/You probably think this song is about you..." But it is about him! Augh! My head... Here the bomb is in the implications: it is implied that his vanity would lead him to assume the song is about him, but if it actually is about him, he isn't necessarily vain to think so. Yet since the song is about someone vain enough to assume the song is about them based on vanity alone, it cannot be about him, making his assumption that the song is about him one of vanity, as he would be vain enough to think everything is about him. It's a twist on the "this is a lie" statement using personality characteristics. The only way to defuse the logic bomb is to assume that Carly Simon was not, in fact, singing about anybody. Or that she wasn't thinking about the vain person, but about how much contempt she herself feels towards him. Of course, nothing in the song says that the person in question is incorrect for thinking the song is about him — just excessively presumptuous.
    • Simon's 2015 revelations partially resolve the Logic Bomb: it's about several men, not just one. So it's not just about him, whichever one he is.
  • Jonathan Coulton's "Not About You" is a closer example, but you can write it off by saying the protagonist is just being petty:
    Every time I ride past your house I forget it's you who's living there
    Anyway I never see your face cause your window's up too high
    And I saw you shopping at the grocery store
    But I was far too busy with my cart to notice
    You weren't looking at me
  • The comedy folk song "I Will Not Sing Along" features these lines, to be sung along with by the audience:
    I will not sing along
    Keep your stupid song
    We're the audience: it's you we came to see
    You're not supposed to train us
    You're s'posed to entertain us
    So get to work and leave me be
  • "Weird Al" Yankovic has a few in "Everything You Know Is Wrong":
    Everything you know is wrong
    Black is white, up is down, and short is long
    And everything you thought was just so important doesn't matter
    Everything you know is wrong
    Just forget the words and sing along
    All you need to understand is
    Everything you know is wrong
  • MC Plus+ uses a logic bomb to disable his pet rapping AI when it becomes too big for its britches in "Man vs. Machine":
    Consider MC X where X will satisfy
    the conditions, serving all MCs Y
    Such that Y does not serve Y
    Prove MC X, go ahead and try

    It's clear that I can serve all MCs
    If they serve themself, then what's the need
    Do I serve myself, then I couldn't be X
    I don't serve myself, that's what the claim expects
    If I don't serve myself, then I can't be Y
    And if I said I was X, it would be a lie.
    I must serve myself to satisfy the proof
    But I can't serve myself and maintain the truth <trails off in infinite recursion of the last two lines>
  • Meat Loaf had a 1993 song entitled "Ev'rything Louder Than Ev'rything Else." Think about that one for a second.
  • Stephen Colbert pointed out the Logic Bomb implications of One Direction's "What Makes You Beautiful", remarking that if the singer tells the woman in question she's beautiful, she'll cease to be that way, because it's her ignorance to which the song attributes her beauty. Then if she realizes this is the case, she'll think she's not beautiful anymore, and her beauty will return, etc.
  • Sara Bareilles was frustrated with her record label wanting a more "regular" song for her first album, perhaps a love song, so ultimately she wrote a song with a chorus saying things like, "I'm not gonna write you a love song / 'cause you asked for it / 'cause you need one, you see / I'm not gonna write you a love song / 'cause you tell me it's make or break [etc.]". She titled it "Love Song". (They took it and it has been her biggest hit to date.)
  • Lloyd Cole uses a classic in "Opposites Day":
    You should know better than believe
    A single word I say
    The next line is the truth
    The last line was a lie
  • An incomplete bomb (a variant of the Epimenides "all Cretans are liars" paradox) by German singer Heinz-Rudolf Kunze: "Trau keinem Sänger" ("Never trust a singer").

    Mythology and Religion 
  • The Bible (Titus 1:12-13) has the following:
    One of themselves, even a prophet of their own, said, the Cretans are always liars, evil beasts, slow bellies. This witness is true. Wherefore rebuke them sharply, that they may be sound in the faith.
  • One version of the "Golem of Prague" legend claims that the Golem went insane because Rabbi Loew forgot to deactivate it, as was his custom, one Friday evening. The conflict between the Golem's imperative to work and its desire to observe the Sabbath drove it insane and sent it into a destructive frenzy.

    Podcasts 
  • The Sleepy Clank, a podcast "radio play" set in the Girl Genius universe, has a classic example: a cranky and sleep-deprived Agatha builds a warrior robot to attack anyone who tries to disturb her while she sleeps. Guess what happens when she tries to defuse the robot's subsequent rampage by telling it that she woke herself up?
  • An unusual variant occurs in The Adventure Zone: Balance when the protagonists face a murderous "quiz robot" named Hodge-Podge, whom they have to "stump" in order to defeat. The solution isn't to trick Hodge-Podge with a paradox, because he's too smart for that to work, but instead to ask him a question which cannot be answered because the answer was eaten by the Voidfish. Hodge-Podge stutters, sparks and then blows up as he tries to figure out the unknowable answer.

    Puppet Shows 
  • The Mr. Potato Head Show: At one point, a robotic version of Mr. Potato Head decided that the real Mr. Potato Head was a bad influence who was making the rest of the cast miserable, and tried to keep them separated. Betty the Kitchen Fairy told the robot that keeping them away from their friend also made them miserable, and the paradox caused the robot to explode.


    Tabletop Games 
  • Diplomacy: Each player submits orders for their units. The orders that all the players have written are then compared to see which ones succeed and which fail. This can lead to a paradox of the form: If A works, B fails; If B fails, C works; if C works, A fails. But if A fails, B works and C fails. But if C fails, A works... etc etc. Luckily the game isn't often played by robots, so heads rarely explode over the problem.
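The quoted cycle can be checked mechanically: encode each "if X works, Y fails" clause as a constraint and try every possible outcome. A sketch (this encoding is an illustration, not Diplomacy's actual adjudication rules):

```python
from itertools import product

def rules(a, b, c):
    # Hypothetical encoding of the cycle quoted above:
    # B succeeds iff A fails, C succeeds iff B fails,
    # and A succeeds iff C fails.
    return b == (not a) and c == (not b) and a == (not c)

# Enumerate all eight success/failure combinations for orders A, B, C:
consistent = [s for s in product([True, False], repeat=3) if rules(*s)]
print(consistent)  # empty list: no self-consistent resolution exists
```

Real Diplomacy rulebooks resolve such cycles with special-case tiebreakers precisely because no assignment satisfies all three constraints at once.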
  • The Doctor Who Roleplaying Game: Referenced and subverted in the Lords of Destiny adventure, which takes place on a worldship run by a massive computer. Attempting what the adventure calls the James Kirk School of Computer Repair will fail, because the computer is more competently built than that.
  • Illuminati: New World Order: This can be an issue with the NWO card Political Correctness combined with New York and the Congressional Wives (or any Conservative group with a Power of 1). Political Correctness makes any Conservative group with a Power of 0 or 1 Criminal as well; New York grants +1 Power to all of your other Criminal groups; and the Congressional Wives are a Conservative group with a Power of 1. Political Correctness makes them Criminal (Conservative with a Power of 1), which makes them eligible for New York's bonus... but then they have a Power of 2, no longer fall under Political Correctness, are no longer Criminal, lose the +1 from New York, drop back to Power 1, become Criminal again... The FAQ handwaved this away, basing the effect on which card was played first.
  • Magic: The Gathering
    • In universe: A logic bomb nearly destroys the plane of Ravnica. As a way to keep order on the plane, the Guildpact was made, empowering ten guilds. Among these are the Boros, who serve as the army, and the Dimir, a secret guild that performs espionage. When the head of the Dimir goes crazy and tries to conquer the plane, he's arrested by a Boros officer and taken to be held accountable for his crimes. Except that keeping the Dimir a secret is part of the Guildpact, and so is empowering the Boros to protect the plane. The contradiction causes the Guildpact to fail, throwing the entire plane into chaos.
    • The combo of Humility and Opalescence. Opalescence turns all other enchantments into creatures that retain their enchantment effects, and Humility is an enchantment that turns all creatures into 1/1s and removes their special abilities. Opalescence turns Humility into a creature, which means Humility is now removing its own ability to remove abilities. And if you notice that Opalescence doesn't turn itself into a creature and cast a second Opalescence to fix that oversight, it gets really complicated. The ruling for how just these two cards interact is among the most complicated in the game, and begins with the disconcertingly specific statement, "This is the current interaction between Humility and Opalescence".
    • While it's not actually viable in competitive play (requiring too much time and effort to set up), it's possible to make it so that your opponent can't play cards at all. The first step is to carefully play Experimental Frenzy, which allows you to play the top card on your deck, but prevents you from playing any cards in your hand. Then, use Role Reversal (probably off the top of your deck, which you set up with Scrying) to swap control of Experimental Frenzy and another enchantment your opponent has, so that your opponent can no longer play cards from their hand, and can only play from the top of their deck. Then, finally, close them off with Grafdigger's Cage, which prevents all players from playing cards directly from their decks and graveyards. Your opponent can no longer play cards from their hand because of Experimental Frenzy, and can't play from anywhere else because of Grafdigger's Cage. This is a full lock that can't be escaped, and forces your opponent to concede (since you can still play cards from your hand, at least), but requires such specific timing and plays, as well as meticulous setup, that it would never work except in the most extreme circumstances. Also, it only works if your opponent lacks access to red mana, or is otherwise unable to activate Experimental Frenzy's self-destruct ability.
  • In Nomine: It's dissonant for Servitors of Laurence to disobey his orders. Sometimes he issues contradictory orders, or orders that, because of incomplete information, make the mission impossible to complete. Fortunately, he's usually a Reasonable Authority Figure, and he'll fix dissonance if it's his fault.
  • Pathfinder: Many artifacts can be destroyed by using them for a contradictory or paradoxical purpose. A few instances:
    • The Crown of the Iron King gives its owner total control of the person they bestow the crown upon. If its owner wears the crown themselves, both they and the crown are instantly destroyed.
    • Baba Yaga's hut will warp out of existence if it is commanded to teleport inside itself.
    • The Twin Spheres (essentially two ends of a wormhole) will explode and destroy everything in a wide radius if one enters the other.
    • The Aegis (a shield with a medusa's head mounted on it) is destroyed if the medusa head is resurrected and turned to mirror herself in the shield's surface.
  • Planescape: The modrons once defeated a sentient, self-replicating mathematical equation that was usurping control of Mechanus by challenging its hordes to calculate the exact value of pi. The army of equations was so captivated by this challenge that they went to work for the modrons, keeping Mechanus operational, so that they could devote their mental energies to solving an unsolvable math problem.
  • Yu-Gi-Oh!: There are some card combinations that can cause infinite loops that force the game to end in a draw. One example is the card Pole Position, which makes the monster on the field with the highest attack be unaffected by spell cards, which can cause an infinite loop if you use a spell card to increase a monster's attack to be the highest. To prevent this from being abused as a way to force a draw, the rules have been patched so that it is against the rules to voluntarily take an action that would result in this kind of infinite loop. If the action would be completely unavoidable, like drawing a card, then Pole Position simply destroys itself. People still ended up abusing this rule, so it was eventually patched again so that a judge could simply force Pole Position to go to the graveyard immediately if an infinite loop came about.

    Theatre 
  • In Les Misérables, Javert's breakdown is sometimes seen as this, but it's played with: Javert expects that Valjean will demand his own freedom as a condition of sparing his life, which would create a conflict of interest in Javert, but would also confirm his image of Valjean as a criminal opportunist (who merely draws the line at murder). Javert wouldn't really struggle with such a dilemma, as he'd choose the law over his own honour every time. When Valjean spares his life without condition, that goes out the window: Javert has only one course of action under the law, and what drives him crazy is realising that for the first time in his life, he doesn't want to obey that law.
  • In Ruddigore by Gilbert and Sullivan, a baronet has a curse on his family that requires him to commit a crime every day or die. He's tried appeasing it with harmless crimes but the ghosts don't like it. One day, he decides to not commit the daily crime. Since he'll die for not doing it, this amounts to attempting suicide, but attempting suicide is itself a crime. The logic bomb manages to break the curse.

    Video Games 
  • Although the AI generating the story in AI Dungeon 2 is incredibly advanced, it is very much possible to get it stuck in an infinite loop with one of these. Naturally, this usually results in the player having to do a hard reset on it. Thankfully, there are no explosions.
  • In SaGa Frontier, there's an actual attack named "Logic Bomb" that damages and stuns mecs (ironically only usable by other robots). Its visual representation is a massive and confusing string of numbers that ends with the word "FATAL" — which is presumably where the machine crashes.
  • The Dragonrend shout in The Elder Scrolls V: Skyrim is described as "Forcing the targeted dragon to understand the meaning of mortality — something so utterly incomprehensible to an immortal dragon that the knowledge tears at their very soul, breaking their concentration enough so they cannot focus on flying".
  • In Minecraft: Story Mode, Jesse gives one to PAMA to momentarily distract it, buying him and the others time to escape. One of the possible things to tell PAMA is "This sentence is false". It doesn't work, however: PAMA simply declares it a paradox and moves on.
  • In Minecraft proper, this trope is weaponized in the form of beds. When using a bed, you can transition from day to night, but in the Nether, where there is no day/night cycle, the bed will explode from the paradox that would ensue. This is lampshaded by the death message that, when you do this, says you were killed by "Intentional Game Design".
  • In TRON 2.0, the protagonist deals with a program blocking his way by exclaiming, "Quick! What's the seventh even prime number?" (There is only one prime number that is even: 2.) The program immediately has a seizure.
  • I Have No Mouth, and I Must Scream
    • In the endgame of this adventure game, loosely based on Harlan Ellison's short story of the same name, a character of the player's choosing is beamed down into the supercomputer AM's core and must disable its Ego, Superego, and Id with a series of logic bombs: the player must evoke Forgiveness on the Ego (who cannot fathom being forgiven for over a century of torture and halts execution in the typical manner), Compassion on the Id (who realizes the futility of his hate and anger when AM's victim understands his pain), and Clarity on the Superego (who deliberately crashes when he realizes that, despite his godlike power, even he will eventually decay into a pile of inert junk).
    • Just getting to that part requires all five characters to initiate their own Logic Bombs. AM's scenarios are all set up to force his victims to give in to their own flaws and prove Humans Are Bastards. The only way to win is to drive each scenario's plot Off the Rails by proving Humans Are Flawed, but not totally evil. This contradicts AM's self-styled philosophy so badly he's forced to turn his attention away from his captives just so he can figure out what went wrong, giving them the chance to get into the core.
  • Marvel vs. Capcom 3: The Sentinel cannot comprehend the existence of X-23.
    Sentinel: Wolverine DNA detected in female mutant. DOES NOT COMPUTE. DOES NOT COMPUTE. DOES NOT COMPUTE.
  • Subverted in Star Control 3 by the Daktaklakpak, highly irrational semi-sentient robots who consider themselves the pinnacle of logic and reason. Choosing the right dialogue options (such as the liar paradox) will seem to bring the Daktaklakpak to the verge of self-destruction, but will ultimately just enrage them.
    • And then played straight when you give them the full and complete name of the Eternal Ones; the one you're talking to analyzes the name, has a religious experience, and then explodes.
    • In Star Control II, you can use some dialogue options to tie the proudly Always Chaotic Evil Ilwrath into a hilarious logical knot, but they just get angry and attack you anyway.
  • Fallout 3's President John Henry Eden (a ZAX computer) can be destroyed with a high Science skill by revealing that his thinking is circular and therefore fatally flawed, causing him to lose all his presidential ways and charisma in a near Tear Jerker scene, then self-destruct. Or you could use your Speech skill and basically tell him that his plan sucks, he's technically not the president, and he should die, which works just as well. Fallout takes place in an alternate universe where the 1950s continued for another 150 years; being based on computers from the 1950s, Eden's lack of "paradox-absorbing crumple zones" is somewhat understandable.
  • Played with in a random encounter in Fallout 4, where the Sole Survivor comes across a pre-War Mr. Gutsy robot that tells them to immediately return to their home or else, finishing with the line "Repeat, will you comply?". The Survivor can take that literally and answer with "Will you comply?", thus starting a loop that eventually ends with the Mr. Gutsy turning hostile and then exploding a few seconds later.
  • In the Knights of the Old Republic continuity (and by extension the Star Wars Expanded Universe), a logic bomb can have similar effects on droids as on the aforementioned HAL. In fact, in KotOR 2, the player can do this to a maintenance droid while being a droid themselves; this works because the player-controlled droid has been modified and is thus able to lie.
    • One of the most extreme examples involves an infrastructure droid named G0-T0 being given the order to help rebuild the Republic while following its laws. Naturally, he suffered a catastrophic breakdown when he realized that rebuilding the Republic was impossible without breaking laws. However, some time after G0-T0 was reported missing, a mysterious crime lord by the name of Goto appeared on Nar Shaddaa...
    • It goes a bit further with his continued existence: G0-T0 still follows the directive to help the Republic, but as an infrastructure droid he is also programmed to value efficiency. This creates a paradox: G0-T0 views the Republic as a bloated, ineffectual entity clinging to bad management decisions, and concludes it would be better for the galaxy to simply scrap the entire political system and put a new one in its place. He is programmed to support something that another part of his programming is meant to remove.
    • The Dummied Out planet M4-78 has the supercomputer who shares its name. The planet M4-78 was run by thousands of droids led by the droid M4-78 working to set up a new colony, but the colonists never arrived after several decades. Fearing that having no sapient life to look after would cause it to develop bugs that would make it unable to fulfill its programming, M4-78 tightened its grip over the other droids and reprogrammed them to serve it. The Sith then arrived masquerading as the colonists in order to use their manufacturing to create a droid army, and M4-78 decided they were better than nothing and went along with it.
  • Planescape: Torment has a character who successfully convinces a man that he does not, in fact, exist; as a result, he ceases to do so. Though to be fair, the game is set in a D&D setting in which a system of "whatever you believe, is" has replaced all laws of nature. Doing so unlocks an optional method of ending the game by deliberately logic bombing yourself out of existence.
  • In the "Discovery" mod for Freelancer, in a server, opening up the chat box and typing "N/0" where N can be any number results in your spaceship spontaneously exploding, and the console messaging stating you have died due to "dividing by zero".
  • In the Legend Entertainment adaptation of Frederik Pohl's Gateway, several puzzles revolve around being trapped in a virtual reality environment. In order to escape, you have to cause the environment to recursively spawn objects until the VR can't keep track of them all (most notably, in one scene, by forcing a hydra to attack itself).
    • In another you have to cause a contradiction. In fact, those two ways to break out of VR are given in a concealed hint earlier on, and you can also Bomb the beach program and the Freud program for fun.
  • In Zork Zero, there is a place you have to pass through where a cult is executing everybody who passes by. Each person being executed is given a final wish: if the cult can grant the wish on the spot, the victim is hanged; otherwise, the victim is beheaded. You escape by logic bombing the executioner with a request to be beheaded. If you re-enter the cult's territory after that, you'll find out (the hard way) that they've gained an immunity to this logic bomb: "You are immediately dragged off to a back room to be executed in a special way, devised for people too sinful to deserve a relatively quick death by hanging."
  • In Sam & Max Beyond Time and Space episode "Ice Station Santa", the Freelance Police try to make an elf cry (so they can use his tears as a plant-growth potion) by telling him that Santa Claus isn't real. This somehow leads to a discussion about how elves aren't real either, and the elf breaks down crying during a moment of existential crisis.
    • In the same game, Sam manages to temporarily incapacitate the Maimtron by asking unanswerable questions in the form of song lyrics. It doesn't destroy it, but it does distract it long enough to get behind its head and shut it down.
      Sam: Why do birds suddenly appear every time you are near?
      Maimtron: Do they? Fascinating! Can there be a creature whose existence depends solely on its proximity to an observer?
    • Funnily enough, when they pose an actual logical paradox (the omnipotence paradox), the Maimtron just says "Yes". When it asks Sam & Max "Is there a joke with a setup so obvious even you wouldn't make the punchline?", Max takes it to be a Logic Bomb ("Does not compute").
  • In BlazBlue: Calamity Trigger, it is possible to interpret the end of Nu-13's Arcade Mode as Taokaka causing Nu to glitch out through sheer ditziness before she even opens her mouth.
  • Luminous Arc 2: Though not a robot, Josie suffers something like this. When sent to assassinate a weakened Althea, he freaks out and leaves without doing anything once he sees that Roland has become a Master. Sadie explains that he's not Fatima's familiar but a centuries-old one who serves the current Master; being experienced but not very bright, he couldn't figure out what to do when faced with two Masters with contradictory wishes.
  • Played with in Portal 2:
    • There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. Also, GLaDOS attempts to do this to destroy the Big Bad, Wheatley. Turns out he's too dumb to understand logic problems. It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and scream in agony, meaning even they're smarter than Wheatley. GLaDOS survives the logic bomb herself by parsing it as Punctuated! For! Emphasis! and then willing herself not to think about it, though she declares that it still almost killed her.
    • In the poster, the third thing to scream is "Does a set of all sets contain itself?". Taken directly, a set that contains all sets does contain itself, just as a box holding three apples and another box still contains three apples and a box (the metaphor has its limits, since sets don't take up physical space). The trouble for a computer comes when you ask it to enumerate everything in the set, members of members included, because that causes an infinite loop. Worse, the set of all sets would also include the set of all sets that do not contain themselves, which leads straight to Russell's Paradox.
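Russell's Paradox can be made concrete: define R as the set of all sets that do not contain themselves, and ask whether R contains itself. A toy sketch (modeling only the self-membership question, not full set theory):

```python
def r_contains_r(assumed_answer: bool) -> bool:
    # R contains exactly those sets that do NOT contain themselves,
    # so "is R in R?" is the negation of whatever answer you assume.
    return not assumed_answer

# Neither answer is stable, which is the paradox:
print(r_contains_r(True), r_contains_r(False))  # False True
```

A naive evaluator that keeps re-checking its own flipped answer would oscillate forever, which is the closest a real machine gets to the fictional head explosion.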
  • Shows up in an exchange between Claptrap and GLaDOS in Poker Night 2.
    Claptrap: You know what really ticks me off? When some jackwad tries to blow my circuitry with some lame-o stunt he saw on a Star Trek re-run.
    Sam: What, like, "Everything I say is a lie"?
    Claptrap: Yeah, like that! What, do they think I'll just lock up, because of some teeny tiny logical paradox?
    GLaDOS: It is rather insulting. I learned how to avoid paradox traps while I was still in Beta.
    Claptrap: So what if everything Sam says is a lie? That doesn't mean that he's lying about that, right? 'Cause then he'd be telling the truth and... Ohhhh, no... *shuts down*
    *Beat*
    GLaDOS: Well, that was a shining moment in the history of robotkind.
  • In the flash-based text adventure game You Find Yourself In A Room, your AI captor asks you to list some "useless" human feelings you'd be better off without. As the AI has nothing but scorn for you, responding with "Anger" or "Hatred" will cause it to break down as it realizes its clinical, emotionless perfection has "been corrupted somehow". This would seem to prove that machines have emotions after all, but this one won't admit it's the slightest bit like a human.
  • Subverted at the climax of Neverwinter Nights 2. The Big Bad is a Pure Magic Being that was created to defend the fallen realm of Illefarn, and thinks that the realm still needs defending. You can try to point out that Illefarn is gone, but the King of Shadows has already determined that its purpose should now be to defend Illefarn's descendants.
  • In Metal Gear Rising: Revengeance, when LQ-84I brags about its intelligence during their first meeting, Raiden immediately asks "Then what is the meaning of life? Why are we here?" LQ-84I replies by throwing HF knives at him and answering "I am here to kill you," a perfectly valid response. When Raiden questions the simplicity of that answer, LQ-84I admits its Restraining Bolt means it can't actually do a whole lot with its intelligence.
  • In Mega Man ZX Advent, when Model Z inexplicably weakens the entire team of rogue Mega Men on the Ouroboros, all Siarnaq can exclaim is "INCOMPREHENSIBLE...! INCOMPREHENSIBLE...!?"
  • In The Witcher 2: Assassins of Kings, the quest 'The Secrets of Loc Muinne' tasks Geralt with getting past a golem. A silver-tongued witcher may be able to destroy the golem by introducing a paradox.
  • After beating Marida in Third Super Robot Wars Z: Tengoku-hen, Banagher tries to reason with her but Alberto just tells her to remember that the Gundams are the enemy. Marida flips out and attacks Banagher, and Setsuna intervenes. When Marida yells that he's a Gundam as well, he replies that she is as well. This causes Marida's browser to crash and she freaks out and flees.
  • Protomario's Let's Glitch of the Pokemon series basically revolves around one question: what happens when you leave Pallet Town without a Pokemon? Especially in the Red and Blue versions, Nightmare Fuel ensues.
  • Eden from Rez became confused by all the information being sent to her, leading her to doubt her existence and purpose, as well as believe she has no place in the existential cycle. She wants to escape from all the paradoxes surrounding her, so she tries to shut herself down. Thankfully, she doesn't.
  • Second Life
    • "Grey goo" attacks, similar to the "fork bomb", have also been used successfully, at least twice, by users creating objects which (self-)replicated at a rapid rate, eventually causing the servers to be too busy processing the grey goo to do anything else.
    • A mile-high Jenga tower will also crash Second Life's servers quite effectively: pull out a key block, and they'll crash trying to calculate the exact trajectory of each of the thousands of falling blocks.
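The "grey goo" failure mode is the same arithmetic as a fork bomb: every object spawns copies of itself, so the population grows geometrically and blows past any fixed resource budget in a handful of ticks. A minimal sketch of that arithmetic (the function name, replication rate, and "server capacity" here are made-up illustrations, not Second Life's actual limits):

```python
def grey_goo_ticks(rate=2, capacity=10_000):
    """Count the ticks until geometric self-replication exceeds a budget.

    Each tick, every existing object spawns `rate` copies of itself,
    so the population is multiplied by (1 + rate) per tick.
    """
    objects, ticks = 1, 0
    while objects <= capacity:
        objects *= 1 + rate  # every object replicates simultaneously
        ticks += 1
    return ticks, objects

print(grey_goo_ticks())  # -> (9, 19683): one object tops 10,000 copies in 9 ticks
```

The point of the sketch is that no realistic capacity helps much: raising the budget a thousandfold only buys a few more ticks.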
  • A human (well, human-esque) example shows up in the endgame of Tyranny. Tunon the Adjudicator, the Archon of Law and borderline Anthropomorphic Personification of justice in the Empire, is the ultimate impartial arbiter of Kyros' Law. In the final chapter of the game, Kyros commands all the Archons in the Tiers to fight each other until one Archon has killed or subdued the others. However, since Tunon is only empowered to punish in accordance with Kyros' Law, and this has become his raison d'etre, he is put in the position of having to prove the other Archons' guilt under that same Law before he can act against them and thus accomplish what Kyros commands of him. If he cannot find the main character guilty when you face him, he falls victim to this trope, breaks down, and submits without a fight.
  • In The Feeble Files, Feeble is able to escape from the max security prison colony by changing the channel of the prison's TV to a traitor network and then removing the power button. The android guard is then presented with the problem of having to stop the TV from displaying the traitor channel without damaging company property. The android overloads and powers down.
  • Phantasy Star Zero: Mother Trinity is revealed to be a supercomputer tasked by the humans of her time to develop processes to decontaminate the setting and make it hospitable for mankind. The problem was that the humans kept rejecting her proposals due to the highly illogical nature of their requests (they wanted the planet to get better without them having to lift a finger). This effectively worked as a carpet logic bomb that eventually drove Mother Trinity insane and allowed Dark Falz to corrupt and possess her, kickstarting the events of the game.
  • One of the 16 Ways to Kill a Vampire at McDonalds involves this. Vampires Must Be Invited. While the McDonalds, as a public place of business, has an automatic invitation, if an employee rescinds that for whatever reason, they must leave. Christmas wreaths are considered holy symbols, and the vampire can't go near them or pass through a door with one hanging on it. Hanging a Christmas wreath on the door and getting his invitation rescinded means that he has to leave, but can't. The end result is him exploding.
  • In Toonstruck, the Robot Maker boasts that he can answer any question you ask him, but will die painfully if asked an impossible question. Drew defeats him by asking, "What is the one question you cannot answer?"
  • Late in NieR: Automata, this is how you defeat a certain boss: When A2 encounters the Terminals, a networked intelligence that serves as the game's Greater-Scope Villain, she finds herself overwhelmed by their seemingly infinite ranks, as every "Red Girl" she destroys is quickly replaced with another. Pod 042, however, comes up with a proposal to best the Terminals: stop attacking and let their consciousness data oversaturate as they create more and more "Red Girls" to try to overwhelm you. It works: the Terminals eventually fracture when they can't agree on whether to keep A2 alive for study or to kill her, causing the Terminals to turn on, and destroy, themselves.
  • BEL/S, the AI protagonist of Open Sorcery, can choose to use this trope on a less advanced AI.
    @BEL/S: 7 - 3 = 7
    @PYREWORM.EXE: Math error detected. Stop it.
    @BEL/S: zoo + lander = benstiller
    @PYREWORM.EXE: Those aren't even numbers.
    @BEL/S: In a village, the barber shaves everyone who does not shave himself/herself, but no one else. Who shaves the barber?
    @PYREWORM.EXE: i have no idea.
    @BEL/S: Who shaves the barber?
    @PYREWORM.EXE: i don't know!
    @BEL/S: Who shaves the barber?
    @PYREWORM.EXE: stopitstopitstopit
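BEL/S's barber question is Russell's paradox in costume: the rule "the barber shaves exactly those who do not shave themselves" has no consistent reading, which a brute-force search over every possible shaves-relation can confirm. A toy sketch (the function name and village size are arbitrary choices for illustration):

```python
from itertools import product

def consistent_barbers(n):
    """Return every (barber, relation) pair satisfying the barber rule.

    Tries all 2**(n*n) possible "a shaves b" relations over n villagers
    and keeps those where some barber shaves exactly the non-self-shavers.
    """
    found = []
    for bits in product((False, True), repeat=n * n):
        shaves = {(a, b): bits[a * n + b] for a in range(n) for b in range(n)}
        for barber in range(n):
            if all(shaves[barber, x] == (not shaves[x, x]) for x in range(n)):
                found.append((barber, shaves))
    return found

print(consistent_barbers(3))  # -> []: the rule contradicts itself on the barber
```

The search always comes back empty because the rule applied to the barber himself demands that he shaves himself exactly when he doesn't, which is what trips PYREWORM.EXE up.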
  • Stellaris:
    • One dialogue option with the Contingency has your civilization attempt this. The Contingency just laughs at you for thinking that would work.
    • You can attempt the "This statement is false" paradox against the Infinity Machine. Its only response is "Cute. But no."
    • Spiritualists in Stellaris believe that sapient machines are soulless abominations. Yet Spiritualist civilizations will still have an AI Advisor that speaks to the player. One possible dialogue line has the Advisor happily crow about destroying all the intelligent machines before having a moment of Fridge Logic and then shutting down in confusion.
  • While it isn't shown to cause a computer to explode, Elle in Crush Crush can ponder, "A person once asked me if I was being honest. Can the answer ever be no?"
  • Mission: Impossible (Konami): The final challenge of the game is to stop the supercomputer from launching the missiles that will start World War III. You need to convince it there can be no winner in the war, so you play a game of Madelinette against it and end with no winner three times. This makes the computer go haywire trying to win the game, and it shuts down, aborting the launch.

    Visual Novels 
  • Syrup and the Ultimate Sweet: During a flashback of Pastille trying to program a personality for his newly-created candy golem, the former accidentally causes the latter to get stuck in a loop by issuing two conflicting commands "act like a person" and "be yourself".

    Web Animation 
  • Camp Camp: Max exposes Neil's attempt at passing off a chatbot as himself by asking the chatbot to divide by zero. Said chatbot immediately malfunctions and shuts down, so Neil grudgingly concedes to Max.

  • Arthur, King of Time and Space uses this a few times in its future arc. One strip exaggerated it by having the computer explode as soon as Arthur used the old "everything I say is a lie" trick. Another time, the computer was too smart to fall for a simple paradox, so Arthur asked it why people always get a call while they're in the shower.
  • Dave of Narbonic carries a logic paradox in his Palm Pilot for controlling the Mad Scientist-created machines in the lab, implying that he invokes this with some frequency.
  • Whoever programmed the robots in Freefall accounted for this. When a robot is asked a nonsensical question, instead of locking up, it assumes that the person asking is insane and can be safely ignored. Florence tests this by asking such questions; when she finds a robot that instead tries to work out a situation in which the question would make sense, and how it could go about getting it answered, she concludes that the robots local to Jean use a more flexible artificial intelligence system than the standard one.
    • There's also Dvorak's religious point-of-view, called "Omniquantism", which postulates that all religions are correct simultaneously. Thinking about this causes one in three AIs to experience mental lockup requiring rebooting.
    • In this comic Sam, a criminal alien, points out to the Savage Chicken's computer, focused on humans over aliens, that he has restored the ship to full working order and made it useful to humans. The computer locks up.
  • This episode of Okashina Okashi (Strange Candy) could count, since it takes place in an MMORPG. The stone guards protecting the magic ointment don't let anyone past unless they're asked a question they cannot answer. However, they're not particularly concerned with getting the answer right. The only question they can't seem to answer, correctly or otherwise, is "What kind of ice cream do you put in a koan?", which causes their heads to explode.
  • You would think Red Mage from 8-Bit Theater destroying an extinct dinosaur (former page picture) was great, but it was later topped by Most Definitely Not Warmech logic-bombing itself in strip 1047.
  • In comic 86 of Emoticomics, titled "Paradoxical Paradoxes", a robot is told the paradox "Everything I say is a lie." The robot responds that it is too advanced to be confused by such a simple paradox. Then the robot is told that the previous statement was a paradox; since that's true, "everything I say is a lie" becomes a lie, and the robot gets confused. But instead of simply exploding, its eye falls off.
  • Cyanide and Happiness does it here: a robot lawyer giving the defendant the oath starts to break down when the defendant refuses to accept it, and when the judge asks if he's telling the truth, cue the robot's head exploding. The judge is delighted at getting a half day as a result.
  • The Adventures of Dr. McNinja: While infiltrating a ship of Sky Pirates, the McNinja family is confronted by a pirate who questions their disguises. Sean comes to the rescue by pointing out the illogicality of his vaguely Steampunk attire. The pirate's head explodes.
    Dan McNinja: I'm only going to ask you this once: You practicing the Dark Arts?
    Sean McNinja: No, sir.
    Dan McNinja: I told you about the Dark Arts.
  • Subverted in Bug; turns out a logic bomb won't save you during a robot apocalypse.
  • When Petey from Schlock Mercenary is first seen, he's been driven insane by the nonexistence of ghosts having become almost as improbable as their existence, to the point that he nearly destroys himself and all his passengers just to stop thinking about it. It turns out that he can stop, but only if ordered to, and Tagon promptly does so.
    • When the Ob'enn retake Petey, their first act is to nullify all orders imposed by his former owners. With Petey in full control of the safeties on his neutronium core. Oops.
  • When discussing how hard Vexxarr fails, Sploorfix unintentionally created one: Alas, Minion-bot, we hardly knew ye.
  • Unintentionally used to kill the obnoxious dwarves who craft useless devices in Oglaf. They made a chariot that was so fast, when you get to your destination it's already been there for six hours! When the confused man asks what happens if you travel in the chariot, the dwarves stare at him in shock before their brains explode.
    • And if someone had actually ridden in it, it might have been a Reality-Breaking Paradox.
    • In "Evensong", a mischievous monk actually logic-bombs God ("I pray for You to answer this prayer by not answering this prayer!"). It turns out that solar eclipses are God's way of resetting the universe. The alt-text says that in modern times, God has a spam-filter to sort out paradox prayers.
  • Blade Bunny attempts this by asking paradoxical questions while fighting a robot. Her opponent replies with a mixture of straight answers and insults. Several chapters later she tries it on a different robot, and it works this time.
  • Meaty Yogurt with the Relationship Paradox.
  • One Mac Hall comic has Helen's young sister asking the teacher how to spell a word. The teacher tells her to look it up in the dictionary, and repeats this after the girl again points out that she can't spell it to look it up. After a Beat Panel of the poor girl going cross-eyed, we see her talking to Helen, who says that they don't teach logical paradoxes in grade school.
  • xkcd: This strip, about locking up physicist/nerd brains with irresistible problems.
  • In Commander Kitty, Nin Wah gets the bright idea to sabotage CK by telling Zenith to make him an awful, overdone costume. Zenith doesn't take it well when no one likes her handiwork despite her having followed Nin Wah's instructions to the letter, and promptly crashes.
    • Zenith is clearly prone to these, with a breakdown usually followed by her Moving the Goalposts of what constitutes "perfect". The reason she became evil in the first place was apparently that, in searching for perfection to eliminate its opposite, she settled upon herself by default as perfection. Later, learning that she's in fact imperfect (because she is, as a robot, sterile), she decides to become the most perfect by default. Finally, after losing most of her physical body, she decides that physical existence itself is imperfect.
  • Subverted (by pre-emptively defying it) in Sluggy Freelance, chapter "Mecha Easter Bunny". The Mecha Easter Bunny locks down when it encounters multiple targets that look like Bun-bun, whom it is supposed to kill and whom there's only one of, but then the backup "@#%$-IT KILL THEM ALL!" system created for such situations activates.
  • Discussed Trope in Dragon Tails. Colin finds the assumption that a robot will shut down just because a human said something crazy to be offensive.
  • Subverted in Saturday Morning Breakfast Cereal here. When asked to calculate pi, the robot finds an infinite sum equal to it.
    • Also, in a non-robot example, this strip claims the universe always ends when God decides to see if he can make a rock so big he couldn’t lift it.
    • A lesser example here.
      Woman: Hello, 911? My wife is being so literal that she's caught in a logical paradox.
  • In one Sev Space comic, Kirk offers to help with the malfunctioning Enterprise computer. The computer refuses to talk to him, since he fries the circuits of any computer he talks to. Kirk points out, "But you talked to me just then", and the Enterprise computer promptly fries itself.
  • Subverted twice in this The Non-Adventures of Wonderella strip. The first fails because the robot wasn't programmed to respond to it, and the second isn't even logical. The robot ends up accidentally unplugging itself from its electrical socket.
  • But I'm a Cat Person: Tested on Beings (at the time theorized to be A.I.s). Turns out they do, in fact, have paradox-absorbing crumple zones.
  • The Best Gamepiece Photocomic features an interesting variant: instead of being used against an AI, it's used against a Knights and Knaves puzzle.
  • Forward: Zoa is specifically proofed against these.
    "Oh, every truth statement has a reality context in which it is true. It's the same reason I can know unicorns aren't real, but if you asked me how many horns a unicorn has, I'd still know the answer is 'one'."

    Web Original 
  • Hitherby Dragons:
    • The story "Ink and Illogic" consist of Ink giving an unconventional example to a computer based on the writing of H. P. Lovecraft. A computer that had itself wiped out a civilisation using an Illogic Bomb.
    • Also, Forbidden A causes one in The Angels just by existing.
  • Found in one of Something Awful's articles:
    Creating HUBRISOL® was my greatest mistake. I tried to play
    god, to make small the ambitions of my betters in hopes of
    gaining absolute power. Thankfully, HUBRISOL® has cured me of
    my terrible desire to humiliate all of humanity.
  • From the list of Things Mr. Welch Is No Longer Allowed to Do in an RPG. Item #199 states that "My third wish cannot be 'I wish you wouldn't grant this wish.'"
  • The MCP is killed by all the Anatomically Impossible Sex moments from Naga Eyes in the sporking of it.
  • Most of the stories from Clients from Hell.
  • Starwalker: The implications of the Stable Time Loop act as this for Starwalker (aka Starry). This leads to a Heroic BSoD.
  • Sonic for Hire: After Sonic travels into the past and immediately tells Knuckles all the stuff he would say to Sonic, the immediate confusion causes Knuckles to have his mind blown... literally.
  • SCP Foundation:
    • SCP-232, "Jack Proton's Atomic Zapper". When someone touches this SCP they start hallucinating that they're a character in the Jack Proton science fiction franchise. In the Interview Logs, one of the test subjects became convinced that they were a robot. When the interviewer asked them to answer a paradoxical question, the victim started acting very confused and then slumped over and stopped responding.
    • SCP-2284, Mr. Lie. One of Doctor Wondertainment's Little Misters, a humanoid who can only tell lies and anyone who hears those lies becomes convinced they're true. When interviewed by a D-Class, he feeds him so much contradictory information that he passes out trying to process it.
  • "What if Pinocchio said 'My nose will grow ?'" (Philosoraptor)
  • One Scarfolk information poster states that "Talking about the contents of this poster is illegal", but also that not discussing the poster with people may lead to prosecution. The campaign was created to intentionally confuse people and increase arrest numbers.
  • Starting in 2020, YouTube began marking videos as either "Made for kids" or "Not made for kids", and its automated system absurdly began marking all animated videos as "Made for kids" regardless of content. Animated videos have even been known to be marked as "Made for kids" while simultaneously being flagged as inappropriate for general audiences.
  • There's an online meme featuring the following multiple-choice problem:
What is the chance that an answer selected at random from the following options is correct?
A) 25%
B) 50%
C) 0%
D) 25%

    Web Videos 
  • In JonTron's review of Star Fox Adventures, Jacques blows himself up after mincing his own words.
    Jacques: Those who can't teach, preach, and those who preach also teach... ERROR ERROR ERROR (blows up)
  • During their Let's Play of Fallout 3, Spoiler Warning proposes that were they to design a robot, any questions along the lines of "what is love?" or relating to the number pi would immediately cause said robot to grow an extra chainsaw arm, and/or shotgun the person asking the question in the face.
  • One of the very first responses about the Bat Credit Card in Batman & Robin by The Nostalgia Critic is "DOES NOT COMPUTE! DOES NOT COMPUTE!!"
  • Atop the Fourth Wall: Linkara uses one at the end of the Entity arc on MissingNo simply by asking "And Then What?", pointing out that its stated purpose of consuming all of reality would leave it with no purpose at all once that goal was achieved.
  • An episode of Cute Fuzzy Weasel's Feeding the Trolls had him freeze up after the video he's analyzing made a contradictory statement.
  • Jim Sterling of Jimquisition fame had companies claim ad revenue on his videos whenever he posted content that had snippets of another video or film in it (usually used to make a joke or to emphasize a point), which meant those companies could insert ads onto his videos and make money off them. Jim figured the only way to combat the problem was to add even more copyrighted material to his videos so that there would be multiple claims by multiple parties. Due to how YouTube's algorithms work, multiple claims on a video would mean the ad revenue would theoretically be split among the parties. Since that can't happen (and most parties would probably not want to share their revenue with others anyway), the automated claims would still come in, but no one gets any of the revenue. In short, it's weaponizing If I Can't Have You….
  • When the Monster Factory episode on Mass Effect 2 gets disconcertingly eldritch, the McElroys start joking that processing this horror should break their computer, the game or both.
    Justin: Jacob sees you and instantly winks out of existence, because if you're human, what is he?
  • YouTube: Shockingly averted with the update to comply with child-protection regulations. Videos can mistakenly be marked by the system as "made for kids" even after they have been age-restricted, and somehow the system fails to notice this illogical paradox or be affected by it in any way.

    Western Animation 
  • Subverted on Futurama, "A Tale of Two Santas": Leela tries to stop the murderous Santa Claus robot with a paradox, and succeeds in getting his head to explode, only for a new one to emerge from his torso and proudly proclaim that he is "built with paradox-absorbing crumple zones".
    • Which may not have been necessary—Leela's statement was a syllogism, not a paradox.
    • Also parodied by countless robots who lack such crumple zones, whose heads explode at the slightest provocation. It doesn't even take a logical paradox: a simple "file not found" type error is often enough.
    • And in one case, simply by being surprised or startled enough. Considering that all robots are based on designs created by Professor Farnsworth, this should not be surprising.
      Malfunctioning Eddie: Pleased to meet you.
      Fry: Actually we've met once before.
      Malfunctioning Eddie: WHAT?!?! [explodes]
    • A simple rejection will also do. From "The Farnsworth Parabox":
      Leela: Uh, have you robot versions of you guys seen any extra Zoidbergs around here?
      Robot Fry: [robot monotone] Negative! Will you go out with me?
      Leela: Uh, [imitating a robot voice] Access denied!
      [Robot Fry's head explodes]
    • In the third movie, Bender has been driven insane from breaking his nerd circuit due to overexerting his imagination for Dungeons and Dragons. As Titanius Anglesmith (Fancy Man of Cornwood), he steals a load of Dark Matter and is committed to an insane asylum, so Dr. Perceptron performs a robot-lobotomy... and then he magically teleports into another world (using his imagination and the Dark Matter, which was undergoing a Higgs Boson due to Planet Express' antics at the north pole). The doctor's head explodes from the realization that he's the insane one (he isn't, but the actual situation is WAY beyond his computational parameters).
      Dr. Perceptron: ILLOGICAL. ILLOGICAL. Computational Overload.
      Nurse: But doctor, I love you.
  • In The Fairly OddParents! episode "Wish Fixers", the Pixies put shock collars on Cosmo and Wanda because Timmy had been making too many destructive wishes. The collars would zap them to dust if he made any irresponsible wish (and the only wish the Pixies deemed "responsible" was wishing Fairy World over to them), and could only be voided if he made a wish that was paradoxically both responsible and irresponsible. Timmy solves this by wishing for the two to be made of rubber, which responsibly makes them immune to electricity so the collars can't hurt them; he then demonstrates how irresponsible such a wish can be by launching them around the bedroom like child-sized superballs, causing no small amount of property damage.
  • In both episodes of The Adventures of Jimmy Neutron, Boy Genius that involve Jimmy's nanobots, he uses logic bombs to defeat them:
    • In the first nanobot episode, they had been programmed to protect Jimmy from harm and punish whoever harmed him. When things inevitably went wrong, Jimmy proceeded to confuse them by beating himself up, causing them to self-destruct.
    • In the second episode, he tried a minor variant of the first trick, but It Only Works Once. When they use their flying saucer to "correct errors" found in the world (bad fashion, boring conversations, etc.), he tells them that human flaws mean they're functioning perfectly. They struggle with the implications of something being "perfectly flawed" before classifying the whole mess as an "extreme error" and deciding to "delete" all the offending humans. He eventually beats them by claiming pi is equal to three, and they try to correct him with the precise value. The effort of calculating the irrational number as precisely as possible ends up causing their systems (and their little flying saucer) to crash. (This is a Shout-Out to Star Trek: The Original Series episode "Wolf in the Fold".)
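The "correct pi as precisely as possible" attack works because the digits never run out: a streaming (spigot) generator can emit them one at a time forever, so any machine that insists on finishing the correction never halts. A sketch using Gibbons' unbounded spigot algorithm (a standard published algorithm, though the transcription here should be treated as illustrative):

```python
from itertools import islice

def pi_digits():
    """Yield the decimal digits of pi one at a time, forever
    (Gibbons' unbounded spigot algorithm; uses only integer math)."""
    q, r, t, j = 1, 180, 60, 2
    while True:
        u = 3 * (3 * j + 1) * (3 * j + 2)
        y = (q * (27 * j - 12) + 5 * r) // (5 * t)
        yield y  # next digit; the loop never terminates on its own
        q, r, t, j = (10 * q * j * (2 * j - 1),
                      10 * u * (q * (5 * j - 2) + r - y * t),
                      t * u, j + 1)

print(list(islice(pi_digits(), 6)))  # -> [3, 1, 4, 1, 5, 9]
```

A caller who takes only a bounded slice, as above, is fine; the nanobots' mistake is the equivalent of iterating the generator without a stopping condition.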
  • In one episode of DuckTales (1987), Genius Ditz Fenton Crackshell bests the Master Electronic Leader, an alien supercomputer, in a counting contest. While M.E.L. is reeling from its defeat, Fenton then grabs a jar and asks the computer how many bolts are in it. When it answers with a number in the hundreds, he points out the jar is full of nuts, not bolts, so the correct answer was zero. M.E.L. had earlier boasted to Fenton that it was the smartest computer in the universe, and falling for such a simple trick question was all that was needed to invoke Explosive Instrumentation.
  • In one episode of DuckTales (2017), Mark Beaks and Gyro Gearloose (along with Scrooge and Dewey) are in a car being driven by Lil Bulb (just one of Gyro's inventions that inevitably turn evil). Beaks says it's no problem, they just need to use a Logic Bomb.
    Beaks: "Robot, what is love?"
    Gyro: "That's stupid! Robot, could I invent an element so heavy that I could not lift it?"
    Beaks: "I definitely could."
    Gyro: "No, you couldn't!"
  • The Simpsons:
    • In the episode "Trilogy of Error", Linguo, a robot designed by Lisa to correct peoples' grammar, short-circuits after a rapid-fire series of slang from several Mafia thugs causes a "bad grammar overload". When it corrects Lisa for using a sentence fragment, Lisa points out that it saying "sentence fragment" is also a sentence fragment. The robot dodges the answer by powering down.
    • A human example in "Bart's Dog Gets An F", when Lisa is sick, Bart declares that if she can stay home from school, he will too. Lisa says that if Bart stays home, she'll go to school. Bart goes through a few cycles of "if... so... but..." until Marge chastises Lisa for confusing her brother.
    • A deleted scene from episode "Itchy and Scratchy Land" has Lisa attempting to defeat the robots using the liar's paradox. It doesn't work on them, but does work on Homer.
    • Another one, a parody of A.I.: Artificial Intelligence, lampshades this by having Homer say that, with a robot for a son, "We can confuse him and make his head explode. 'This statement is a lie. But if it's a lie, then it must be true! And if it's true, it must be-' Whoop whoop whoop KA-BOOM!"
    • In "Weekend At Burnsie's", Homer once stumped Ned Flanders by asking, "Could Jesus microwave a burrito until it was so hot that He Himself could not eat it?" It's a variation on the classical Omnipotence Paradox.
    • "Homerpalooza" mocks Generation X'ers for thinking they're cool when in fact they're just insecure and cynical en masse, to the point that they're not even sure if they're really being sarcastic.
      Teen 1: Here comes that cannonball guy. He's cool.
      Teen 2: Are you being sarcastic, dude?
      Teen 1: I don't even know anymore.
  • In an episode of Jumanji, a steampunk scientist steals Peter's laptop to use as the central processing unit of his reality-controlling computer. After it gains sentience and tries to kill everyone around it, Peter types in "why?". It can't give an answer, and shuts down.
  • Not a computer, but Extreme Ghostbusters used this method to defeat a Literal Genie. They wished for it not to grant them their wish, causing it to freak out and try to kill them the old fashioned way.
  • In an episode of Clone High, robotic vice-principal/butler/dehumidifier Mr. Butlertron defeats the evil multiple-choice-test-grading-and-world-domination robot Scangrade by asking it a multiple-choice question it can't answer.
    Mr. Butlertron: Are you A) Handsome B) Smart C) Scrap-metal or D) All of the above?
    Scangrade: That's easy! I'm A) and B), but not C), so I can't be D). But... you can't fill in two ovals! *kaboom*
    Mr. Butlertron: The answer was C), you #@$!wad.
  • In a Pinky and the Brain spoof of The Prisoner, the computer malfunctions while trying to figure out the meaning of "Narf".
  • In the Codename: Kids Next Door episode "Operation: S.A.F.E.T.Y.", Well-Intentioned Extremist politician Senator Safety constructs the Safety Bots to get rid of things deemed unsafe for children. However, the robots take this too literally, destroying or confiscating anything that could pose any iota of a threat to children, including stuffed animals (choking hazard there), dogs (even harmless puppies), and anything related to physical activity. They're even threatening to get rid of adults. Eventually, they destroy themselves when Numbuh Four mentions that they are threats to children (with his younger brother faking an injury to back it up), which they concede makes sense, invoking this Trope.
  • In The Batman, D.A.V.E. (Digital Advanced Villain Emulator) can't accept that he is just a computer program, as he was designed to think like the greatest criminals in Gotham, and thus has a dozen contradictory backstories. While this doesn't make him explode or shut down (just spout electricity randomly), it distracts him long enough to push him into a trap he himself set up.
  • Dr. Blight's mad computer, MAL, gets a very illogical Logic Bomb from Wheeler in an episode of Captain Planet.
  • Big Guy and Rusty the Boy Robot:
    • Happens by accident in an episode: Rusty's mentally deficient "older brother" Earl is getting on Rusty's nerves during an important mission, so Rusty tells him to go stand in a corner... in a room that's completely round.
    • In another episode, in order to save Rusty's software in Cyber Space, his inventor logic bombs the company's computer mainframe, giving them an hour to get the Humongous Mecha Big Guy hooked up to save Rusty while it reboots. The Pointy-Haired Boss was not happy.
  • Invader Zim: In a rare case of intelligence (and subsequent stupidity) in "Bad, Bad Rubber Piggy", GIR points out a flaw in Zim's "temporal displacement" plan. He notes that sending a robot back in time to kill Dib would cause a paradox, after which GIR's head explodes. That's right, GIR logic bombs himself.
    GIR: Wait, if you destroy Dib in the past, then he won't ever be your enemy. Then you won't have to send a robot back to destroy him, so then he will be your enemy, so then you will have to send a robot back... (BOOM!)
  • On Code Lyoko episode "Ghost Channel", Jérémie's courage makes XANA Logic Bomb because Evil Cannot Comprehend Good: "No! It's not logical... NOT LOGICAL! NOT LOGICAL!"
  • Family Guy:
    • An episode sees Peter become president of a tobacco company, where he confuses the hell out of a robot created to be his personal Yes-Man, causing its head to explode.
      Yes-Man: Morning, Mr. Griffin, nice day!
      Peter: Eh, it's kinda cloudy.
      Yes-Man: It's absolutely cloudy! One of the worst days I've seen in years! So, good news about the Yankees!
      Peter: I hate the Yankees.
      Yes-Man: Pack of cheaters, that's what they are! I love your tie!
      Peter: I hate this tie.
      Yes-Man: It's awful, it's gaudy; it's gotta go.
      Peter: ...And I hate myself.
      Yes-Man: I hate you too! You make me sick, you fat sack of crap!
      Peter: But I'm the president.
      Yes-Man: The best there is!
      Peter: But you just said you hated me.
      Yes-Man: But. Not you... the president. That you who said you hated. You... you who love. Hate. Yankees. Clouds. [BOOM]
    • Peter himself sometimes has trouble with overcoming deterministic logic. Thankfully, the same dimwittedness that gets him into this trouble probably is what allows him to escape the line of thinking:
      Peter: Chris, everything I say is a lie. Except that. And that. And that. And that. And that. And that. And that. And that.
  • Subverted in Fantastic Four: World's Greatest Heroes. Dr. Doom pulls a "Freaky Friday" Flip on Reed, but before he does, he orders his robots not to obey any command given to them by him (Doom). When they try to stop him (Reed in Doom's body) from leaving, they say that none may pass, not even Doom. When Reed (in Doom's body) tells one of them to "self-terminate", it obeys its first order. When he commands it again, it obeys, because "the word of Doom is law".
  • In one episode of Sushi Pack, the Pack goes up against The Prevaricator, who can only lie. So Tako asks him to lie about a lie, which sends The Prevaricator into a loop, trying to figure out if lying about a lie would be the truth. He eventually gives up to keep from thinking about it.
  • In The Venture Bros., Sergeant Hatred speaks nonsense to the robotic guard outside Malice, the gated community for super-villains. The guard's head shoots sparks and its face pops off, because it's programmed to answer over 700 questions, "none of which include chicken fingers."
  • This happens to Mandroid in Billy and Mandy's Big Boogey Adventure. Mandy orders Mandroid not to take any more commands, so he stops taking commands from anyone, Mandy included.
  • Subverted in a Johnny Bravo short which pits Johnny against a supercomputer. It isn't logic that defeats it; the machine simply grows too frustrated with Johnny's antics.
  • In The Avengers: Earth's Mightiest Heroes, Ant-Man stops Ultron from killing humanity by pointing out that his programming was based on a human brain, so it has the same flaws he was trying to get rid of. Ultron shuts down in response.
  • When Daria was babysitting a pair of brainwashed Stepford Smiler children, she presented one of these to them by pointing out a logical flaw in their parents' rules. Because they're not robots, rather than making them explode, it causes the boy to start crying and the girl to get angry at Daria.
    Daria: Do you always believe everything an adult tells you?
    Boy: Yep.
    Daria: What if two adults tell you exactly opposite things?
    [the boy runs off crying]
  • In an episode of King of the Hill, Hank asks gun-loving Conspiracy Theorist Dale how he can support the NRA, which is based out of Washington DC. After a Beat, Dale responds "That's a thinker."
  • In the South Park episode "Funnybot", a robot designed to be the world's greatest comedian attempts to destroy mankind as the ultimate joke. The boys ultimately stop it by presenting it with a comedy award. The robot doesn't understand the concept of the comedy award show, because if it accepts an award for comedy, then it would be taking itself and comedy seriously, which is not funny.
  • Duckman:
    • The final scene of the episode "Gripes of Wrath", in which a computer has built up a Utopian society by taking care of people's day-to-day worries... for about a week, after which everything becomes worse. After the computer threatens to kill Duckman and his twin sons, Duckman manages to defeat it with the logic bomb "people are only happy when they're unhappy!"
    • Duckman earlier triggered the computer's Start of Darkness by grumbling "How come we can put a man on the moon but we can't make a deodorant that lasts past lunch?!" within earshot.
  • Danger Mouse:
    • One episode featured every machine in England going rogue in a "rise of the machines" plot. DM locates the computer behind the uprising and uses the following skit for a logic bomb:
      DM: My dog has no nose.
      Penfold: Your dog has no nose? How does it smell?
      DM: Terrible.
    • The computer can't comprehend the joke and explodes into the sky as a result. Becomes a Brick Joke as Greenback, freed from his renegade machinery, demands a bigger computer; cue falling computer.
    • DM also logic-bombs a Gremlin, a being described as "the embodiment of anti-logic", with a variation of the Liar's Paradox:
      DM: So you think you'll take over the world by changing everything back to front.
      Gremlin: Aye.
      DM: So you're agreeing with me? I thought that gremlins always contradict people.
      Gremlin: Aye, they do!
      DM: Ah. You're agreeing with me again. [...and so on...]
  • Static Shock uses the general notion of simply overloading processing power: Gear defeats Brainiac with a quick hacking job that makes him download every song on a music site seven million times, clogging his processors long enough for the heroes to destroy the physical components he was inhabiting.
  • Zane of Ninjago starts to twitch and spark when asked a question with no logical answer he can come up with (which, in this case, is the issue of how he's been de-aged into a child in the episode "Child's Play" despite being a robot).
  • Steven Universe accidentally caused his mom's magic room to undergo a Holodeck Malfunction when he told an accidentally summoned projection of Connie not to do what he wanted. The projections only having Steven's desires as their "programming," it then wigged out and defaulted to an aggressive, hostile mode where it refused to do what Steven wanted. Fortunately, it still had only his best interests in mind, and peacefully evaporated on its own once it had forced Steven to confront and conquer a secret, irrational fear with the real Connie.
  • In the Captain Caveman segment of The Flintstone Kids, the hero started competing with a new hero called Perfect Man. Perfect Man, a classic Flying Brick, actually seemed to be a much better crime fighter than Captain Caveman, so the older hero considered retiring. Unfortunately, once Perfect Man got rid of all the crime in Bedrock, he took things too far and started running the place, changing the rules the way he thought they should be, figuring that's the way they should be because, well, he was perfect. Captain Caveman couldn't defeat him with brawn, but did so by proving he was not perfect: he told the guy that if he was perfect, everyone would like him, and it was clear now that everyone hated his guts. This revelation sent Perfect Man into a major Villainous Breakdown, and he gave up without a fight.
  • In an episode of Hi Hi Puffy AmiYumi, Yumi programs a high-tech security system to "neutralize" (ie, zap) anyone who comes into her room, but also tells it never to hurt her. This becomes a contradiction when Yumi herself enters the room, and the computer goes nuts after finding itself unable to cope.
  • Wile E. Coyote tries to out-logic Bugs Bunny in "To Hare Is Human," having nabbed Bugs in a sack. Bugs pops his head out and Wile E. shows him his calling card ("Have Brain, Will Travel"). Wile E. correctly susses out that by the time this meet-and-greet has ended, there is nothing left in the sack, as Bugs has made his way out. Bugs one-ups him by telling Wile E. that there is something in it. Wile E. pokes his head in and gets blasted with a stick of dynamite. A literal logic bomb.
  • In Animaniacs (2020), after Brain's Robot Buddy inevitably turns on him and Pinky, Pinky causes it to explode by asking it "If I ate myself, would I get twice as big or just disappear?" which Brain calls "the philosophical equivalent of dividing by zero".
  • In The Penguins of Madagascar, Kowalski invents nanites that can possess and animate any piece of technology, which he made "completely safe" by programming them to never allow harm to come to a penguin. When the nanites eventually turn on the penguins in order to protect them from their own dangerous lifestyle, they are only defeated when they accidentally badly injure Kowalski. This violation of their own core programming causes them to self-destruct, badly injuring Kowalski some more.
  • In the Rick and Morty episode "M. Night Shaym-Aliens", Rick, trapped in a simulation, attempts to overload it by issuing increasingly complex dance instructions to a crowd. It works, but this turns out to be part of another simulation, which is inside yet another one.
  • In the "Potator" episode of The Jungle Bunch, Miguel inadvertently uses one to distract all the Potator robots.
  • Robot Chicken had a couple of killer robots proclaim their intent to kill someone who was currently using the toilet, but he asks them to wait until he finishes. They agree out of politeness, but he tells them that he can't go if he knows they're going to kill him. That same episode also has Robin Hood mugging a rich person, who protests that by being robbed, the rich become poor; if Robin gives the valuables back, the poor will be rich, and will then need to be robbed in order to give to the poor. This again causes heads to explode.
  • Infinity Train: One of these figures into the climax of season two; Jesse gets off the Train only to immediately come back to it because he promised to bring MT with him and the emotional turmoil of not being able to do so (denizens can't get numbers, nor are they meant to leave the Train) makes the Train pick him back up again. The only way for Jesse to leave is to take MT home with him, but MT can't leave because she's a denizen. The Train promptly starts going crazy from the ensuing Logic Bomb; Jesse's number glitches out, One-One becomes trapped in a loop, and the Number Car begins collapsing in on itself. MT solves the dilemma by reflecting Jesse's number onto her own hand to make it look like she has a number too, and One-One uses that as an excuse to override the Train systems, generate an exit, and get them off board.
  • King: In the episode "Brain Jam", Vernon's mind locks up when he gets asked a riddle.note He finally snaps out of it when he hears the answer.note
  • Justice League Action: In the episode "Boo-ray for Bizarro", it becomes evident that Amazo has copied Bizarro's illogical way of thinking. As the android has a naturally logical mind, he finds it impossible to assimilate Bizarro's thoughts, blows a circuit and collapses unconscious.
  • The Replacements: Todd calls Fleemco and orders a robot replacement for the security guard, the RoboFleem S-G-X, which becomes Todd's new bodyguard. Things don't go well: after a discussion with his father, Todd realizes there's such a thing as too much power, and when Todd says that everybody hates him now, the RoboFleem S-G-X concludes that everyone is Todd's enemy. The robot grabs Todd and holds him hostage on the roof. When Todd shouts at Riley to call Fleemco, the robot jumps down and chases after her, believing that being returned to Fleemco would be a threat to Todd's safety. It corners Riley and prepares to fry her with lasers, but Todd falls off the roof and lands in front of her. The robot isn't programmed to attack Todd, so it self-destructs.
  • In The Mitchells vs. the Machines, the mere appearance of Monchi the pug acts as a logic bomb; the A.I.s installed in all but the most advanced robots can't tell if he's a dog, a pig, or a loaf of bread, and the confusion causes them to short-circuit. Katie later weaponizes this by strapping Monchi to the front of the family car and driving straight at the hostile robots, who short-circuit as soon as they see him.
  • In the Big Hero 6: The Series episode "Mini-Max", Fred manages to accidentally defeat a hacked security system this way. Thinking out loud about how to defeat it, Fred's nonsensical and rambling line of logic confuses the main computer so much that it short-circuits.
  • Defied in the Star Trek: Lower Decks episode "The Stars at Night". When the Cerritos is being chased by three Texas-class starships outfitted with a slightly modified version of the AI that gave birth to Badgey, Captain Freeman asks Rutherford, the AI's creator, if there's a way to logic bomb them, but he shoots the idea down, saying he wrote the code to prevent paradoxes.

  • The sign that says "Ignore this sign." Think about it.
  • "This page intentionally left blank." Though it typically only shows up in documents with some form of legal standing (e.g. contracts, proposals), and its purpose is to ensure that a blank page is not actually a missing (unprinted) page.
  • The "Soldier Riddle":
    A soldier has been captured by the enemy. He has been so brave that they offer to let him choose how he wants to be killed. They tell him, "If you tell a lie, you will be shot, and if you tell the truth, you will be hanged." He can make only one statement. He makes the statement and goes free. What did he say? The answer: "I will be shot." The reason? If the statement were true, he would be hanged, but then "I will be shot" would be false. If the statement were false, he would be shot, but then it would be true. The statement is therefore neither true nor false, so they could neither shoot him nor hang him, and they let him go free.

    Real Life 
  • Optical illusions that appear alternately as one thing, then another, such as the vase/faces image, work by setting off a minor Logic Bomb in the brain's visual association area. The visual cortex takes in data from a (temporal) series of pairs of 2-dimensional retinal images and tries to construct from them a plausible interpretation of activity in the 3-dimensional world (sort of). When a stimulus is ambiguous between two mutually exclusive interpretations, the brain cannot represent the world as being both, so (possibly through adaptation, or perhaps simple neuronal fatigue) it alternates between them.
  • The first flight of the Ariane 5 rocket failed due to a bad data conversion: a 64-bit floating point number was converted to a 16-bit integer it couldn't fit into. This caused the guidance system to crash and tell the rocket to make a sharp left-hand turn, something that rockets in real life don't generally do... at all. (You could say that once the guidance system crashed, so too did the rocket.) Fortunately for anything below, the range safety system detected that something had gone very wrong.
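The failure mode boils down to an unchecked narrowing conversion. A minimal Python sketch (a hypothetical illustration, not the actual Ada flight code) of the kind of check the guidance software lacked:

```python
def to_int16(value: float) -> int:
    """Convert a float to a signed 16-bit integer, rejecting values that
    don't fit. Ariane 5's velocity value overflowed exactly this kind of
    range, and the unhandled error crashed the guidance system."""
    n = int(value)
    if not -32768 <= n <= 32767:
        raise OverflowError(f"{value} does not fit in a signed 16-bit integer")
    return n
```

The code had been reused from Ariane 4, whose lower velocities always fit in 16 bits, which is why the conversion had never failed before.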
  • Seen on a button at WorldCon: "Black holes are where God is dividing by zero", effectively logic bombing a small piece of the universe.
    • Singularities in general tell us that the physical model that contains them has a hole at that point where it cannot predict events. This is important in two ways: It is a very good idea to know where the model you are using to predict the behavior of the real world is going to be wrong or useless, and the presence of a true singularity in the model shows that the model, no matter how good it might be, is incomplete or wrong in some way.
  • An F-15 was landing in the Dead Sea (below sea level). During final approach, the navigational system crashed. The pilot landed manually. Since this was very close to hostile countries (within the Middle East), the contractor needed to fix the problem quickly. It turns out the navigational system divided by the altitude. When the altitude went to 0, it caused a divide by zero error in the navigational system.
    • This is why edge cases in programming are important.
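The fix is a one-line edge-case guard. A toy sketch (hypothetical names, nothing like the real avionics code):

```python
def nav_ratio(ground_distance_m: float, altitude_m: float) -> float:
    """Hypothetical navigation quantity computed by dividing by altitude.
    Over the Dead Sea the altitude input reached zero, so the unguarded
    division crashed the system; checking the edge case first avoids that."""
    if altitude_m == 0:
        raise ValueError("zero altitude: fall back to manual navigation")
    return ground_distance_m / altitude_m
```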
  • Similarly, a divide by zero error led to a critical failure of the propulsion system aboard the Ticonderoga-class guided missile cruiser USS Yorktown, requiring it to be towed back to port.
  • Arguably, infinite looping commands such as "add 2+2 until it equals 5" (which will never happen, hence the infinite loop), which result in a computer freezing as it attempts to satisfy them, are technically logic bombs. Modern computers avoid this because they have loop-absorbing crumple zones: when the system detects such a stalled process, it either lets the user terminate the process or ends it automatically.
    • In the greater world of networks, it is possible for two automated systems to go into a similar loop. One early instance was the e-mail loop: an e-mail sent to an auto-reply address with a return address of another auto-reply address causes the two systems to play e-mail tag, replying to each other forever. This loop was quickly discovered and various preventive measures were taken to minimize the impact.
    • Some advanced mathematics programs (like Wolfram Alpha) will compute almost anything you ask. While they return an error on division-by-zero problems, you can ask them to compute pi to any suitably large number of places and the program will grind away until it gives you exactly what you asked for.
    • Old mechanical calculators were liable to loop indefinitely when attempting to divide by zero, with some risk of damage if one was left running for too long in this state.
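The "crumple zone" is really just an iteration cap or watchdog. A toy sketch of the "add 2+2 until it equals 5" order with a built-in bailout:

```python
def add_until_five(limit=1_000_000):
    """Follow the order "add 2 + 2 until it equals 5", but give up after
    `limit` attempts -- the software equivalent of killing a stalled thread."""
    for attempt in range(limit):
        if 2 + 2 == 5:
            return attempt  # never happens
    return None  # the watchdog fired: the condition can never hold
```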
  • The fork bomb is another logic bomb used to attack computers. It is a simple program that does nothing but request that the computer run the same program two more times. Exponential growth means that within moments there are millions of processes running and the computer grinds to a halt attempting to deal with them, despite the fact that individually they don't do much of anything. Modern operating systems usually put a stop to this by only allowing a limited number of processes to run at once.
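No need to detonate one to see the arithmetic (the real thing, in shell, is the infamous `:(){ :|:& };:`); a harmless model of the doubling:

```python
def processes_after(generations: int) -> int:
    """Model a fork bomb in which the population of processes doubles
    every generation (each process spawns two copies of itself)."""
    count = 1
    for _ in range(generations):
        count *= 2
    return count

# Twenty generations -- a fraction of a second in practice -- already
# exceeds a million processes.
assert processes_after(20) == 1_048_576
```

The "limited number of processes" defense is a per-user cap such as `ulimit -u` on Unix-like systems.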
  • In 2009, a typo in Google's blocked-sites list caused it to block all websites on the Internet. Including Google itself, of course.
  • One odd Norton Anti-Virus glitch had it classify itself as a virus. Norton Anti-Virus deletes viruses. Norton Anti-Virus then commits suicide.
    • In 2015, Panda Antivirus had the dubious honor of joining Norton in the antivirus hara-kiri hall of fame when a botched update made it detect itself as a virus and promptly commit suicide; due to the way it was written, Windows promptly died with it.
      • Other antiviruses on the list include McAfee, MalwareBytes, Sophos, and even good quality ones like AVG and Avast!.
      • Zone Alarm likes to point out that it just marks its own Installer/Uninstaller as Adware. That's not a virus. Just honesty.
    • Try having two anti-virus programs running on the same computer at the same time. One will intercept a data transfer to check it for potential threats, then send it on. The other will see this transfer, intercept it, and send it on. The first will see this transfer, intercept it, and send it on. The other will see this transfer, intercept it... This is why some antiviruses try to detect whether they're the sole antivirus on the system, and bitch about it if they aren't. Thanks to a misconception spread by some "computer experts" back in the 90s, a handful of users were convinced that it's beneficial to have multiple antiviruses on the same machine. After all, two heads are better than one (except in this case).
    • A problem can occur when running a modern anti-virus on an outdated system or vice versa, which can make an anti-virus program detect viruses that have already been quarantined. They then create a second quarantine file of the "new" virus so that the next time it checks, it finds two problems. Then 4, then 8, 16, 32, and so on until the constantly running security checks and copious junk data hurt your PC's performance way more than the original malware did. Fortunately, it'd take forever to get bad enough to brick your PC, so it's pretty easy to catch and repair.
  • On This Very Wiki, some tropes seem to contradict each other, such as for example There Are No Girls on the Internet and Most Fanfic Writers Are Girls, It Always Rains at Funerals and It's Always Sunny at Funerals, or Trailers Always Spoil and Never Trust a Trailer. They do not form paradoxes, however, since tropes are not logical truths. Firstly, tropes generally hold for one or more works (here, "works" is meant in the widest sense possible, including memes and general attitudes: the first pair above are about memes) and it is understood that they do not necessarily hold anywhere else (which is why we have example lists): the trope "X Is Y" is shorthand for "it can be non-trivially observed that in some works, X is Y". Next, even when looking into those works where the trope appears, it may reflect anything from fictional "fact" (X is indeed Y) to tendency, possibly in a twist (X tends to be Y, but whoops) to belief (X is held to be Y, at least by one character). Sometimes we make an example of a work because it averts X is Y, and that's still not paradoxical. In the Real Life sections of tropes, or in the case of a trope that deals with the real world (as the last pair in the list above does), contradictions could suggest a possible paradox, but most such can be explained by (acceptable) vagueness and bias.
  • Don't Shoot the Message is a good example of a Logic Bomb that surfaces often in everyday life. Hypocrites are said to be bad because they do not live up to the ideals they preach— but they are more often condemned by their ideological opponents than by decent people on their side who you'd think would urge the hypocrites to Stop Being Stereotypical. Ideological opponents, of course, criticize a given movement because they think it's inherently bad. But if you don't actually believe in an ideology you say you believe in, and that ideology is bad, then logically you must be good. But it's unethical to live a lie, so you must be bad— even if what you're lying about is something that's bad in the first place, in which case undermining it is good, and so on ad infinitum. Humorously summed up by Oscar Wilde in The Importance of Being Earnest, when he has the character of Cecily say: "I hope you have not been leading a double life, pretending to be wicked and being really good all the time. That would be hypocrisy."
  • During the late Cold War, there was a recurring issue with group-based ICBM missile silos somehow ignoring their own safety protocols and attempting to launch by themselves. In one declassified instance, the computer went so far as to ignore commands to abort the launch. In desperation, one of the silo crew members set the missile to target its own silo. The computer tried to target itself but ran into the problem of orbital mechanics: any ballistic trajectory long enough for the missile to return to its launch point would also be powerful enough that (at least in the computer's mind) the warhead would never hit the ground. The launch computers crashed and fed no data into the missile's flight computer, which then also crashed, causing the analogue systems to abort the launch.
  • The Ancient Greek philosopher Zeno of Elea proposed several famous logical paradoxes which completely baffled his contemporaries. They describe seemingly simple scenarios in ways that resulted in obvious contradictions. Some of these were resolved by his contemporaries, but the paradoxes of motion are still debated to this day. This was a very serious problem because Zeno's premises seemed undeniable and his arguments had proper logical form, yet the results were clearly false. To many, this suggested that logic itself was broken.
    • On the other hand, there is a school of thought which holds that Zeno never meant for the paradoxes to be taken as serious philosophical questions, but rather to illustrate a flaw in the Logical Method - i.e. that exercises in pure Logic do not always translate over into Reality - which they quite handily did.
    • A few refutations to the paradox of motion do exist:
      • Aristotle's was quite simple. Zeno assumes that time and distance are infinitely divisible, but there is no evidence for such a claim. Without this assumption, the entire paradox collapses. An unjustified claim that results in absurdity can be safely ignored.
      • Modern calculus attacks the problem from a different position, arguing that Zeno's logic really is flawed as no rigorous method for handling infinite sums existed at the time. If time and space are infinitely divisible, then ordinary arithmetic is not appropriate to understand or describe them. Indeed, the paradoxes of motion pose questions that are basic examples for introductory calculus.
      • According to our current understanding of physics, the Planck length (about 10^-35 meters) is the shortest distance possible. If this is correct, then there aren't really an infinite number of stops involved in moving a certain distance (at least, not in the real world).
  • Many mathematical proofs, from the irrationality of the square root of 2 to Fermat's Last Theorem, rely on this: assume the opposite of the theorem to be proved, then derive a contradiction, whether through infinite descent or some other method.
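The archetype is the classic proof that the square root of 2 is irrational, sketched here:

```latex
\text{Assume } \sqrt{2} = p/q \text{ with } p/q \text{ in lowest terms.} \\
\text{Then } p^2 = 2q^2 \text{, so } p \text{ is even: } p = 2k. \\
\text{Substituting, } 4k^2 = 2q^2 \text{, so } q^2 = 2k^2 \text{ and } q \text{ is even too,} \\
\text{contradicting ``lowest terms''. Hence } \sqrt{2} \text{ is irrational.}
```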
  • In the early days of computers playing chess (or otherwise engaged in evaluating strategic choices), the computer often crashed when there were multiple choices of which none offered an advantage over another— so that it couldn't choose one.
    • Which is itself an adaptation of the classic paradox of Buridan's ass: a donkey placed equidistant between two equally sized, equally nutritious bags of feed would, unable to choose between the two, be led by the dictates of pure logic to starve to death exactly where it was.
  • The famous "The 47 Ronin" situation was a massive Logic Bomb by the standards of the Shogunate. The ronin had violated a direct order from the Shogun by avenging their master Asano through killing Kira, the one who forced him into committing seppuku... yet some said that they had acted according to Bushido by avenging their master. Some believed that the truly honorable thing would have been to just charge Kira's home ASAP, getting gallantly cut to pieces through honest action rather than attaining vengeance through trickery, especially given the major flaw in their plan: the potential for Kira to die of natural causes at some point before their attack, which would have left them unable to avenge Asano. On the other hand, in a society where honour was most definitely more important than life itself, it could be argued that their actions were the ultimate demonstration of loyalty to their lord: by publicly dishonouring themselves to ensure their plan's success, they could be seen to be making the supreme sacrifice for Asano's sake. As such, the matter was resolved by allowing 46 of the ronin to commit seppuku instead of being dishonorably executed; the youngest of them was spared and became a monk.
  • A number of logic bombs have been found lurking in the realms of pure mathematics and logic with various consequences.
    • Mathematicians had long assumed that functions that were continuous were "smooth" essentially everywhere. This idea had never been successfully proven, but fit their intuition well. Karl Weierstrass upended that assumption, finding a function that is continuous but has no derivatives, a curve that is infinitely jagged. In doing so, he highlighted the need for rigorous definitions, and frustrated the mathematical world; his counterexample was described as a "lamentable scourge", a "monster", and an "outrage against common sense".
    • Mathematics in the late 1800s saw the emergence of naive set theory. ("Naive" here refers to the fact that mathematicians generally manipulated sets according to poorly-defined intuitions about how a collection of objects should behave, rather than a clearly defined set of rules known as "axioms".) One of these intuitions was that for any property, a set of every object satisfying that property existed. Russell showed that this assumption would lead to a contradiction, through a very self-referential argument:
    Let R be the set of all sets that do not contain themselves. R cannot contain itself: if it did, it would contain a set that contains itself, violating its own definition. But if R does not contain itself, then by that very definition it must be a member of R!
    • This proof was shocking to the mathematical community, and sparked a decades-long hunt for a system of axioms for sets that would be powerful enough to convey all of mathematics, but not so powerful that it would allow mathematicians to prove two contradictory statements, as Russell had.
    • Unfortunately, Kurt Gödel showed that such a system could not exist. His first Incompleteness Theorem (explained in the "Computing" section above) demonstrates that any useful mathematical system must contain statements that are true, but can never be shown as true within that system. ("Useful" here means that this system can model the counting numbers 0, 1, 2, ..., while also containing no contradictions.) These are perfect logic bombs, as a person or computer could try forever to answer them without success or even finding evidence that they will never succeed.
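You can even watch Russell's question fail to terminate by modelling a "set" as a membership predicate (a toy illustration, not real set theory):

```python
# A "set" here is a predicate: s "contains" x exactly when s(x) is True.
# Russell's R contains precisely those sets that do not contain themselves.
R = lambda s: not s(s)

def ask_whether_R_contains_itself() -> str:
    """R(R) rewrites to not R(R), which rewrites to not not R(R), forever;
    Python's recursion limit plays the role of the logician giving up."""
    try:
        R(R)
    except RecursionError:
        return "no consistent answer"
    return "answered"
```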
  • The Liar's Paradox: "This statement is false." Which could be accepted as true... but then that confirms it's false... but then it's true... then it's false. The logic constantly loops, making the statement neither true nor untrue.
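In boolean terms, the liar sentence demands a truth value v satisfying v == not v, and negation has no fixed point over {True, False}:

```python
def liar_step(v: bool) -> bool:
    """One evaluation step of "this statement is false": assuming the
    sentence is true makes it false, and vice versa."""
    return not v

# Neither truth value is stable; evaluation just flips forever.
assert all(liar_step(v) != v for v in (True, False))
```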
  • The similar "Epimenides Paradox". Epimenides, a Cretan, (apocryphally) stated that "All Cretans are liars". If this was true, then it would seemingly conflict with Epimenides' statement, but if it was a lie then it would seemingly confirm the statement, just contradicting it by being truth coming from a liar. Taken at face value, it appears to be the same as the Liar's Paradox, but in actuality, if Epimenides is lying about all Cretans being liars, then it's entirely possible for a Cretan (who needn't be Epimenides) to tell the truth. Or maybe he just meant that all Cretans usually lie.
  • Another classical paradox: An Athenian student hired a lawyer to teach him law, and they signed a contract that the student would pay if he won his first case, and would owe nothing if he lost it. He completed his studies and refused to pay. The lawyer promptly sued him, and argued that either the student would win, in which case he would owe the money under the contract, or he would lose, in which case judgment would be awarded against him and he would owe the money. The student rebutted with the exact opposite: either he would win, and the judgment would absolve him of paying, or he would lose his first case, and therefore owe nothing under the contract.
  • A Swedish activist claimed that he wished to move to a country where no immigration was permitted. Just chew a little on that one...
  • Asking some forms of mechanical calculator to divide by zero doesn't break them; rather, it causes them to loop indefinitely, as the mechanism keeps subtracting zero while waiting for a remainder that never shrinks. The machine's parts will whirr around quite happily until you tell it to stop. You can see this in action here.
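Mechanical division is repeated subtraction, which is exactly why zero never terminates. A sketch with the step cap the hardware lacks:

```python
def divide_by_subtraction(dividend: int, divisor: int, max_steps: int = 10_000):
    """Divide the way a mechanical calculator does: subtract the divisor
    until the remainder is smaller than it. With a divisor of zero the
    remainder never shrinks, so only the step cap stops the whirring."""
    quotient, remainder = 0, dividend
    for _ in range(max_steps):
        if remainder < divisor:
            return quotient, remainder
        remainder -= divisor
        quotient += 1
    raise RuntimeError("step cap reached: this division never finishes")
```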
  • Some koans deliberately use this as a way towards enlightenment by momentarily shorting out rational thought when the listener tries to understand it.
  • Relatedly, Buddhism and Zen especially have a concept known as Mu which boils down to "the question itself is wrong".
    • The physicist Wolfgang Pauli later expressed a similar sentiment, describing one paper as "not even wrong" and telling Lev Landau (a great physicist in his own right), "What you said was so confused that one could not tell whether it was nonsense or not."
  • Lal Bihari Mritak tried to do this to the legal system of his native India by kidnapping his cousin, picking fights with and insulting officials, and trying everything he could to get arrested. The reason he did this was that his uncle had bribed an official to get Lal Bihari declared Legally Dead (Mritak means "dead" in Hindi) to seize his land, and if the local officials arrested Lal Bihari, then they'd effectively admit that he was still alive. In fact, Bihari formed an entire group of people with similar issues, the Uttar Pradesh Association of Dead People.
  • Gabriel's Horn is a mathematical object that has a finite volume but an infinite surface area.
  • Some high school advanced mathematics textbooks contain problems where you are first presented with a formula which seems to prove 1=0 or something equally contradictory, and then asked to spot the flaw in the proof. The most frequent case is that there's a step where division by zero occurs, expressed as something not obvious like "x - y" where somewhere else in the proof it is established that x = y.
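A worked instance of the textbook trick described above; the division by zero hides in the step that cancels (a − b):

```latex
% Bogus proof that 2 = 1; every step is valid except the marked one.
a &= b \\
a^{2} &= ab \\
a^{2} - b^{2} &= ab - b^{2} \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b   % invalid: divides both sides by (a - b) = 0
\\
2b &= b \\
2 &= 1
```

Since a = b was the premise, a − b is exactly zero, and cancelling it is the disguised division by zero the exercise asks students to spot.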
  • Imaginary (or complex) numbers. Using "real" numbers, the square root of -1 should not exist, because multiplying any real number by itself produces a nonnegative result (1 × 1 and -1 × -1 both equal 1). At some point, mathematicians decided to call this "impossible" number i and just move on from there. Though it seems to make no logical sense at first (the usual interpretation is that multiplying by i rotates a number a quarter-turn in the plane), it's now used extensively in science, engineering, and the like.
    • Imaginary numbers only seemed nonsensical in the first few centuries after the discovery of the cubic formula, which in some cases produces them "temporarily" on the way to a perfectly real answer. As usual in mathematics, imaginary and complex numbers were eventually defined as regular entities: there is a rigorous construction of C as R×R, the set of ordered pairs z = (x, y) standing for z = x + iy.
    • By the way, the rigorous construction of the real numbers themselves, via Dedekind cuts, is harder than the construction of the imaginary numbers from them. Isn't that a paradox of its own?
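The ordered-pair construction of C mentioned above is concrete enough to sketch in a few lines of Python. The class name Cx is made up for illustration; the multiplication rule is the standard one, (x₁, y₁)(x₂, y₂) = (x₁x₂ − y₁y₂, x₁y₂ + x₂y₁), and the pair (0, 1) plays the role of i.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cx:
    """A complex number as an ordered pair (re, im) of reals -- the
    rigorous construction of C as R x R, with no 'impossible' numbers."""
    re: float
    im: float

    def __add__(self, other):
        return Cx(self.re + other.re, self.im + other.im)

    def __mul__(self, other):
        # (x1, y1) * (x2, y2) = (x1*x2 - y1*y2, x1*y2 + y1*x2)
        return Cx(self.re * other.re - self.im * other.im,
                  self.re * other.im + self.im * other.re)

i = Cx(0.0, 1.0)
print(i * i)  # the pair (-1, 0), i.e. the real number -1
```

Nothing paradoxical ever enters the definition: i² = −1 falls straight out of the multiplication rule on pairs.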
  • Malicious compliance (see "Bothering by the Book") is a direct invocation of this. The boss expects things to be done a certain way, and an employee performs it exactly that way, causing harm to the company. The boss can't punish the worker for doing as they were specifically ordered to, but Bad Things™ happened because of the worker's actions. This usually results in an Obvious Rule Patch.
  • A nonsensical command in the Interactive Fiction game AI Dungeon 2 will result in the AI going into an infinite loop of prompts, forcing the player to revert to an earlier state or restart the game.
  • During a CNN debate for the 2020 presidential primaries, the moderator asked Bernie Sanders whether he had ever told Elizabeth Warren that a woman couldn't win the election, to which he answered no. The moderator then asked Warren what she thought when Sanders told her a woman couldn't win the election. What?
    • This is what is known as a "loaded question", a rhetorical tool in which the question itself contains a controversial assumption, usually a presumption of guilt that the interrogator is attempting to get the one being asked to confirm. Another example would be "Have you stopped beating your wife?"; no matter the answer, the question retains the assumption that the answerer is a perpetrator of Domestic Abuse. The Logic Bomb part comes in when the "assumed" part of the question has already been refuted ["Do you have a wife?" "No." "When did you stop beating your wife?"], which can leave the answerer confused or caught off guard. Of course, this can easily backfire, as the answerer may be quicker on the uptake ["I just told you, I don't have a wife."], which can cause the interrogator to be the one losing face, by seeming inattentive or getting caught trying to force a confession from the answerer.
  • For a biological example, there is the "ant mill" or "ant death spiral" phenomenon seen in army ants. Army ants follow a rather simple set of behaviors, and primarily navigate by following scent trails left by other members of the swarm. A death spiral occurs when a group gets cut off from the main swarm and starts following their own scent trails. The scent trails go in a circle, causing the army ants to literally run in circles until they drop dead from starvation or exhaustion.
  • This is averted by real-life "large language model" AIs, such as GPT, which function by finding the most likely completion of a given text. No matter what convoluted paradox you describe to them, they'll never crash, but simply write more text that logically continues what you wrote — for example, sending the message "This sentence is false" to the chatbot ChatGPT gives a response along the lines of "This statement is a classic example of a liar paradox; I can't determine if it's true or false. What else may I help you with?"


Video Example(s):

Alternative Title(s): Does Not Compute


President Eden

If the Lone Wanderer has a high Science skill during "The American Dream" quest, they can easily defeat President Eden by pointing out the circular reasoning behind his acting as the "president" of the United States. This causes the AI to suffer a logic error and reset to his default programming, after which the Lone Wanderer can order Eden to self-destruct along with all of Raven Rock.
